In Christopher Nolan’s 2010 masterpiece Inception, the protagonist is a corporate spy who steals information by infiltrating people’s dreams, exploring the different layers of his victims’ subconscious to find a path to the desired information. Deep learning, a specialised form of machine learning, works in a similar way, recreating the complex architectures of our brain’s neural networks to find information hidden in big data.
Here Neil Ballinger, Head of EMEA at EU Automation, explains how it can be used to solve complex manufacturing problems.
Deep learning is a specialised form of machine learning whose algorithms mimic the structure and function of the human brain. Deep learning algorithms use artificial neural networks organised in several hierarchical layers - up to 150 in some networks - to carry out the process of machine learning, that is, the extraction of relevant information from data.
Traditional machine learning algorithms work in a linear way - they are trained on a given quantity of input and output data, known as a training data set, and learn to extract information from it. Working by analogy, they can then extract similar information from new data.
On the other hand, artificial neural networks are built in a similar way to the human brain, with neural nodes connected to form a net. Because deep learning algorithms have a net-like, hierarchical structure, rather than a linear one, they can extract information from huge quantities of unstructured, unlabelled raw data.
Layer by layer, these algorithms can break down information, filter data and reveal hidden clusters, working their way to the desired information. This enables them to solve problems that traditional machine learning cannot tackle.
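The layer-by-layer idea can be sketched in a few lines of code. The example below is a minimal, hypothetical forward pass through a small stack of layers - the layer sizes, random weights and ReLU activation are illustrative choices, not a production network - showing how raw input is transformed step by step into a compact representation.

```python
import numpy as np

np.random.seed(0)

def relu(x):
    """Standard rectified-linear activation."""
    return np.maximum(0, x)

# Hypothetical network: each layer transforms the previous layer's
# output, extracting progressively higher-level features.
layer_sizes = [8, 16, 8, 4, 2]  # input -> hidden layers -> output
weights = [np.random.randn(a, b) * 0.1
           for a, b in zip(layer_sizes, layer_sizes[1:])]
biases = [np.zeros(b) for b in layer_sizes[1:]]

def forward(x):
    """Pass raw input through each layer in turn."""
    activation = x
    for w, b in zip(weights, biases):
        activation = relu(activation @ w + b)
    return activation

raw_input = np.random.randn(8)  # stand-in for unstructured raw data
features = forward(raw_input)
print(features.shape)  # (2,) - a compact summary of the 8-value input
```

Real networks differ mainly in scale - many more layers and learned rather than random weights - but the principle of successive transformation is the same.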
Deep learning algorithms need extensive datasets to be trained. Algorithms for image recognition, for example, may need to see millions of images before they learn to recognise a distinctive feature, such as the characteristics of a defective product.
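To make the training idea concrete, the sketch below fits a single-layer classifier by gradient descent on synthetic data standing in for labelled product images - the data, labels and learning rate are all invented for illustration. Deep networks scale this same loop up across many layers and far larger datasets.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for a labelled image dataset: each "image" is a
# flattened feature vector; label 1 marks a defective product.
n_samples, n_features = 1000, 20
X = rng.normal(size=(n_samples, n_features))
true_w = rng.normal(size=n_features)
y = (X @ true_w > 0).astype(float)  # hypothetical defect labels

# Train by gradient descent on the logistic (log) loss.
w = np.zeros(n_features)
lr = 0.1
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w)))        # predicted defect probability
    w -= lr * X.T @ (p - y) / n_samples   # average gradient step

preds = 1 / (1 + np.exp(-(X @ w))) > 0.5
accuracy = (preds == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Even this toy model needs a thousand examples to learn a simple boundary; recognising subtle visual defects across lighting and orientation variations is why real systems need millions of images.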
For this reason, deep learning requires very high computational power, relying on fast processors, dedicated graphics cards and ample memory. For manufacturers, this means selecting GPUs with at least 8 GB of memory - and even this may not be enough, as the industry will need to develop memory optimised for the parallel processing requirements of deep learning algorithms.
When analysing unstructured data, deep learning algorithms massively outperform humans in speed and accuracy, which means that they can help manufacturers untangle the complex threads of information that may hide in their data. This information can then be used for different purposes, from predictive maintenance to the elimination of bottlenecks in the production line.
Another common field of application is quality control. Because deep learning algorithms can be so efficient at processing millions of images, they are increasingly used in machine vision applications to flag up defective products. For example, the German-Israeli startup Inspekto used deep learning algorithms to develop the INSPEKTO S70, an Autonomous Machine Vision system that can learn the gold standard of any product in under 30 minutes and flag up any variation that constitutes a defect.
Deep learning can also be used in supply chain management. Because these algorithms can recognise patterns that hide in big data, they can be applied to a company’s supply chain statistics to revise asset management strategies, reveal unfavourable routes, and even forecast an increase or decrease in sales based on things like the weather.
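The forecasting idea can be illustrated with a far simpler model - an ordinary least-squares fit on synthetic data, which deep networks generalise to nonlinear, many-variable patterns. The weekly temperatures and sales figures below are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical supply chain history: weekly average temperature (C)
# and units sold, over two years (synthetic data for illustration).
temperature = rng.uniform(0, 30, size=104)
sales = 200 + 5 * temperature + rng.normal(0, 10, size=104)

# Fit a least-squares line: sales as a function of temperature.
A = np.column_stack([np.ones_like(temperature), temperature])
intercept, slope = np.linalg.lstsq(A, sales, rcond=None)[0]

# Forecast demand for a week expected to average 25 C.
forecast = intercept + slope * 25
print(f"sales change per degree: {slope:.1f}, forecast at 25C: {forecast:.0f} units")
```

A deep model would replace the straight line with a learned nonlinear function of many inputs - weather, promotions, seasonality - but the principle of learning a pattern from history and projecting it forward is the same.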
In Inception, the protagonist gets so tangled up in the complexity of his victims' subconscious that he needs to spin a top to tell whether he is awake or dreaming - in dreams, the top keeps spinning forever. Manufacturers don't need this test. The potential of deep learning might seem too good to be true, but it is certainly here to stay.