Artificial Intelligence For Industrial Applications
Perspectives for the integration of hardware and algorithms.
By Dirk Mayer and Olaf Enge-Rosenblatt
Due to digitalization, modern machines and systems provide massive quantities of data, which form a significant basis for the optimization of production processes, operations and safety.
These data sets, however, are growing ever more complex, which often renders the simple analysis methods typically used in the past ineffective. This is one factor driving the rising importance of artificial intelligence (AI) in decision-making processes.
The processing and storage capacities required for training and operating AI systems are currently provided mostly by large commercial cloud platforms. These platforms, like algorithm development itself, tend to be geared toward data analysis in the business-to-customer segment.
However, some obstacles and challenges do stand in the way of sustained and effective implementation of AI solutions:
- Industrial data sets are relatively inhomogeneous and small: the classification requirements are more complex, while the quantity and quality of the training data are comparatively low. This complicates the use of established AI frameworks.
- The ongoing optimization of processes requires iteratively incorporating new data into AI training at short intervals. Furthermore, the analysis must be completed and the results delivered with low latency.
- Many industrial companies do not wish to transfer sensitive data, such as data from production processes, to a commercial cloud; in some cases this is even prohibited for compliance reasons. Continuous data exchange between industrial systems over the Internet also creates a potential security gap, which in some situations could have serious consequences (sabotage, industrial espionage).
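The iterative retraining described above can be sketched with an incremental learner. The following is a minimal, illustrative example using scikit-learn's `SGDClassifier`, whose `partial_fit()` updates an existing model with small batches instead of retraining from scratch; the feature data and the labeling rule are synthetic stand-ins, not real process data.

```python
# Sketch: incrementally updating a classifier as new process data arrives.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier(random_state=0)
classes = np.array([0, 1])  # e.g. "normal" vs. "anomalous" process state


def new_batch(n=50):
    """Stand-in for a freshly acquired batch of sensor features."""
    X = rng.normal(size=(n, 4))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)  # synthetic label rule
    return X, y


# The first batch initializes the model; later batches refine it at short
# intervals without discarding what was already learned.
X, y = new_batch()
model.partial_fit(X, y, classes=classes)
for _ in range(20):
    X, y = new_batch()
    model.partial_fit(X, y)

# Low-latency inference on a single new sample:
x_new = rng.normal(size=(1, 4))
print(model.predict(x_new))
```

This pattern keeps training cost per interval small, which matters when new data must flow into the model at short intervals.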
For the industrial use of artificial intelligence, it is therefore necessary to consider concepts that combine distributed, high-performance hardware with adapted algorithms. Local, on-premises analysis of data enables independent operation and provides inherent data security. Processing as close as possible to the data sources, which in industry are typically process-integrated sensors, ensures low latency and improves the quality of results in an early analysis phase.
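Processing close to the sensor usually means condensing each acquisition window into a few features on the device, so that only those features, rather than the raw stream, travel through the network. A minimal sketch (the vibration signal and feature choice are illustrative assumptions):

```python
# Sketch: on-device preprocessing near the sensor.
# Instead of streaming raw vibration samples, the edge node condenses each
# window into a few features, cutting data volume and transmission latency.
import numpy as np


def window_features(samples: np.ndarray) -> dict:
    """Condense one acquisition window into compact condition features."""
    rms = float(np.sqrt(np.mean(samples ** 2)))
    peak = float(np.max(np.abs(samples)))
    return {"rms": rms, "peak": peak, "crest_factor": peak / rms}


# Simulated 1 kHz vibration window: a 50 Hz component plus sensor noise.
t = np.arange(0.0, 1.0, 1e-3)
raw = np.sin(2 * np.pi * 50 * t) \
    + 0.1 * np.random.default_rng(1).normal(size=t.size)

feats = window_features(raw)
# Only these few numbers leave the device, not the 1000 raw samples.
print(feats)
```

Reducing a window to a handful of numbers also limits what sensitive raw data ever leaves the machine, which aligns with the data-security argument above.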
The extensive system development work is only economically viable if the established procedures of system engineering are transferred to cognitive IoT devices in an industry-specific environment.
For efficient, low-risk and time-saving implementation, it is necessary to take a comprehensive approach from the start and to combine the knowledge of industry experts with the expertise of data analysts and hardware developers.
It is important here for the data analysis to be integrated with an understanding of industrial processes in order to deal with the limited amount of data. To broaden the data base, AI methods can also be trained with data from a digital twin; this likewise requires in-depth understanding of the process. Based on these initial considerations, it is possible to derive the key conditions for designing hardware and IT infrastructure, such as the selection of additional measurement variables or the adaptation of sampling rates of data streams.
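The digital-twin idea can be illustrated with a deliberately simplified example: a first-order thermal model of a process generates labeled synthetic temperature traces for both healthy and faulty behavior, broadening a small measured data set. The model, its parameters, and the fault mechanism here are assumptions for illustration, not a real plant model.

```python
# Sketch: generating synthetic training data from a simplified digital twin.
import numpy as np


def twin_temperature(power_w: float, fault: bool, rng) -> np.ndarray:
    """First-order thermal model: dT/dt = P/C - (T - T_amb)/tau.
    A degraded-cooling fault is modeled as a larger time constant (assumed)."""
    tau = 120.0 if fault else 60.0   # s, assumed time constants
    C, t_amb, dt = 500.0, 20.0, 1.0  # J/K heat capacity, ambient temp, step
    T = t_amb
    trace = []
    for _ in range(300):  # simulate 300 s at 1 Hz
        T += dt * (power_w / C - (T - t_amb) / tau)
        trace.append(T + rng.normal(scale=0.05))  # additive sensor noise
    return np.array(trace)


rng = np.random.default_rng(2)
# Labeled synthetic examples across operating points and health states:
data = [(twin_temperature(p, f, rng), f)
        for p in (80.0, 100.0, 120.0) for f in (False, True)]
X = [trace for trace, _ in data]
labels = [f for _, f in data]
print(len(X), labels)
```

Because the faulty variant settles at a visibly higher temperature, a classifier trained on such traces learns a physically meaningful separation; validating the twin against measured data remains essential.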
Alongside the essential data security requirements, power consumption is an important consideration, especially for mobile applications, when deciding where within the local IT infrastructure the analysis results are produced and distributed: from the on-premises cloud down to the on-chip integration of algorithms close to the sensor hardware.
Such an approach is agile in the sense that development proceeds iteratively and controls for risk, while validation always takes place against the requirements of the overall system, consisting of industrial process, algorithms and hardware. This permits rapid development of high-performance, custom AI solutions for industrial applications.
—Olaf Enge-Rosenblatt is group manager for computation analytics at Fraunhofer IIS’ Engineering of Adaptive Systems Division.
Dirk Mayer is head of the department for distributed data processing and control at Fraunhofer IIS’ Engineering of Adaptive Systems Division.