June 12, 2018
The design and manufacturing process for new parts or products remains both time-consuming and costly. It has come a long way from the days of building physical models and testing them, but even today's high-fidelity computer models take a long time to run because of their complexity. Now, according to information provided by the U.S. Department of Energy's (DOE) Argonne National Laboratory (Argonne), scientists and engineers are using cutting-edge machine learning techniques to overcome these limitations, helping organizations cut design time from months to days and significantly reduce development costs.
With architectural features and tools that support data-centric workloads, Theta, the Argonne Leadership Computing Facility's (ALCF) new Intel-Cray supercomputer, is particularly well suited for research involving data science and machine learning methods. Image courtesy of U.S. Department of Energy.
A type of artificial intelligence that trains computers to discover hidden patterns in data to make novel predictions without being explicitly programmed, machine learning can be applied to manufacturing to quickly identify the best design for a product or the most efficient production process.
The traditional approach to optimizing the design of a new product involves a great deal of experimental testing and the subsequent evaluation of many prototypes. As the volume and complexity of data derived from these tests increase, industry relies more and more on high-fidelity computer models that virtually represent real-world devices and processes.
The models incorporate certain values corresponding to controlled aspects of the manufacturing process and use data drawn from physical experiments to determine how well the set of inputs would achieve the desired outputs, such as efficiency and cost-effectiveness.
High-fidelity modeling is itself an improvement over costly investments in physical development and testing, but Argonne's solution goes further: augmenting high-fidelity models with machine learning dramatically accelerates the process while maintaining the reliability of the results. A job that might take hours using high-fidelity modeling alone takes milliseconds when augmented by machine learning.
“Machine learning converts the very complex physical processes represented by the virtual model into a compact computational process that can be run in much less time. It’s similar to how biologists study fruit flies instead of humans. The flies share significant characteristics with humans, but they can generate and evolve much faster,” explained Janardhan Kodavasal, a mechanical engineer in Argonne’s Energy Systems division, who heads the initiative.
To start the process, the scientists run several thousand simulations of a high-fidelity model simultaneously on Argonne's supercomputer Mira, each with different inputs. This step generates virtual data that train a machine learning model to identify the best combination of inputs. In the next step, scientists use an evolutionary approach to find an optimal design, prioritizing the product's desired outputs.
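The surrogate-training step described above can be sketched in miniature. Everything below is a hypothetical stand-in: `high_fidelity_model` plays the role of an expensive simulation (the real one takes hours per run), the one-dimensional input grid plays the role of the thousands of simulation inputs, and simple piecewise-linear interpolation plays the role of the trained machine learning model.

```python
import bisect

def high_fidelity_model(x):
    """Hypothetical stand-in for an expensive simulation.
    The real model would take hours per run; here it is an
    assumed merit curve peaking near x = 3.2."""
    return -(x - 3.2) ** 2 + 10.0

# Step 1: run many "simulations" up front, each with a different input.
inputs = [i * 0.1 for i in range(101)]             # design variable in [0, 10]
outputs = [high_fidelity_model(x) for x in inputs]

# Step 2: a minimal surrogate fit to that data -- here just
# piecewise-linear interpolation, standing in for a learned model.
def surrogate(x):
    i = min(max(bisect.bisect_left(inputs, x), 1), len(inputs) - 1)
    x0, x1 = inputs[i - 1], inputs[i]
    y0, y1 = outputs[i - 1], outputs[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
```

Once built, the surrogate answers in microseconds what the full model answers in hours, which is what makes the evolutionary search in the next step affordable.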
With the ideal outputs specified, the model evaluates a set of candidate designs and selects the best ones from that generation. Those designs exchange some of their input features, and the model runs again. The process repeats until the merit of the design can no longer be improved. Once the machine learning model identifies the optimal inputs, the scientists run the original high-fidelity model with those inputs to verify that the set is indeed ideal.
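The generational loop above is essentially a genetic algorithm, which can be sketched as follows. The fitness function, population size, mutation scale, and crossover rule here are illustrative assumptions, not Argonne's actual settings; in practice the fitness call would be the trained surrogate model.

```python
import random

random.seed(0)  # reproducible sketch

def fitness(x):
    # Cheap stand-in for the trained surrogate: a hypothetical
    # merit score (higher is better) with its optimum near x = 3.2.
    return -(x - 3.2) ** 2 + 10.0

def evolve(pop_size=20, generations=40, lo=0.0, hi=10.0):
    # Start from a random generation of candidate designs.
    population = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        # Choose the best designs of this generation...
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        # ...then let pairs of them exchange input features (here,
        # by averaging) with a small random mutation, and repeat.
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = (a + b) / 2 + random.gauss(0, 0.1)
            children.append(min(max(child, lo), hi))
        population = parents + children
    return max(population, key=fitness)

best = evolve()
```

In the workflow described in the article, the final `best` design would then be fed back into the original high-fidelity model for verification.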
“We work with collaborators at different levels of research experience and capabilities,” said Dr. Kodavasal. “We can help companies at any step of the process, whether that is developing the initial high-fidelity model of the system, or just implementing the machine learning aspect.”