IIr Associates, Inc.
Publisher of The Virginia Engineer

NEWS
New Approach For Artificial Intelligence-based Models Developed
November 5, 2020

They call it artificial intelligence — not because the intelligence is somehow fake. It’s real intelligence, but it’s still made by humans. That means AI — a power tool that can add speed, efficiency, insight and accuracy to a researcher’s work — has many limitations.

Now, researchers at the University of Delaware (UD) and the University of Massachusetts-Amherst have published details of a new approach to artificial intelligence that builds uncertainty, error, physical laws, expert knowledge and missing data into its calculations and leads ultimately to much more trustworthy models. The new method provides guarantees typically lacking from AI models, showing how valuable — or not — the model can be for achieving the desired result.

Joshua Lansford, a doctoral student in UD’s Department of Chemical and Biomolecular Engineering, and Prof. Dion Vlachos, director of UD’s Catalysis Center for Energy Innovation, are co-authors on the paper published recently in the journal Science Advances.

According to the researchers, the new mathematical framework could produce greater efficiency, precision and innovation for computer models used in many fields of research. Such models provide powerful ways to analyze data, study materials and complex interactions, and tweak variables virtually instead of in the lab.

The paper describes how the new mathematical framework works in a chemical reaction known as the oxygen reduction reaction, but it is applicable to many kinds of modeling, Lansford said.

“The chemistries and materials we need to make things faster or even make them possible — like fuel cells — are highly complex,” he said. “We need precision…. And if you want to make a more active catalyst, you need to have bounds on your prediction error. By intelligently deciding where to put your efforts, you can tighten the area to explore.

“Uncertainty is accounted for in the design of our model,” Lansford said. “Now it is no longer a deterministic model. It is a probabilistic one.”
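The shift Lansford describes, from a single deterministic answer to a prediction that carries its own error bound, can be illustrated with a generic sketch. The example below is not the authors' framework (which is detailed in the Science Advances paper); it simply uses a bootstrap ensemble of linear fits, a common way to attach an uncertainty to a prediction from a small dataset:

```python
import numpy as np

def probabilistic_predict(x, y, x_new, n_boot=500, seed=0):
    """Bootstrap ensemble of linear fits: return (mean, std) at x_new.

    A deterministic model would return a single number; refitting on
    resampled copies of the small dataset yields a spread of predictions
    whose standard deviation serves as an error bound.
    """
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(x), len(x))       # resample with replacement
        slope, intercept = np.polyfit(x[idx], y[idx], 1)
        preds.append(slope * x_new + intercept)
    preds = np.asarray(preds)
    return preds.mean(), preds.std()

# Tiny synthetic dataset (10 points, mimicking the small-data regime
# the article describes)
x = np.linspace(0.0, 1.0, 10)
y = 2.0 * x + 1.0 + np.random.default_rng(1).normal(0.0, 0.05, 10)

mean, err = probabilistic_predict(x, y, x_new=0.5)
print(f"prediction = {mean:.2f} +/- {err:.2f}")
```

The point of the sketch is only the interface: the model reports not just a value but how far that value can be trusted.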

With these new mathematical developments in place, the model itself identifies what data are needed to reduce model error, he said. Then a higher level of theory can be used to produce more accurate data or more data can be generated, leading to even smaller error boundaries on the predictions and shrinking the area to explore.
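That feedback loop, in which the model itself flags where its error bounds are widest so that expensive calculations can be spent there, resembles an active-learning strategy. The sketch below assumes that framing (the paper's actual selection criterion may differ): given several models trained on the same data, the next input to compute is the one where their predictions disagree most.

```python
import numpy as np

def next_point_to_compute(candidates, ensemble_predict):
    """Pick the candidate input whose ensemble predictions disagree most.

    ensemble_predict(c) returns a list of predictions from several models
    at input c; their spread stands in for the model's uncertainty there.
    """
    spreads = [np.std(ensemble_predict(c)) for c in candidates]
    return candidates[int(np.argmax(spreads))]

# Toy ensemble: three models that agree near x = 0 and diverge as x grows,
# so the highest-uncertainty candidate should be the largest one.
models = [lambda x, a=a: a * x for a in (0.9, 1.0, 1.1)]
ensemble = lambda x: [m(x) for m in models]

candidates = [0.1, 0.5, 2.0]
print(next_point_to_compute(candidates, ensemble))  # prints 2.0
```

Generating the new data at that point and refitting tightens the error bounds, shrinking the area left to explore, which is the loop the article describes.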

“Those calculations are time-consuming to generate, so we’re often dealing with small datasets — 10-15 data points. That’s where the need comes in to apportion error.”

That’s still not a money-back guarantee that using a specific substance or approach will deliver precisely the product desired. But it is much closer to a guarantee than you could get before.

This new method of model design could greatly enhance work in renewable energy, battery technology, climate change mitigation, drug discovery, astronomy, economics, physics, chemistry and biology, to name just a few examples.

