Machine Learning Algorithm Promises Efficient Autonomous Technology

Researchers at Ohio State University, led by graduate student Robert Kent, have developed a machine learning algorithm that uses a digital twin model to predict and control the behavior of electronic circuits. The algorithm, compact enough to fit on a computer chip, is expected to improve the efficiency of future autonomous technologies, such as self-driving cars and heart monitors. The team’s digital twin model was more accurate and less computationally complex than previous machine learning-based controllers. The study was published in Nature Communications and was supported by the U.S. Air Force’s Office of Scientific Research.

Machine Learning Algorithms and the Future of Autonomous Systems

A recent study suggests that next-generation computing algorithms could enable more efficient machine learning-powered systems. Researchers successfully used machine learning tools to create a digital twin, a virtual copy, of an electronic circuit that exhibits chaotic behavior. This digital twin could predict the circuit's behavior and use that information to control it.

Linear controllers, which use simple rules to steer a system toward a desired value, are common in everyday devices like thermostats and cruise control. However, these algorithms struggle to control systems that display complex behavior, such as chaos. Advanced devices like self-driving cars and aircraft therefore often rely on machine learning-based controllers, which use intricate networks to learn the control policy a system needs to operate well. These algorithms, however, can be challenging and computationally expensive to implement.
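The "simple rules" behind a linear controller can be shown in a few lines. The sketch below is a minimal proportional controller of the kind used in thermostats, not anything from the study; the names, gain, and toy temperature model are illustrative assumptions:

```python
# Minimal sketch of a linear (proportional) controller: the control
# signal is simply proportional to the error between the measured
# value and the desired setpoint.

def proportional_control(measurement, setpoint, gain=0.5):
    """Return a control signal that pushes the system toward the setpoint."""
    error = setpoint - measurement
    return gain * error

# Toy example: driving a room temperature toward a thermostat setting,
# assuming the system responds directly to the control input.
temperature = 15.0
setpoint = 20.0
for _ in range(50):
    u = proportional_control(temperature, setpoint)
    temperature += u

print(round(temperature, 2))
```

For a well-behaved system like this one, the error shrinks geometrically and the temperature settles at the setpoint; for a chaotic system, the same rule fails because tiny errors are amplified rather than damped, which is the limitation the article describes.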

The Impact of Efficient Digital Twins

According to Robert Kent, the study’s lead author and a physics graduate student at Ohio State University, having access to an efficient digital twin could significantly impact the development of future autonomous technologies. Traditional controllers for chaotic systems have been difficult to develop due to their extreme sensitivity to small changes. This is particularly critical in situations where milliseconds can make a difference between life and death, such as when self-driving vehicles must decide to brake to prevent an accident.

The team’s digital twin was built to optimize a controller’s efficiency and performance, resulting in a reduction of power consumption. It was trained using a type of machine-learning approach called reservoir computing. According to Kent, this machine learning architecture is very good at learning the behavior of systems that evolve in time.
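Reservoir computing keeps a large, fixed random recurrent network (the "reservoir") and trains only a linear readout layer, which is why training is so cheap. The sketch below illustrates the general technique on a chaotic logistic-map signal; all sizes, scalings, and parameter choices are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Chaotic training signal: the logistic map at r = 3.9
T = 1000
x = np.empty(T)
x[0] = 0.3
for t in range(T - 1):
    x[t + 1] = 3.9 * x[t] * (1 - x[t])

# Fixed random reservoir: these weights are never trained
N = 200
W_in = rng.uniform(-0.5, 0.5, size=N)            # input weights (fixed)
W = rng.normal(0.0, 1.0, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

R = np.zeros((T, N))                             # reservoir states over time
r = np.zeros(N)
for t in range(T):
    r = np.tanh(W @ r + W_in * x[t])
    R[t] = r

# Only this step is "training": a ridge-regression readout that maps
# the reservoir state at time t to the signal value at time t + 1.
washout = 100                                    # discard transient states
A, b = R[washout:T - 1], x[washout + 1:T]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ b)

rmse = float(np.sqrt(np.mean((A @ W_out - b) ** 2)))
print("one-step training RMSE:", round(rmse, 4))
```

Because the only fitted parameters are the readout weights, training reduces to one linear solve rather than iterative backpropagation, which is the source of the efficiency the article highlights.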

Novel Computing Ability and Dynamic Systems

The digital twin’s novel computing ability makes it well suited to dynamic systems such as self-driving vehicles and heart monitors, which must adapt instantly to a patient’s heartbeat. Large machine learning models consume substantial power to crunch data and tune their parameters, whereas the digital twin’s model and training are so simple that systems could learn on the fly.
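One way to see why simple training enables learning on the fly: when only a linear readout is fitted, it can be updated one sample at a time with recursive least squares instead of being refit from scratch on stored data. The sketch below shows that general idea on synthetic features; it is an illustrative assumption about how online updates could work, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(1)

# Online linear readout via recursive least squares (RLS): each new
# (feature, target) pair updates the weights in O(N^2) work, with no
# need to store past data or retrain from scratch.
N = 50
w = np.zeros(N)                  # readout weights being learned
P = np.eye(N) * 100.0            # inverse-covariance estimate
lam = 0.99                       # forgetting factor

true_w = rng.normal(size=N)      # hidden target mapping (demo only)

for _ in range(500):
    phi = rng.normal(size=N)             # feature vector (e.g., a reservoir state)
    y = true_w @ phi                     # observed target output
    k = P @ phi / (lam + phi @ P @ phi)  # gain vector
    w += k * (y - w @ phi)               # correct the prediction error
    P = (P - np.outer(k, phi @ P)) / lam

err = float(np.linalg.norm(w - true_w) / np.linalg.norm(true_w))
print("relative weight error:", round(err, 4))
```

Each update touches only the current sample, so a controller built this way could keep adapting while deployed, without the heavy retraining passes that large models require.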

To test this, the researchers directed their model to complete complex control tasks and compared its results to those of previous control techniques. The study revealed that their approach achieved higher accuracy on the tasks than its linear counterpart and was significantly less computationally complex than a previous machine learning-based controller.

Efficiency and Environmental Incentives

There is an important economic and environmental incentive for creating more power-friendly algorithms. As society becomes more dependent on computers and AI for nearly all aspects of daily life, demand for data centers is soaring. This has led to concerns over digital systems’ enormous power appetite and what future industries will need to do to keep up with it. Building these data centers and conducting large-scale computing experiments can generate a large carbon footprint, so scientists are looking for ways to curb carbon emissions from this technology.

Future Applications and Potential

Future work will likely be directed toward training the model for other applications, such as quantum information processing. In the meantime, the researchers expect the approach to find broad use across the scientific community. According to Kent, too few people in industry and engineering know about these types of algorithms, and one of the project's major goals is to raise awareness of them; this work is a great first step toward reaching that potential. The study was supported by the U.S. Air Force’s Office of Scientific Research.
