Engineers at the University of Rochester are pioneering a new approach to artificial intelligence that could make autonomous systems like drones and self-driving cars significantly more energy efficient. Announced this past Monday, the project seeks to move away from the power-hungry digital computers that currently run AI and instead develop novel analog hardware inspired directly by the human brain’s own visual system. This new method could solve a major bottleneck in autonomous technology, where high performance often comes at the cost of high power consumption.
The Rochester team is breaking from conventional AI by abandoning the standard neural networks that rely on a process called “backpropagation.” While effective, researchers note this mechanism is not how our brains actually learn to perceive the world. Instead, they are turning to a neuroscience-based model called predictive coding. This theory suggests the brain operates on a hierarchical system of prediction and correction; it constantly builds a mental model of its environment and refines that model based on feedback from the senses. As Rochester professor Michael Huang explains, it’s akin to “paraphrasing what you heard…and using their feedback to refine your understanding.”
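The prediction-and-correction loop described above can be illustrated with a toy example. The sketch below is purely hypothetical and is not the Rochester team's model: a single-layer predictive coding unit holds an internal state `z`, generates a top-down prediction `W @ z` of a sensory input `x`, and uses the prediction error as feedback to refine both its state and its weights through local updates, with no backpropagation through a deep network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer predictive coding loop (hypothetical illustration).
# A latent state z generates a prediction W @ z of the sensory input x.
W = rng.normal(size=(4, 2)) * 0.1   # generative weights (learned locally)
x = rng.normal(size=4)              # fixed "sensory" input
z = np.zeros(2)                     # internal state: the system's "model"

lr_z, lr_w = 0.1, 0.05
for _ in range(200):
    pred = W @ z                    # top-down prediction
    err = x - pred                  # prediction error: the sensory feedback
    z += lr_z * (W.T @ err)         # refine the internal model of the input
    W += lr_w * np.outer(err, z)    # local, Hebbian-like weight update

final_err = np.linalg.norm(x - W @ z)
print(final_err)  # reconstruction error shrinks as the model is refined
```

Note that every update here uses only locally available signals (the error and the state at one layer), which is the property that makes schemes like this attractive for analog circuit implementations.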
This ambitious effort, which builds on the University of Rochester’s rich history in computer vision research, is a major collaborative initiative. The Rochester-led team includes researchers from Rice University and UCLA and is backed by a substantial grant from the Defense Advanced Research Projects Agency (DARPA), which will provide up to $7.2 million over the next four and a half years. The goal is to build these biologically inspired predictive coding networks directly onto analog circuits.
While the ultimate goal is to guide complex autonomous systems, the project will begin with a more focused task: developing a prototype that can classify static images. A key aspect of the project is its practicality; the analog system will not use experimental devices but will instead be manufactured with existing, reliable technologies like CMOS. If the team can demonstrate that their analog system approaches the performance of current digital methods, the technology could be scaled up for the demanding, real-time perception tasks required by drones in flight.
Should this brain-inspired approach prove successful, it could represent a fundamental shift in the development of artificial intelligence. By mimicking the brain’s efficient processes rather than just its computational output, this research could pave the way for a new generation of autonomous machines that can operate longer and more effectively. It is a critical step toward creating AI that is not only powerful but also sustainable.
