Mick Bonner, assistant professor of cognitive science at Johns Hopkins University, and his team have demonstrated that biologically inspired architectural design can let artificial intelligence systems simulate human brain activity even before training. Published in Nature Machine Intelligence, the research challenges conventional AI development, which prioritizes extensive deep learning and massive computational resources, often costing billions of dollars and consuming thousands of megawatt-hours of energy. By modifying three common network designs (transformers, fully connected networks, and convolutional networks), the scientists found that alterations to convolutional neural networks generated activity patterns rivaling those of trained, conventional AI systems, suggesting that architectural design is a crucial factor in accelerating learning.
Biologically Inspired AI Architecture
Research from Johns Hopkins University indicates that biologically inspired architecture can give AI systems an advantage even before training begins. Scientists found that modifying AI architectures – specifically transformers, fully connected networks, and convolutional networks – impacted how closely their responses mirrored human and primate brain activity when exposed to images. This challenges the conventional approach of relying on massive datasets and computing power, suggesting architectural design is a critical, often overlooked, component of AI development.
When researchers increased the number of artificial neurons in transformers and fully connected networks, their responses changed little. Tweaking the architectures of convolutional neural networks, however, yielded activity patterns that closely simulated those in the human brain. These untrained convolutional networks performed comparably to conventionally trained AI systems, which typically require exposure to millions or billions of images, demonstrating the importance of architectural design over sheer data volume.
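To make the setup concrete, here is a minimal sketch (not from the paper) of the key ingredient: a randomly initialized, untrained convolutional network whose responses to images can be recorded. It assumes PyTorch, and the layer sizes and noise stimuli are illustrative placeholders rather than the study's actual architectures or image sets.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# An untrained (randomly initialized) convolutional network.
untrained_cnn = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(64, 128, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
)

# Random noise stands in for the real image stimuli.
images = torch.rand(10, 3, 224, 224)

# Record the network's responses without any training.
with torch.no_grad():
    activations = untrained_cnn(images)

print(activations.shape)  # (10, 128): one response vector per image
```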
The findings suggest that starting with the “right blueprint” – one informed by biological principles – could dramatically accelerate learning in AI systems. The research team is now focused on developing learning algorithms modeled after biology, aiming to create a new deep learning framework. This work supports the idea that evolution may have optimized brain design for efficiency, and that AI can benefit from incorporating these principles.
Comparing AI Network Designs
Researchers at Johns Hopkins University compared three common AI network designs – transformers, fully connected networks, and convolutional networks – to understand how architecture affects performance. The team modified these blueprints to create numerous artificial neural networks, then tested their responses to images, comparing the activity to human and primate brain patterns. Increasing the number of artificial neurons in transformers and fully connected networks yielded minimal change, while architectural tweaks to convolutional networks did alter their activity.
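The sketch below illustrates, under stated assumptions, what modifying these blueprints to create many networks might look like: sweeping a single architectural knob (width) across simplified stand-ins for the three families. The specific layer stacks are hypothetical, not the designs used in the study.

```python
import torch.nn as nn

def make_fully_connected(width: int) -> nn.Module:
    # Two hidden layers; "width" controls the neuron count.
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(3 * 64 * 64, width), nn.ReLU(),
        nn.Linear(width, width), nn.ReLU(),
    )

def make_convolutional(width: int) -> nn.Module:
    # "width" controls the number of feature channels.
    return nn.Sequential(
        nn.Conv2d(3, width, 3, padding=1), nn.ReLU(),
        nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    )

def make_transformer(width: int) -> nn.Module:
    # A small encoder; d_model plays the role of "width".
    layer = nn.TransformerEncoderLayer(d_model=width, nhead=4, batch_first=True)
    return nn.TransformerEncoder(layer, num_layers=2)

# Dozens of untrained variants, one family at a time.
variants = {
    "fully_connected": [make_fully_connected(w) for w in (128, 256, 512, 1024)],
    "convolutional":   [make_convolutional(w) for w in (16, 32, 64, 128)],
    "transformer":     [make_transformer(w) for w in (64, 128, 256, 512)],
}
```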
Specifically, tweaking the convolutional network architectures allowed researchers to generate activity patterns rivaling those of conventionally trained AI systems. These conventional systems typically require exposure to millions or billions of images. This suggests that architectural design, rather than just extensive training data, plays a significant role in achieving brain-like AI functionality. The research challenges the current approach of massive data input and resource allocation for AI development.
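The article does not spell out how "rivaling" is scored; one standard way to compare model activity with brain activity is representational similarity analysis (RSA). The sketch below shows that comparison with random placeholder data; the paper's actual metric and recordings may differ.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_images = 50

# Placeholders: model activations and brain responses to the same images.
model_features = rng.standard_normal((n_images, 128))
brain_responses = rng.standard_normal((n_images, 300))

# Representational dissimilarity matrices (condensed upper triangles).
model_rdm = pdist(model_features, metric="correlation")
brain_rdm = pdist(brain_responses, metric="correlation")

# Rank correlation between RDMs: higher means more brain-like geometry.
rho, _ = spearmanr(model_rdm, brain_rdm)
print(f"model-brain RSA score: {rho:.3f}")
```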
The study’s results imply that a well-designed architectural blueprint can provide an advantageous starting point for AI learning. The researchers believe that incorporating insights from biology, alongside optimized architecture, could dramatically accelerate learning. The team is now developing learning algorithms modeled after biological systems to inform a new deep learning framework, building on this understanding of how network design shapes performance.
“Evolution may have converged on this design for a good reason. Our work suggests that architectural designs that are more brain-like put the AI systems in a very advantageous starting point.”
– Mick Bonner
Accelerating AI Learning Through Design
Researchers at Johns Hopkins University found that biologically inspired architectural design can give AI systems a strong starting point before training, potentially accelerating learning. The team challenged the conventional approach of relying on massive datasets and extensive computing power (costing billions of dollars and consuming thousands of megawatt-hours of energy) by focusing on network blueprints. Their work, published in Nature Machine Intelligence, suggests that the way an AI system is designed matters significantly, mirroring how humans learn from limited data.
The scientists modified three common AI network designs – transformers, fully connected networks, and convolutional networks – building dozens of unique artificial neural networks. While increasing the number of artificial neurons in transformers and fully connected networks yielded little change, tweaking convolutional networks allowed researchers to generate activity patterns rivaling those of conventionally trained AI systems. These networks, even without exposure to millions of images, showed responses similar to human and primate brain activity.
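To reproduce the flavor of this headline comparison, the hedged sketch below pits an untrained convolutional network against a conventionally trained one, using torchvision's ResNet-18 as a stand-in rather than the study's own modified architectures. Each feature set would then be scored against brain data, for example with the RSA sketch above.

```python
import torch
from torchvision.models import resnet18, ResNet18_Weights

untrained = resnet18(weights=None).eval()                          # random init
trained = resnet18(weights=ResNet18_Weights.IMAGENET1K_V1).eval()  # ImageNet-trained

images = torch.rand(8, 3, 224, 224)  # placeholder stimuli

with torch.no_grad():
    feats_untrained = untrained(images)
    feats_trained = trained(images)

# Both produce one feature vector per image; the study's question is
# which set better matches brain responses to the same images.
print(feats_untrained.shape, feats_trained.shape)
```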
This finding implies that starting with the right architectural blueprint could dramatically accelerate AI learning. According to lead author Mick Bonner, if massive data training were the only crucial factor, architectural modifications alone would not produce brain-like AI. The team is now developing learning algorithms inspired by biology to potentially inform a new deep learning framework, building on these initial findings about the importance of design.
