Hebbian Learning Model Unifies Memory & Perception Research

A collaborative study led by Sebastian Goldt and Mathew E. Diamond at the Scuola Internazionale Superiore di Studi Avanzati (SISSA), with first author Francesca Schönsberg, has established a unified computational model of perceptual memory. Published November 17, 2025, in Neuron, the research demonstrates that both contraction toward and repulsion away from past experiences in stimulus perception emerge from a recurrent neural network governed by Hebbian plasticity. The network reproduced experimental data across three paradigms – working memory, reference memory, and a novel “one-back” task – without task-specific adjustments, suggesting that a single memory mechanism flexibly supports varying cognitive demands.

Unifying Perceptual Memory and Cognitive Demands

Recent research from SISSA has unified models of perceptual memory by demonstrating that a single Hebbian learning mechanism underlies diverse cognitive tasks. The researchers combined computational modeling with behavioral data from both humans and rodents, showing that the principle “cells that fire together, wire together” can explain how we store and recall experiences. Crucially, the network reproduced results across working memory, reference memory, and a novel “one-back” task – all without requiring task-specific adjustments.

The team identified opposing tendencies in perception: a “contraction” toward past experiences and a “repulsion” away from them. Both effects, surprisingly, emerge naturally from the same recurrent neural network governed by Hebbian plasticity. This suggests our brains aren’t building separate memory systems for each task, but rather reading out information from a unified network in different ways. The network’s adaptability is key to flexible cognition.

This research moves beyond fragmented models by demonstrating a core principle of perceptual memory. By replicating experimental data across multiple paradigms, the SISSA team provides strong evidence for a unified framework. This isn’t just about how memories are stored, but why recall is often biased – reflecting past experiences while simultaneously adapting to new information. Understanding this mechanism has implications for how we study the brain’s flexible use of a single memory system across different cognitive demands.

Hebbian Plasticity and Neural Network Modeling

Researchers at SISSA have unified models of memory and perception by demonstrating how Hebbian plasticity – the principle of “cells that fire together, wire together” – can explain diverse perceptual memory functions. Their recurrent neural network reproduced experimental data across working memory, reference memory, and a novel “one-back” task without needing task-specific adjustments. This suggests a single underlying mechanism governs how memories are formed and retrieved, challenging previously fragmented approaches to understanding brain function.
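The paper’s own network is not reproduced here, but the Hebbian principle it builds on can be sketched with a classic Hopfield-style recurrent network – an illustrative assumption, not the authors’ implementation. Weights are formed by Hebbian outer products of stored patterns, and the recurrent dynamics pull a noisy cue toward the nearest stored experience, a miniature version of the “contraction” effect:

```python
import numpy as np

# Illustrative sketch (not the SISSA model): a Hopfield-style recurrent
# network trained with the Hebbian rule dW_ij ∝ x_i * x_j. Recurrent
# dynamics pull a corrupted cue back toward a stored pattern.

rng = np.random.default_rng(0)
N = 200                                      # number of units
patterns = rng.choice([-1, 1], size=(3, N))  # three stored "experiences"

# Hebbian learning: cells that fire together wire together.
W = sum(np.outer(p, p) for p in patterns) / N
np.fill_diagonal(W, 0)                       # no self-connections

# Start from a corrupted version of pattern 0 (15% of units flipped).
cue = patterns[0].copy()
flip = rng.choice(N, size=30, replace=False)
cue[flip] *= -1

state = cue.astype(float)
for _ in range(10):                          # recurrent updates
    state = np.sign(W @ state)
    state[state == 0] = 1.0

# Overlap of 1.0 means the network recalled the stored pattern exactly.
overlap = float(state @ patterns[0]) / N
print(overlap)
```

With only three stored patterns in 200 units the network is far below its storage capacity, so the noisy cue is reliably contracted back onto the stored memory.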

The core of the SISSA team’s model lies in capturing perceptual biases – systematic distortions in how the brain perceives reality. Specifically, the network exhibited both “contraction” (pulling current stimuli towards past experiences) and “repulsion” (distancing from them), mirroring observed human and rodent behavior. Crucially, these opposing tendencies arose naturally from the Hebbian learning rule, suggesting these aren’t separate processes but facets of a unified system.
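The contraction bias itself can be illustrated with a toy readout – a hypothetical blend of current input and stimulus history, not the mechanism in the paper – in which the percept is a weighted mix of the current stimulus and the mean of past stimuli:

```python
# Toy illustration (an assumption, not the paper's readout): a percept that
# blends the current stimulus with a running mean of past stimuli is
# systematically pulled toward past experience ("contraction").

def percept(stimulus: float, history_mean: float, w_past: float = 0.25) -> float:
    """Weighted mix of the current stimulus and the mean of past stimuli."""
    return (1 - w_past) * stimulus + w_past * history_mean

past_mean = 50.0                    # average of previously seen intensities
print(percept(80.0, past_mean))     # → 72.5: pulled down toward the past
print(percept(20.0, past_mean))     # → 27.5: pulled up toward the past
```

A stimulus above the history mean is perceived as smaller and one below it as larger – the signature contraction toward past experience that the SISSA network reproduces from Hebbian dynamics alone.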

This research moves beyond simply describing memory functions to proposing a mechanistic explanation. By modeling data from multiple paradigms with a single network governed by Hebbian plasticity, the team demonstrates the power of biologically plausible learning rules. This offers a foundational step towards more realistic and robust artificial neural networks – and a deeper understanding of biological memory.
