Extropic’s Novel Approach to AI Hardware
Extropic, a new company in the AI industry, is developing a full-stack hardware platform that uses the natural fluctuations of matter as a computational resource for Generative AI. This approach could extend hardware scaling beyond the limitations of digital computing, enabling AI accelerators that are significantly faster and more energy-efficient than digital processors. The company's founders, Gill and Trev, believe it could also unlock powerful probabilistic AI algorithms that are not feasible on digital processors.
The Challenge of Moore’s Law and the Promise of Biology
The demand for computing power in the AI era is increasing at an unprecedented rate. For the past several decades, the miniaturization of CMOS transistors under Moore's law drove exponential growth in computing efficiency, absorbing much of that demand. However, Moore's law is slowing down due to fundamental physics: transistors are approaching the atomic scale, where effects like thermal noise interfere with rigid digital operation.
In contrast, biological systems, which are neither rigid nor digital, host computing circuitry far more efficient than anything humanity has built to date. Computation in cells is driven by chemical reaction networks, processes that are intrinsically random and discrete. This suggests there is no fundamental reason the constraints of digital logic must bound the efficiency of computing devices. The challenge lies in designing a complete AI hardware and software system that thrives in an intrinsically noisy environment.

Energy-Based Models and the Power of Noise
Energy-Based Models (EBMs) offer a potential solution to this challenge. EBMs appear both in thermodynamic physics and in fundamental probabilistic machine learning. In physics, they are known as parameterized thermal states, arising as the steady states of systems with tunable parameters. In machine learning, they are known as exponential families.
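Concretely, an EBM defines a probability distribution through an energy function. This is the standard textbook form, not anything specific to Extropic's hardware:

```latex
% Energy-based model: lower energy E_\theta(x) means higher probability.
p_\theta(x) = \frac{e^{-E_\theta(x)}}{Z(\theta)},
\qquad
Z(\theta) = \sum_{x} e^{-E_\theta(x)}
% Z(\theta) is the partition function; computing it, or sampling from
% p_\theta, is the expensive step that sampling hardware targets.
```

Physically, this is a Boltzmann distribution: tuning the parameters reshapes the thermal state the hardware relaxes into.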
Exponential families are, in a precise sense, an optimal way to parameterize probability distributions, requiring the minimum amount of data to uniquely determine their parameters. This makes them excellent in the low-data regime, which includes scenarios where one must model tail events in mission-critical applications. They achieve this by filling in the blanks in the data with noise: they maximize their entropy while matching the statistics of the target distribution. This process consumes a great deal of randomness, at both training and inference time.
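The maximum-entropy property can be stated as a constrained optimization. This is a classical result, paraphrased here rather than drawn from the source:

```latex
% Among all distributions that match the data's statistics T(x),
% pick the one with maximum entropy:
\max_{p} \; H(p)
\quad \text{subject to} \quad
\mathbb{E}_{p}[T(x)] = \mathbb{E}_{\text{data}}[T(x)]
% The solution is exactly an exponential family:
p_\theta(x) \propto \exp\!\left( \theta \cdot T(x) \right)
```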
Extropic’s Stochastic Analog Circuits
Extropic is harnessing this power of noise and randomness by implementing EBMs directly as parameterized stochastic analog circuits. The company projects that these accelerators will outperform digital computers by many orders of magnitude in both runtime and energy efficiency for algorithms based on sampling from complex energy landscapes.
The operating principle of Extropic accelerators is analogous to Brownian motion. In Brownian motion, macroscopic but lightweight particles suspended in a fluid experience random forces from countless collisions with microscopic fluid molecules, causing the particles to diffuse randomly around the vessel. Confined by a tunable potential, such a system becomes a source of programmable randomness: the particle's position settles into a steady-state distribution whose shape is set by the potential.
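As a rough illustration of how a noisy physical system can be programmed to sample, here is a minimal simulation of overdamped Langevin dynamics, the mathematics behind Brownian motion. The potential and parameters are illustrative assumptions, not a model of Extropic's circuits:

```python
import numpy as np

# A minimal sketch, not Extropic's implementation: a particle rolling
# down a potential U(x) while being kicked by thermal noise samples,
# at steady state, from the Boltzmann distribution p(x) ~ exp(-U(x)/T).

def grad_U(x):
    """Gradient of an illustrative double-well potential U(x) = (x^2 - 1)^2."""
    return 4.0 * x * (x**2 - 1.0)

def langevin_samples(n_steps=100_000, dt=1e-3, temperature=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        # Deterministic drift down the energy landscape plus random
        # thermal kicks (the "collisions" of the Brownian analogy).
        noise = np.sqrt(2.0 * temperature * dt) * rng.standard_normal()
        x = x - grad_U(x) * dt + noise
        samples[i] = x
    return samples

# A histogram of these samples concentrates around the wells at x = -1
# and x = +1: reshaping U(x) reprograms the sampled distribution.
samples = langevin_samples()
```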
Extropic’s Superconducting Chips and Software Layer
Extropic's first processors are nano-fabricated from aluminum and run at low temperatures, where they are superconducting. The neurons on these chips exploit the Josephson effect, which arises when two superconductors are coupled through a thin barrier, as a source of nonlinearity. This nonlinearity is required for the device to access non-Gaussian probability distributions, which are necessary to model real-world applications with fat tails.
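For context, the standard Josephson relations (textbook physics, not drawn from the source) show where the nonlinearity comes from: the junction's energy is a cosine of the superconducting phase difference rather than a quadratic:

```latex
% Josephson current-phase relation and junction energy,
% for phase difference \varphi across the junction:
I(\varphi) = I_c \sin\varphi,
\qquad
E(\varphi) = -E_J \cos\varphi,
\qquad
E_J = \frac{\hbar I_c}{2e}
```

A quadratic (harmonic) energy subject to thermal noise would yield only Gaussian fluctuations; the cosine term is what lets the circuit realize fatter-tailed distributions.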
Extropic is also building semiconductor devices that operate at room temperature to reach a larger market. These devices trade the Josephson junction for the transistor, sacrificing some energy efficiency relative to the superconducting devices. In exchange, they can be built using standard manufacturing processes and supply chains, unlocking massive scale.

To support a wide range of hardware substrates, Extropic is also building a software layer that compiles abstract specifications of EBMs down to the relevant hardware control language. This compilation layer is built on the theoretical framework of factor graphs, which specify how large distributions factorize into local chunks. This allows Extropic accelerators to break down and run programs too big to fit on any single analog core; a toy illustration of the idea appears below.
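In this sketch of the factor-graph idea, the names and structure are hypothetical, chosen for illustration (this is not Extropic's compiler or API): a joint distribution over three spins is written as a product of local factors, and each factor could in principle be assigned to a different analog core.

```python
import itertools
import numpy as np

# p(x1, x2, x3) is proportional to f_a(x1, x2) * f_b(x2, x3):
# two local factors that share only the variable x2.
factors = {
    ("x1", "x2"): lambda a, b: np.exp(0.8 * a * b),   # coupling on (x1, x2)
    ("x2", "x3"): lambda b, c: np.exp(-0.5 * b * c),  # coupling on (x2, x3)
}

def unnormalized_p(assignment):
    """Product of local factors; each factor could live on its own core."""
    p = 1.0
    for (u, v), f in factors.items():
        p *= f(assignment[u], assignment[v])
    return p

# Brute-force normalization over spins in {-1, +1} (fine at toy scale).
states = [dict(zip(("x1", "x2", "x3"), s))
          for s in itertools.product((-1, +1), repeat=3)]
Z = sum(unnormalized_p(s) for s in states)
for s in states:
    print(s, unnormalized_p(s) / Z)
```

Because each factor touches only a few variables, a compiler can place factors on separate cores and coordinate them through their shared variables, rather than requiring one core to hold the entire distribution.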
