A Brief History of Analog Computers

In the digital era, reverting to analog computing might seem counterintuitive. However, certain computational tasks present challenges that digital computers, bound by their binary nature and von Neumann architecture, struggle to handle efficiently. This realization has sparked renewed interest in analog computing, particularly for complex simulations, optimization problems, and real-time processing tasks.

One area where analog computing is making a comeback is neural networks and artificial intelligence (AI). Neural networks involve computations that mimic the human brain’s operations, which are inherently analog, since neurons employ a form of “fuzzy” logic rather than binary states. Analog computers can simulate this behavior more naturally and efficiently than digital systems, leading to faster and more energy-efficient neural network computations. Research into using analog circuits for deep learning shows promising results, especially in reducing power consumption, a critical factor in today’s energy-conscious tech environment (Ambrogio, S., et al. (2018). “Equivalent-accuracy accelerated neural-network training using analogue memory”. Nature). Here we provide a brief timeline of the history of analog computers, from antiquity to modern-day analog computing start-ups.

Ancient Water-Driven Mechanisms (c. 300 BC – 500 AD)

Water-driven devices, regarded as some of the earliest forms of analog computers, were primarily used by ancient civilizations for timekeeping, astronomy, and irrigation purposes. The Greeks developed sophisticated water clocks, known as “clepsydra,” which were later adopted and enhanced by various cultures, including the Egyptians and Chinese. These instruments often employed a steady flow of water to measure elapsed time, correlating physical quantities (water levels) to the passage of time, a fundamental characteristic of analog computation (Krebs, Robert E. (2004). “Water Clocks”. Groundbreaking Scientific Experiments, Inventions, and Discoveries of the Ancient World).

One remarkable artifact, the Antikythera mechanism (c. 200 BC), believed to be an ancient Greek analog computer, exemplifies the astronomical expertise of ancient civilizations. Discovered in a shipwreck off the coast of Antikythera, Greece, this intricate apparatus of gears and dials was used to predict astronomical positions and eclipses for calendrical and astrological purposes, demonstrating a profound understanding of mechanical computing long before the digital age (Freeth, T., et al. (2006). “Decoding the ancient Greek astronomical calculator known as the Antikythera Mechanism”. Nature).

Mechanical Calculating Machines (17th Century – 19th Century)

The era of mechanical calculating machines commenced in the early 17th century with the invention of the slide rule, often considered one of the earliest analog computers. William Oughtred, an English mathematician, created the slide rule circa 1622, enabling users to perform multiplication and division operations by aligning logarithmic scales. This innovation marked a significant leap in computational tools available at the time and remained widely used until the advent of digital electronics in the mid-20th century (Ifrah, Georges (2001). “The Universal History of Computing: From the Abacus to the Quantum Computer”).

The 19th century witnessed the emergence of more complex mechanical computers, notably Charles Babbage’s Analytical Engine, designed in the 1830s. Although never fully built during Babbage’s lifetime, the Analytical Engine was intended to employ a variety of mechanical components, including gears and levers, to perform arithmetic operations. This machine is often revered as a precursor to modern computers, laying foundational concepts for programmable devices. Babbage’s endeavors underscore the significant advancements in mechanical computing technologies and their influential role in shaping contemporary computational systems (Swade, Doron (2000). “The Cogwheel Brain: Charles Babbage and the Quest to Build the First Computer”).

Slide Rules (17th Century – Mid-20th Century)

The slide rule, an early form of analog computer, was an essential tool for calculation before the advent of digital computers and calculators. Invented around 1622-1630 by the English mathematician and clergyman William Oughtred, building on the logarithms introduced by the Scottish mathematician John Napier, the slide rule allowed users to perform multiplication and division by adding or subtracting lengths on logarithmic scales (Cajori, F., 1920, “William Oughtred, a Great Seventeenth-Century Teacher of Mathematics,” The Scientific Monthly).

A typical slide rule consists of a fixed outer frame, a sliding middle section, and a movable cursor that aligns the scales on the rule, enabling users to perform mathematical computations based on logarithmic relationships. It became an indispensable tool for scientists, engineers, and students, handling calculations that ranged from simple multiplication and division to roots, trigonometry, and logarithms. Its portability and convenience made it a popular choice for calculations on the go.
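
The arithmetic behind the slide rule is easy to state: because log(a·b) = log(a) + log(b), placing two logarithmic scales end to end adds lengths, and the product is read back off the combined length. The short Python sketch below imitates that procedure; the three-significant-figure rounding stands in for reading the scale by eye and is an illustrative assumption, not a property of any particular slide rule.

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply two numbers the way a slide rule does:
    add their logarithms, then convert back."""
    # Each factor is represented as a physical length
    # proportional to its base-10 logarithm.
    length_a = math.log10(a)
    length_b = math.log10(b)

    # Sliding one scale along the other places the two lengths
    # end to end, which adds the logarithms.
    combined_length = length_a + length_b

    # Reading the answer off the scale is the inverse mapping.
    product = 10 ** combined_length

    # A real slide rule is read by eye, so only about
    # three significant figures are recoverable.
    return float(f"{product:.3g}")

if __name__ == "__main__":
    print(slide_rule_multiply(2.0, 3.0))   # 6.0
    print(slide_rule_multiply(3.7, 42.0))  # ~155
```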

Despite its widespread use, the slide rule began to decline in the mid-1970s with the advent of affordable electronic calculators, which provided higher accuracy and ease of use. However, the slide rule remains a significant milestone in the history of computing and engineering education, symbolizing an era when precision engineering and human skill combined to solve complex calculations. It’s still used in educational contexts to demonstrate the principles of logarithms and is cherished by enthusiasts and collectors (Klein, A. R., 1975, “Slide Rule Simplified,” Chemical Engineering).

Electronic Analog Computers (20th Century)

The development of electronic analog computers marked a revolutionary shift in computational technology during the 20th century. These systems, which emerged prominently during World War II, used continuously variable electrical signals to simulate and analyze real-world phenomena. An important precursor was the Differential Analyzer, a mechanical analog computer developed by Vannevar Bush at MIT in the early 1930s. This machine solved differential equations by integration, using wheel-and-disc mechanisms to perform the computations (Mindell, David A. (2002). “Between Human and Machine: Feedback, Control, and Computing Before Cybernetics”).
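
To make “solving differential equations by integration” concrete, the sketch below digitally imitates a single integrator wired in a feedback loop to solve dy/dt = -k·y, much as a differential analyzer feeds an integrator’s output back into its own input. The gain, step size, and initial condition are arbitrary illustrative choices; a real analyzer accumulates continuously with a wheel-and-disc mechanism rather than in discrete steps.

```python
def integrator_loop(k: float = 1.0, y0: float = 1.0,
                    dt: float = 0.001, t_end: float = 5.0):
    """Digitally mimic one analog integrator solving dy/dt = -k * y.

    The integrator continuously accumulates its input; feeding its
    output back (scaled by -k) as that input closes the loop, just
    as a differential analyzer wires integrators together to
    realize an equation.
    """
    y = y0
    t = 0.0
    trace = [(t, y)]
    while t < t_end:
        dy_dt = -k * y          # feedback path: the integrator's input
        y += dy_dt * dt         # the integrator accumulates its input
        t += dt
        trace.append((t, y))
    return trace

if __name__ == "__main__":
    history = integrator_loop()
    t_final, y_final = history[-1]
    # Exact solution is y0 * exp(-k * t); at t = 5 that is ~0.0067.
    print(f"y({t_final:.1f}) ≈ {y_final:.4f}")
```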

Post-World War II, the relevance and capabilities of electronic analog computers expanded, particularly in scientific, industrial, and military applications. These machines were invaluable in scenarios requiring real-time simulation and control, such as aircraft design and nuclear power plant management. However, by the late 20th century, the rapid advancement of digital computers, offering greater precision and versatility, led to a decline in the use of analog systems. Despite this, the principles underlying analog computation continue to influence contemporary research areas, including neural networks and quantum computing (Analog Computing at the Dawn of the Digital Age: The Work of Tom Osborne. IEEE Annals of the History of Computing, 2019).

Modern-Day Analog Computers (21st Century)

In the 21st century, the digital revolution has overshadowed the use of analog computers for most common computational tasks. However, analog computation is experiencing a resurgence in niche areas due to the advantages analog systems offer over their digital counterparts, such as energy efficiency, real-time computation, and the ability to handle certain types of problems more naturally.

One area where analog computation shines is neural network simulation and neuromorphic computing. Neuromorphic chips, such as IBM’s TrueNorth and Intel’s Loihi, mimic the brain’s massively parallel, spiking architecture and draw on analog computational principles (Merolla, P. A., et al., 2014, “A million spiking-neuron integrated circuit with a scalable communication network and interface,” Science). Such chips can simulate neural networks more efficiently than conventional digital processors because they exploit the inherent parallelism and analog, event-driven nature of neuronal computation.
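
As a rough illustration of the continuous, spike-based dynamics such hardware emulates, the sketch below simulates a single leaky integrate-and-fire neuron in plain Python. All constants (membrane time constant, threshold, input current) are arbitrary illustrative values, not parameters of TrueNorth, Loihi, or any other chip.

```python
def leaky_integrate_and_fire(input_current, tau=20.0, v_rest=0.0,
                             v_threshold=1.0, v_reset=0.0, dt=1.0):
    """Simulate one leaky integrate-and-fire neuron.

    The membrane potential integrates incoming current while leaking
    back toward its resting value; crossing the threshold emits a
    spike. Neuromorphic hardware realizes this kind of continuous
    dynamics directly in circuitry instead of step-by-step arithmetic.
    """
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        # Leak toward rest plus drive from the input current.
        dv = (-(v - v_rest) + i_in) / tau
        v += dv * dt
        if v >= v_threshold:          # threshold crossing -> spike
            spike_times.append(step * dt)
            v = v_reset               # reset after firing
    return spike_times

if __name__ == "__main__":
    # Constant drive for 200 ms produces a regular spike train.
    spikes = leaky_integrate_and_fire([1.5] * 200)
    print(f"{len(spikes)} spikes at times (ms): {spikes[:5]} ...")
```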

Many companies are now taking advantage of this new analog computing revolution. Some are large processor manufacturers, such as Intel; others are smaller start-ups, such as BrainChip.