What Can We Learn About Quantum Computing Companies From Technology History?

History isn’t always a guide to the future, but it can often help us pick up trends and similarities that might help us predict what comes next. Is there anything we can learn from technology companies of the past, such as brands that are no longer in business? Companies like Sun Microsystems and DEC were computing powerhouses but have faded from view. What happened, and what can we learn?

Some might liken the quantum computing industry to the technology companies of the past and especially some of the companies making processors and chips that have powered the technological revolution. Can quantum companies learn something valuable from the early days of the computer revolution that brought us semiconductor chips, programming languages that are household names, and even the transition from mainframe to desktop computers?

Fairchild Semiconductor

Fairchild Semiconductor was a technology company founded in 1957 by eight engineers who had previously worked at Shockley Semiconductor. William Shockley had won the 1956 Nobel Prize in Physics, but his infamous temper led his employees to look elsewhere for employment. Hence Fairchild was born. The company was known for developing the first commercially successful integrated circuit, revolutionising the computer and electronics industry. Key figures at Fairchild Semiconductor included Robert Noyce, who is often credited as the co-inventor of the microchip, and Gordon Moore, who went on to co-found Intel and is known for Moore’s Law, which predicted that the number of transistors on a chip would double roughly every two years.

Fairchild Semiconductor did not fail outright, but it faced challenges and changes throughout its history. In the 1960s and 1970s, the company faced intense competition from other semiconductor manufacturers and struggled to adapt to changes in the industry. In 1979, Fairchild Semiconductor was acquired by Schlumberger Limited, the oilfield services group, and became a subsidiary. In 1987, the company was sold to National Semiconductor, and a decade later it was spun out again as Fairchild Semiconductor International. In 2016, Fairchild Semiconductor was acquired by ON Semiconductor, a global semiconductor company, and today it operates as part of ON Semiconductor.

Takeaway Lesson(s): William Shockley, despite his brilliance, couldn’t keep his employees. Even Fairchild, the company his disgruntled employees founded, eventually fell away into obscurity. There is an innovation cost to being first, or even second. But if companies don’t invest in people and allow the right culture to flourish, no amount of innovation will keep them in business. Even publicly listed quantum computing companies could be market-leading now yet fail to catch the eventual big wave that carries them through turbulent times.

Sun Microsystems

Sun Microsystems was a technology company founded back in 1982 and known for its work in computer hardware, software, and network technology. It played a significant role in developing the internet and was at the forefront of many technological innovations in the computing industry. Sun created Java, a programming language still used by many companies today. Sun also built large workstations and servers; its SPARC clusters and workstations powered early render farms, such as those used to produce the images for films like Toy Story.

However, Sun Microsystems faced several challenges and ultimately struggled to remain competitive in the rapidly evolving technology market. One of the main factors contributing to the company’s decline was increased competition from other technology companies, such as IBM, Hewlett-Packard, and Dell. Sun Microsystems faced financial challenges, including rising debt and declining profits. In 2010, Sun Microsystems was acquired by Oracle Corporation.

Takeaway Lesson(s): Sun’s failure to adopt open standards, as happened with its SPARC systems, meant it was assailed by more open architectures such as the x86 used in IBM PCs, according to Enterprise Strategy Group analyst Brian Babineau. Proprietary systems are not the way to go.

DEC (Digital Equipment Corporation)

DEC (Digital Equipment Corporation) was a technology company founded in 1957 and known for its work in computer hardware and software systems. The company was a pioneer in the computer industry and played a significant role in developing the minicomputer market with a variety of well-known and well-liked products, such as the PDP range (sold from the 1960s through to the 1990s), although the company never became a household name like IBM.

In 1998, DEC was acquired by Compaq, a major computer company later acquired by Hewlett-Packard (HP). As a result of the acquisition, DEC became a subsidiary of Compaq and was integrated into the parent company. Many of DEC’s products and technologies were absorbed into Compaq’s product portfolio, and the DEC brand was phased out. Today, many of the products and technologies that DEC developed are no longer in use, but the company’s legacy lives on in the products and technologies it developed and introduced during its time as a leading player in the computer industry.

There are several reasons why DEC could not sustain its success and eventually failed. One reason was that the company faced intense competition from other computer companies, particularly in the personal computer market, where DEC struggled to keep up with the rapid pace of technological change.

The company was also thought to have made several strategic mistakes, such as failing to embrace the internet and the emergence of the World Wide Web in the 1990s, which contributed to its decline. Although Microsoft was likewise initially slow to catch on to the web, it eventually got with the program.

Takeaway Lesson(s): DEC, according to Clayton Christensen (author of the well-known book The Innovator’s Dilemma), could not innovate on price. Whilst IBM was able to create a desktop machine for less than $2,000, Digital Equipment Corporation could not compete with machines costing more than $50,000.

Quantum computing companies should be focused on costs even now and should prepare for mainstream quantum computing to ensure the right price points are possible. Of course, much will come down to the qubit technology, and we are yet to see a winner, but we should not be surprised if it ultimately comes down to cost per qubit ($/Q).
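
To make the $/Q idea concrete, here is a minimal sketch, in Python, of how a buyer might compare platforms on cost per qubit, much as we implicitly compare chips on transistors per dollar. The vendor names, prices and qubit counts below are entirely hypothetical, invented for illustration only.

```python
# Hypothetical illustration of a cost-per-qubit ($/Q) comparison.
# All figures are made up for the example; they are not real vendor prices.

platforms = {
    "Vendor A (superconducting)": {"system_cost_usd": 10_000_000, "qubits": 400},
    "Vendor B (trapped ion)":     {"system_cost_usd":  4_000_000, "qubits": 50},
    "Vendor C (photonic)":        {"system_cost_usd":  2_000_000, "qubits": 20},
}

def cost_per_qubit(system_cost_usd: float, qubits: int) -> float:
    """Return the naive $/Q figure: total system cost divided by qubit count."""
    return system_cost_usd / qubits

for name, spec in platforms.items():
    dollars_per_qubit = cost_per_qubit(spec["system_cost_usd"], spec["qubits"])
    print(f"{name}: ${dollars_per_qubit:,.0f} per qubit")
```

A raw $/Q number hides qubit quality (fidelity, connectivity, error-correction overhead), so any real comparison would need to weight qubits by what they can usefully do. But, as with transistors per dollar, a simple price metric is how mass markets tend to keep score.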

The Integrated Circuit was the turning point for Electronics.

The integrated circuit (IC), or microchip, was invented by Jack Kilby in 1958 while he was working at Texas Instruments. Kilby’s invention revolutionized the field of electronics by providing a way to shrink electronic circuits onto a small, flat piece of material, making it possible to create much more complex and powerful electronic devices. Circuits could be etched into silicon, and this remains the mainstay technology today, although transistor features are now measured in nanometres rather than the micrometres of the past.

Before Kilby’s invention, electronic circuits were built using discrete components, such as transistors, resistors, and capacitors, connected using wires. These circuits were large, expensive, and prone to errors. Kilby’s integrated circuit, on the other hand, allowed for the creation of much smaller, more reliable, and more cost-effective electronic devices by allowing all of the components to be fabricated together on a single piece of material.

While Kilby’s first microchip had only a single transistor, subsequent microchips developed in the 1960s and 1970s had hundreds or thousands of transistors, which allowed for the creation of much more powerful and complex electronic devices. Today, microchips used in modern electronic devices often have billions of transistors, are capable of performing a wide range of complex tasks, and are produced by a variety of companies such as AMD, Intel and Texas Instruments.

Kilby’s work on the integrated circuit was recognized with the Nobel Prize in Physics in 2000. Today, integrated circuits are an essential component of computers, smartphones, and countless other devices, and they have had a profound impact on modern society.

Will Quantum Computing Companies Explode in Number When We Get the Integrated Circuit Equivalent?

Not long ago we were dealing with single qubits or a handful of qubits per chip or device; we are now approaching a few hundred. Even so, in integrated circuit terms, are we still in the era of meagre numbers of components per device?

The Intel 4004, developed by Intel Corporation in 1971, was one of the first microprocessors: computer processors designed to be used in small electronic devices. It was the first microprocessor produced commercially and had a total of 2,300 transistors. As yet, there is no commercially produced quantum equivalent. Whilst qubit counts, analogous to transistor counts, are increasing, no quantum processors that we know of are produced at scale. Perhaps the closest might be the SpinQ devices, but nothing approaching a mass market or commercial scale.

Image of the Intel 4004 courtesy of the London Science Museum

The 4004 was a very basic microprocessor with a minimal instruction set and relatively low performance compared to modern microprocessors. However, it was a significant step forward in the development of microprocessors, and it paved the way for the more advanced and powerful microprocessors used in computers and other electronic devices today.

Processors that followed the 4004 included the 8008 (3,100 transistors, introduced in 1972), the 8080 (6,500 transistors, introduced in 1974), and the 8086, which hit the market in 1978. These processors were even more powerful than the 4004 and were used in many computers and other electronic devices.

Innovation that doesn’t survive

One essential thread is that, despite their innovation, these companies didn’t have the longevity of Microsoft, Apple or IBM. What was so different in their DNA?

One thesis running through companies such as DEC, Sun, and Fairchild is that their products never reached the end consumer as mass commercial products, which might be why they never survived, though it’s more complicated than that. These companies never became household names in the way IBM, Apple or Microsoft did. We might see the same with quantum computing companies: there is a race to innovate, but the real winners might be those who can wrap that innovation into something that does more than “just” prove the technology works. Think price points, scale, and knowing the applications, the customers and more. Could a second wave of companies come along that exploits the technology the original quantum companies produced and leapfrogs them, just as Intel leapfrogged the likes of Shockley and Fairchild Semiconductor, which arguably invented the silicon chip?

Perhaps IBM’s creation of the Personal Computer was pivotal in cementing its legacy because it aimed to put a computer on every desk. Later came the home computer revolution, which aimed to put a computer in every home. There has to be an economy of scale that will permit quantum companies to survive, which means a mass market. At the moment, just as in the early days of computing, quantum products and services are not mass market. What do today’s pioneers have in common with those early survivors that will enable them to prosper?

Killer Applications

Quantum computing devices are currently confined to specialist uses. We don’t yet have the application pressure that creates demand for massive quantum workloads. This contrasts with AI, where workloads have created demand for specialist processors such as GPUs. Just look at the explosion of AI, which runs on GPUs and is, of course, becoming a mass market with the likes of ChatGPT.

Intel, AMD, NVIDIA and even IBM transitioned from industrial producer to consumer champion. For example, Intel launched the “Intel Inside” campaign in 1991 to promote its microprocessors directly to consumers, conveying the idea that an Intel chip was a key component of the personal computers and other electronic devices they bought.

Do we need a Killer Application to push Quantum to the next level? Is the entire industry missing a push towards real value creation?

Do today’s quantum hardware creators such as D-Wave, IonQ, and Xanadu have enough flexibility to survive the pressure of eventual market needs? Some have stated that quantum computing is waiting for a killer application to drive adoption; therefore, quantum businesses must be able to provide the services that companies need. We believe the businesses that can vertically integrate are the most likely to be successful.

We think those quantum computing companies that do the following will have the best chance of success:

  • Vertical integration or full-stack. Companies that can take real-world problems and solve them end to end on their own stack will likely deliver the most value, consistently looking for ways to provide a genuine quantum advantage. Who can take a business problem, provide a solution in just a few steps, and run it on quantum hardware at a competitive price? Companies must solve their customers’ pain points.
  • A deep push into applications, with application frameworks and associated services: creating the tools that enable quantum workflows. Think languages, programming frameworks, development environments, and specialist frameworks for solving problems in areas such as chemistry and optimization, plus integrations with business workflows.
  • A culture of taking risks to plug the gaps, capturing more of the process: consultancy, education, outreach and more. Specialisation may only go so far; organisations must be dynamic and nimble to ensure they don’t become technology dinosaurs. They must invest and experiment across the board.
  • Ecosystems. The successful giants of the past made sure there was an ecosystem. They didn’t look to create a stranglehold. Instead, they looked to build a “flywheel” that continued to spin and accelerate as more users came on board. Think marketplaces, sharing, algorithms, knowledge and experience. Look at how AWS (Amazon Web Services) has shaken up the way companies deploy digital assets.
  • The realisation that technological innovation isn’t enough. The quantum industry needs “killer applications”. The industry needs “real” value to be created. It’s not about the number of qubits but how they can create value for end users. That could be a $/Q (cost per qubit) pricing strategy, just as today we effectively buy millions of transistors per dollar when we buy a computer with a microprocessor.

IBM moved into personal computing and started a whole new category of products. Apple launched the iPad, and before it the iPhone, each creating an entirely new product category. Businesses that create innovations, new products and experiments stay alive. Intel survived the push into commoditized hardware by extolling the benefits of its processors to end consumers; whether that works today is another matter, as consumers care more about what they can do with a machine, its battery life, its aesthetics and the applications it can run.