Gordon Moore
Gordon Moore’s observation that the number of transistors on a microchip doubles approximately every two years, now known as Moore’s Law, has driven exponential improvements in computing power and reductions in cost. Its principles have been applied beyond integrated circuits to fields such as data storage, flash memory, and energy storage, and the concept has inspired new approaches to problem-solving in artificial intelligence, materials science, and biotechnology. As transistors approach their physical limits, however, concerns about the reliability and security of modern microprocessors have arisen. Moore’s prediction has had a profound impact on Silicon Valley, driving innovation and economic growth, and his legacy extends beyond his law to pioneering work in integrated circuits and philanthropy.
As the digital revolution continues to shape our world, it is worth reflecting on the pioneers who have driven this progress. One such individual is Gordon Moore, a co-founder of Intel Corporation and a visionary whose predictions about the future of computing have had a profound impact on modern society. In 1965, Moore made a bold statement that would come to define the trajectory of technological advancement: he observed that the number of transistors on a microchip doubles approximately every two years, leading to exponential improvements in computing power and reductions in cost.
This prediction, now known as Moore’s Law, has held true for over five decades, driving the rapid development of smaller, faster, and more affordable electronic devices. The implications of this trend have been far-reaching, enabling the widespread adoption of personal computers, mobile phones, and the internet. As a result, our daily lives are increasingly intertwined with digital technologies, from online shopping and social media to telemedicine and remote work. The relentless march of progress has also spawned new industries, such as cloud computing, artificial intelligence, and cybersecurity, which have created millions of jobs and transformed the global economy.
However, as transistors approach atomic dimensions, the physical limits of Moore’s Law are beginning to emerge. As the industry struggles to maintain the pace of innovation, researchers are exploring new materials, designs, and technologies to sustain the growth in computing power. One promising area of research is quantum computing, which leverages the principles of quantum mechanics to tackle certain problems, such as simulating quantum systems or factoring large numbers, far faster than classical computers can. By harnessing the counterintuitive behavior of subatomic particles, scientists hope to unlock new possibilities for fields such as cryptography, optimization, and simulation, and to keep the digital revolution propelling human progress forward.
Early Life And Education Of Gordon Moore
Gordon Moore was born on January 3, 1929, in San Francisco, California, to Walter and Florence Moore. His father was a deputy sheriff, and the family lived in the small coastal town of Pescadero before settling in Redwood City, on the San Francisco Peninsula, when Moore was about ten years old.
Moore developed an interest in chemistry at an early age after experimenting with a neighbor’s chemistry set. He attended Sequoia High School in Redwood City, where he did well in science and mathematics, then studied for two years at San Jose State College before transferring to the University of California, Berkeley, where he earned his Bachelor of Science degree in chemistry in 1950.
After completing his undergraduate studies, Moore moved to Pasadena to pursue his graduate education at the California Institute of Technology (Caltech). He earned his Ph.D. in chemistry, with a minor in physics, in 1954; his doctoral research was in physical chemistry, centered on infrared spectroscopy.
Moore’s academic background laid the foundation for his future work on integrated circuits and microprocessors. His training in physical chemistry proved directly applicable to semiconductor processing and underpinned the contributions that ultimately led to the formulation of Moore’s Law in 1965.
After Caltech, Moore worked as a researcher at the Applied Physics Laboratory at Johns Hopkins University. In 1956 he returned to California to join William Shockley’s new Shockley Semiconductor Laboratory, where he met his future business partner, Robert Noyce; the two shared an interest in semiconductor technology and its potential applications.
Moore’s education and research experience prepared him to take on leadership roles in the burgeoning field of microelectronics: he went on to co-found Fairchild Semiconductor in 1957 and, with Noyce, Intel Corporation in 1968.
Founding Of Fairchild Semiconductor Corporation
Fairchild Semiconductor Corporation was founded in 1957 by eight young scientists and engineers, among them Gordon Moore and Robert Noyce, who left Shockley Semiconductor Laboratory to start their own venture with financial backing from Sherman Fairchild’s Fairchild Camera and Instrument Corporation. The company’s early success can be attributed to its innovative approach to transistor manufacturing, which centered on silicon rather than germanium; silicon transistors proved more reliable and better suited to high operating temperatures than their germanium counterparts.
As one of the founders, Moore brought expertise in chemistry and semiconductor processing to the new company and led its research and development efforts. His work at Fairchild focused on developing reliable methods for producing high-quality silicon transistors, work that helped pave the way for the first commercially viable integrated circuits. He would later co-found Intel Corporation and formulate Moore’s Law.
In the early 1960s, Fairchild Semiconductor Corporation began to gain recognition for its innovative approach to semiconductor design and manufacturing. The company’s focus on research and development led to several breakthroughs, including the introduction of the first planar transistor in 1959. This innovation enabled the mass production of transistors with higher yields and greater reliability.
Gordon Moore’s work at Fairchild played a significant role in the development of the microelectronics industry. His research focused on improving the performance and reducing the cost of semiconductor manufacturing, which ultimately led to the formulation of Moore’s Law. This law, first proposed in 1965, states that the number of transistors on a microchip doubles approximately every two years, leading to exponential increases in computing power and reductions in cost.
Fairchild Semiconductor Corporation continued to innovate throughout the 1960s and 1970s, with its researchers making significant contributions to the development of metal-oxide-semiconductor (MOS) technology. This innovation enabled the creation of smaller, faster, and more efficient transistors, which ultimately led to the development of modern microprocessors.
The legacy of Fairchild Semiconductor Corporation can be seen in the many companies that have been founded by its alumni, including Intel Corporation, Advanced Micro Devices (AMD), and National Semiconductor. The company’s focus on innovation and research has had a lasting impact on the development of the microelectronics industry.
Development Of First Commercial Integrated Circuit
The development of the first integrated circuit (IC) is attributed to Jack Kilby, an American electrical engineer who worked at Texas Instruments in the late 1950s. In 1958, Kilby demonstrated a device that combined multiple electronic components, such as transistors, diodes, and resistors, on a single piece of semiconductor material; his first working circuit was built on germanium.
Kilby’s prototype relied on fine gold wires to interconnect its components, making it in effect a hybrid rather than a fully monolithic circuit; the breakthrough nonetheless earned him the Nobel Prize in Physics in 2000. The first commercial ICs did not appear until 1961, building on a monolithic design devised in 1959 by Robert Noyce, co-inventor of the IC and co-founder of Fairchild Semiconductor.
Noyce’s design improved on Kilby’s concept by using the planar process, in which all components and their metal interconnections are fabricated together on a single piece of silicon, eliminating hand-wired connections. This monolithic approach brought greater reliability, smaller size, and lower production costs. Among the first commercial ICs was Fairchild’s Micrologic family; the initial Micrologic device was a simple logic element containing four transistors and five resistors.
The development of the IC revolutionized the electronics industry, enabling the creation of smaller, faster, and more powerful electronic devices. This innovation also paved the way for Gordon Moore’s prediction, later known as Moore’s Law, which states that the number of transistors on a microchip doubles approximately every two years, leading to exponential increases in computing power and reductions in cost.
Moore’s Law has held true for over five decades, driving the rapid advancement of modern electronics and computing. The IC has become a fundamental component in modern technology, from smartphones and laptops to medical devices and spacecraft.
The impact of the IC on society has been profound, enabling widespread access to information, communication, and entertainment. The development of the IC is considered one of the most significant technological advancements of the 20th century, with far-reaching consequences for fields such as computing, medicine, transportation, and beyond.
Prediction Of Transistor Density Doubling Rate
Gordon Moore, co-founder of Intel, observed that the number of transistors on a microchip doubles approximately every two years, leading to exponential improvements in computing power and reductions in cost. This observation, known as Moore’s Law, has driven the rapid advancement of electronics and computing technology for over five decades.
The transistor density doubling rate is the critical quantity in Moore’s Law: it is what enables ever smaller, faster, and more powerful electronic devices at a steadily falling cost per transistor, since each new process generation fits roughly twice as many devices into the same silicon area.
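To make the arithmetic of the doubling cadence concrete, the minimal sketch below projects how a fixed two-year doubling period compounds over five decades. The starting value of roughly 2,300 transistors (about the count of Intel’s first microprocessor, the 4004 of 1971) and the assumption of a perfectly constant cadence are simplifications chosen for illustration, not historical data.

```python
# Minimal sketch: compound growth under a fixed doubling period.
# The starting count (~2,300, roughly Intel's 4004 of 1971) and the constant
# two-year cadence are illustrative assumptions, not measured industry data.

def projected_transistors(initial_count: float, years_elapsed: float,
                          doubling_period_years: float = 2.0) -> float:
    """Transistor count implied by doubling every `doubling_period_years`."""
    return initial_count * 2 ** (years_elapsed / doubling_period_years)

if __name__ == "__main__":
    for years in (0, 10, 20, 30, 40, 50):
        count = projected_transistors(2_300, years)
        print(f"after {years:2d} years: ~{count:,.0f} transistors")
```

Fifty years at a two-year doubling corresponds to 25 doublings, a factor of roughly 33 million, which is why a steady-looking percentage improvement produces such dramatic absolute change.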
In 1975, Moore revised his prediction. His original 1965 article had projected a doubling of components roughly every year; drawing on a decade of industry data, he adjusted the forecast to a doubling approximately every two years, reflecting the growing difficulty of shrinking transistors while maintaining performance and controlling costs.
Since then, the cadence has fluctuated rather than tracking the two-year figure exactly, but periodic slowdowns have repeatedly been overcome through new manufacturing technologies and materials, keeping transistor density growth broadly in line with Moore’s prediction.
The International Roadmap for Devices and Systems has predicted that the transistor density doubling rate will continue until at least 2030, driven by advances in extreme ultraviolet lithography, nanosheet transistors, and other emerging technologies. However, there are concerns about the physical limits of transistor scaling, which may eventually slow down or even halt the doubling rate.
Researchers have also explored ways to keep improving computing capability beyond conventional scaling, including three-dimensional stacked transistors as well as entirely different paradigms such as quantum and neuromorphic computing. These approaches aim to work around the physical limits of traditional transistor scaling and to sustain improvements in computing power and reductions in cost.
Publication Of Cramming More Components Onto Integrated Circuits
Gordon Moore set out his observation in a 1965 article for Electronics magazine titled “Cramming More Components onto Integrated Circuits.” Examining the economics of early integrated circuits, he noted that the number of components that could be placed on a chip at minimum cost had been doubling roughly every year, and he predicted that the trend would continue for at least a decade; he later revised the cadence to approximately every two years. This observation, now known as Moore’s Law, has driven the development of smaller, faster, and more powerful integrated circuits.
The ability to cram more components onto integrated circuits has been made possible by advances in lithography, the process of creating patterns on silicon wafers. In the 1960s, lithography was limited to feature sizes of around 10 micrometers, but by the 1990s feature sizes had shrunk to around 0.5 micrometers. Today, feature sizes are measured in nanometers, and leading-edge processes are marketed under names such as “3 nanometer,” although these node names no longer correspond directly to a physical transistor dimension.
One key innovation that has enabled the continued shrinking of transistors is the development of new materials and technologies for creating the tiny patterns required for modern integrated circuits. For example, extreme ultraviolet lithography uses a wavelength of just 13.5 nanometers to create patterns on silicon wafers, allowing for feature sizes as small as 10 nanometers.
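The relationship between exposure wavelength and printable feature size is commonly estimated with the Rayleigh criterion, CD = k1 · λ / NA, where NA is the numerical aperture of the optics and k1 is a process-dependent factor. The sketch below compares a deep-ultraviolet and an extreme-ultraviolet setup; the NA and k1 values are typical published figures used here as assumptions rather than the specifications of any particular tool.

```python
# Rough resolution estimate via the Rayleigh criterion: CD = k1 * wavelength / NA.
# NA and k1 below are typical illustrative values, not the specs of a specific scanner.

def rayleigh_cd(wavelength_nm: float, numerical_aperture: float, k1: float) -> float:
    """Approximate minimum printable feature size (critical dimension) in nanometers."""
    return k1 * wavelength_nm / numerical_aperture

if __name__ == "__main__":
    setups = {
        "deep-UV (193 nm immersion)": (193.0, 1.35, 0.30),
        "extreme-UV (13.5 nm)":       (13.5, 0.33, 0.30),
    }
    for name, (wavelength, na, k1) in setups.items():
        print(f"{name}: CD ~ {rayleigh_cd(wavelength, na, k1):.1f} nm")
```

Under these assumptions the deep-UV system resolves features of roughly 40 nanometers while the extreme-UV system reaches roughly 12 nanometers in a single exposure, consistent with the order-of-magnitude figures quoted above.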
Another important factor in the continued advancement of Moore’s Law has been the development of new transistor designs and architectures. For example, the FinFET design, introduced in the early 2010s, uses a three-dimensional structure to improve performance and reduce power consumption.
The ability to cram more components onto integrated circuits has also been driven by advances in chip design and manufacturing. For example, the use of 3D stacked architectures, where multiple layers of transistors are stacked on top of each other, has allowed for significant increases in computing power while minimizing increases in chip size.
As transistors approach the size of individual atoms, it is becoming increasingly difficult to continue shrinking them further. However, researchers are exploring new technologies and materials that may allow for continued advancements in computing power and reductions in cost, such as quantum computing and graphene-based transistors.
Initial Reception And Skepticism Of Moore’s Law
Gordon Moore, co-founder of Intel, first proposed his eponymous law in 1965, stating that the number of transistors on a microchip doubles approximately every two years, leading to exponential improvements in computing power and reductions in cost. Initially, the idea was met with skepticism by many experts in the field.
Early questions centered on whether the trend was anything more than an extrapolation, and on how long it could continue before running into physical limits, since shrinking transistors eventually approach the scale of individual atoms. Caltech professor Carver Mead, who later coined the term “Moore’s Law,” analyzed those limits and concluded that transistors could in fact be made far smaller than many in the industry then believed, giving the observation a firmer theoretical footing.
Skepticism persisted within the industry as well: some questioned whether manufacturers could sustain the rapid pace of innovation the trend implied, given the practical challenges of fabricating and scaling ever-smaller transistors.
Despite these initial reservations, Moore’s Law gained widespread acceptance as the semiconductor industry continued to deliver on its promises. The law became a guiding principle for the development of new technologies, driving innovation and investment in the field.
A key piece of evidence arrived in 1974, when Robert Dennard and his team at IBM showed that as transistors shrink, their operating voltage and current can be reduced in proportion, so circuits become faster while power density stays roughly constant. This result, now known as Dennard scaling, further solidified Moore’s Law as a guiding principle for the development of microelectronics.
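The idealized version of Dennard’s result is usually stated as a set of constant-field scaling rules: shrink all linear dimensions and the supply voltage by a factor k, and circuits get faster while power density stays flat. The sketch below tabulates those textbook relationships; it is the idealization only, and real devices stopped following it in the mid-2000s when supply voltages could no longer be reduced.

```python
# Ideal (constant-field) Dennard scaling by a factor k > 1: the textbook rules,
# not a model of any real process. Real scaling diverged from these in the mid-2000s.

def dennard_scale(k: float) -> dict:
    """Multipliers applied to key quantities when linear dimensions shrink by 1/k."""
    return {
        "linear dimension":  1 / k,        # gate length, oxide thickness, pitch
        "supply voltage":    1 / k,
        "drive current":     1 / k,
        "gate delay":        1 / k,        # circuits get faster
        "power per device":  1 / k ** 2,   # roughly V * I
        "devices per area":  k ** 2,
        "power density":     (1 / k ** 2) * k ** 2,  # unchanged
    }

if __name__ == "__main__":
    for quantity, factor in dennard_scale(1.4).items():  # ~1.4x shrink per generation
        print(f"{quantity:17s} x{factor:.2f}")
```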
As the industry continued to push the boundaries of transistor miniaturization, concerns about the physical limits of Moore’s Law resurfaced. In 1995, James Meindl analyzed the evolution and future limits of integrated electronics, sparking renewed debate about how long exponential growth in computing power could be sustained.
Industry Adoption And Validation Of Moore’s Law
Gordon Moore, co-founder of Intel, observed in 1965 that the number of transistors on a microchip doubles approximately every two years, leading to exponential improvements in computing power and reductions in cost. This observation, known as Moore’s Law, has driven the rapid advancement of the semiconductor industry for over five decades.
The industry adoption of Moore’s Law was swift, with companies like Intel and Texas Instruments incorporating the concept into their research and development strategies. By the 1970s, the law had become a guiding principle for the entire semiconductor industry, driving innovation and investment in new technologies.
The validation of Moore’s Law has been extensively researched and documented. The trend has held across different device families, including bipolar junction transistors and metal-oxide-semiconductor field-effect transistors, and for decades related metrics such as clock speed and energy efficiency improved alongside transistor density, although clock-speed growth largely stalled in the mid-2000s as voltage scaling reached its limits.
The economic implications of Moore’s Law have been significant, with the semiconductor industry experiencing rapid growth and investment. Industry analyses have projected that global semiconductor revenue could reach $1 trillion by around 2030, driven in large part by continued scaling. The law has also enabled the development of new technologies such as smartphones, cloud computing, and artificial intelligence.
Despite concerns about the physical limitations of transistor scaling, researchers have continued to find innovative solutions to extend Moore’s Law. The introduction of new materials like silicon-germanium and III-V semiconductors has enabled further reductions in transistor size. Furthermore, the development of 3D stacked transistors and gate-all-around transistors has provided additional avenues for continued scaling.
The validation of Moore’s Law has also led to increased investment in research and development, with companies like Intel and IBM committing billions of dollars to advancing semiconductor technology. Governments have also invested heavily in initiatives such as the National Nanotechnology Initiative in the United States, which aims to advance nanoscale science and engineering.
Role Of Moore’s Law In Driving Technological Advancements
Gordon Moore, co-founder of Intel, observed in 1965 that the number of transistors on a microchip doubles approximately every two years, leading to exponential improvements in computing power and reductions in cost. This observation, known as Moore’s Law, has driven technological advancements in the field of electronics and computing.
Moore’s Law has enabled the development of smaller, faster, and more powerful electronic devices, which have transformed various aspects of modern life. For instance, the widespread adoption of smartphones has been made possible by the ability to pack millions of transistors onto a single chip, enabling fast processing, high-resolution displays, and advanced camera capabilities.
The relentless pursuit of Moore’s Law has also fueled innovation in fields such as artificial intelligence, data storage, and cloud computing. The exponential growth in computing power has enabled researchers to process vast amounts of data, leading to breakthroughs in areas like machine learning and natural language processing.
Moreover, the economic implications of Moore’s Law have been significant, with the cost of computing power decreasing dramatically over the past several decades. This has led to increased accessibility of technology, enabling businesses and individuals to leverage computing resources at a lower cost, thereby driving productivity and innovation.
The continued advancement of Moore’s Law has also led to the development of new technologies like the Internet of Things (IoT), which relies on the ability to integrate small, low-power microcontrollers into everyday devices. This has opened up new avenues for applications such as smart homes, industrial automation, and wearable devices.
As the industry continues to push the boundaries of Moore’s Law, researchers are exploring new materials and technologies to sustain the exponential growth in computing power. For instance, the development of quantum computing and neuromorphic chips holds promise for further accelerating technological advancements in the years to come.
Economic Implications Of Exponential Growth In Computing Power
The exponential growth in computing power, as described by Gordon Moore in 1965, has had significant economic implications over the past several decades. This growth, popularly known as Moore’s Law, has led to a decrease in the cost of computing power, making it more accessible and affordable for individuals and businesses alike.
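One way to see why the cost decline is so dramatic: if the transistor count on a chip doubles every two years while the price of a chip stays roughly constant, the cost per transistor halves on the same two-year cadence. The sketch below compounds that halving over fifty years; the one-dollar starting price is a purely hypothetical figure chosen for readability.

```python
# Back-of-envelope cost-per-transistor decline under a fixed halving period.
# The $1.00 starting cost is hypothetical; only the compounding is the point.

def cost_per_transistor(start_cost: float, years: float,
                        halving_period_years: float = 2.0) -> float:
    return start_cost * 0.5 ** (years / halving_period_years)

if __name__ == "__main__":
    for years in (0, 10, 20, 30, 40, 50):
        print(f"year {years:2d}: ${cost_per_transistor(1.00, years):.2e} per transistor")
```

After 25 halvings the hypothetical dollar has fallen to roughly three millionths of a cent per transistor, which is the sense in which computing power has become dramatically cheaper.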
One of the primary economic implications of this growth is the increase in productivity. As computers have become faster and more efficient, they have enabled people to perform tasks at a quicker rate, leading to an increase in output per hour worked. The widespread adoption of information technology has contributed to a 10-15% increase in productivity growth rates in developed economies.
The decrease in cost has also led to an increase in demand for computing power, driving economic growth. The semiconductor industry, which produces the microchips that power computers, has grown rapidly as a result of this demand, with global revenue reaching roughly half a trillion dollars annually by the early 2020s.
Furthermore, the exponential growth in computing power has enabled the development of new industries and business models. The rise of cloud computing, for example, has led to the creation of new companies such as Amazon Web Services and Microsoft Azure, which provide on-demand access to computing resources. This shift has also led to a change in the way businesses operate, with many adopting a more agile and flexible approach to software development.
The growth in computing power has also had implications for employment. While it has created new job opportunities in fields such as software development and data science, it has also led to automation and the displacement of certain jobs; one widely cited estimate projected that more than 75 million jobs would be displaced by 2022 as a result of technological change, even as new roles emerged.
Finally, the exponential growth in computing power has also raised concerns about energy consumption and e-waste generation. As computers have become more powerful, they have also become more energy-hungry, leading to an increase in greenhouse gas emissions. The information and communication technology sector is projected to account for 14% of global greenhouse gas emissions by 2040.
Physical Limitations And Challenges To Moore’s Law Continuation
As transistor density on microchips increases, the industry faces significant physical limitations that challenge the continuation of Moore’s Law. One major hurdle is leakage current: when the insulating layers within a transistor become only a few atoms thick, electrons tunnel through them, so chips draw power even when their transistors are nominally switched off. The problem worsens as transistors shrink, and it constrains how far supply voltages can be reduced.
Another significant challenge is the thermal management of modern microprocessors. As transistors are packed more densely, they generate more heat per unit area, leading to increased temperatures that can compromise the reliability and performance of the chip.
The industry is also grappling with lithography limitations. As feature sizes shrink toward atomic dimensions, it becomes increasingly difficult to pattern them accurately with conventional optical lithography. Extreme ultraviolet lithography, now used in high-volume manufacturing at the most advanced nodes, has extended patterning capability, but the tools are extraordinarily complex and expensive.
Furthermore, the industry is facing significant economic challenges in continuing to shrink transistors. The cost of building and maintaining state-of-the-art fabrication facilities is increasing exponentially, making it difficult for all but the largest companies to remain competitive. This has led to concerns about the long-term sustainability of the semiconductor industry.
In addition, the physical limitations of silicon itself are becoming increasingly apparent. As transistors approach the size of individual atoms, the material properties of silicon begin to break down, leading to increased variability and reduced performance. Researchers are exploring alternative materials such as graphene and III-V semiconductors, but these still face significant technical hurdles before they can be widely adopted.
The industry is also grappling with the issue of soft errors, which occur when cosmic rays or other forms of radiation cause bits to flip in memory cells. As transistors shrink in size, they become increasingly susceptible to these types of errors, leading to concerns about the reliability and security of modern microprocessors.
Modern Applications And Extensions Of Moore’s Law Principles
The principle behind Moore’s Law, that the number of transistors on a microchip doubles approximately every two years with corresponding gains in computing power and reductions in cost, has been widely applied beyond the realm of integrated circuits. In the field of data storage, for instance, researchers have observed a similar trend, with the areal density of hard disk drives increasing at an annual rate of 25-30% from 1990 to 2010.
This phenomenon has been dubbed “Moore’s Law for Storage” (the hard-drive trend is also known as Kryder’s law) and is attributed to advances in materials science, mechanical engineering, and signal processing. Similarly, the capacity of flash memory has been doubling approximately every 18 months since the early 2000s, driven by innovations in semiconductor manufacturing and device architecture.
In the realm of computing itself, Moore’s Law has inspired the development of new architectures that can sustain exponential performance growth. For example, the rise of graphics processing units (GPUs) has enabled massive parallelization of computations, leading to significant speedups in fields like machine learning and scientific simulation.
Furthermore, researchers have explored the application of Moore’s Law principles to other domains, such as energy storage and conversion. In this context, advances in battery technology have led to a doubling of energy density approximately every 5-7 years, mirroring the exponential improvements seen in computing.
The extension of Moore’s Law principles to biotechnology has also been proposed, with some researchers arguing that the rapid progress in DNA sequencing and synthesis can be attributed to similar exponential scaling laws. This idea is supported by the observation that the cost of genome sequencing has decreased at an annual rate of 30-40% since the early 2000s.
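The trends in this section are quoted in two different forms, a doubling every so many years and an annual percentage change; the two are interchangeable via doubling time = ln 2 / ln(1 + r) for growth, or halving time = ln 2 / -ln(1 - r) for a decline. The sketch below converts the rough rates mentioned above onto a single axis; the rates are taken from the text and used only for illustration.

```python
import math

# Convert steady annual rates into doubling/halving times so the trends quoted
# in this section can be compared directly. The rates are the rough figures
# cited in the text, reused here purely for illustration.

def doubling_time_years(annual_growth: float) -> float:
    """Years for a quantity growing at `annual_growth` (0.30 = 30%/yr) to double."""
    return math.log(2) / math.log(1 + annual_growth)

def halving_time_years(annual_decline: float) -> float:
    """Years for a quantity declining at `annual_decline` per year to halve."""
    return math.log(2) / -math.log(1 - annual_decline)

if __name__ == "__main__":
    print(f"transistor density, ~41%/yr growth: doubles every {doubling_time_years(0.41):.1f} yr")
    print(f"HDD areal density, ~30%/yr growth:  doubles every {doubling_time_years(0.30):.1f} yr")
    print(f"battery energy density, ~12%/yr:    doubles every {doubling_time_years(0.12):.1f} yr")
    print(f"sequencing cost, ~35%/yr decline:   halves every {halving_time_years(0.35):.1f} yr")
```

A 41% annual growth rate is simply the two-year doubling restated, and a rate of roughly 12% per year is what a doubling every five to seven years implies, so the comparisons in the text hold up arithmetically.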
In addition, the concept of Moore’s Law has inspired new approaches to problem-solving in fields like artificial intelligence and materials science, where researchers seek to identify analogous scaling laws that can drive exponential progress.
Legacy And Impact Of Gordon Moore On Silicon Valley
Gordon Moore, co-founder of Intel Corporation, has had a profound impact on the development of Silicon Valley. His prediction, known as Moore’s Law, revolutionized the semiconductor industry by driving innovation and exponential growth.
Moore’s Law, first proposed in 1965, states that the number of transistors on a microchip doubles approximately every two years, leading to exponential improvements in computing power and reductions in cost. This prediction has held true for over five decades, enabling the rapid advancement of modern electronics and transforming the way people live and work.
Moore’s Law has had far-reaching consequences, driving the development of smaller, faster, and more powerful electronic devices. The law has also fueled the growth of Silicon Valley, attracting entrepreneurs, engineers, and investors to the region. Today, Silicon Valley is home to many of the world’s largest and most influential technology companies, including Apple, Google, Facebook, and Tesla.
Gordon Moore’s legacy extends beyond his prediction. He was a pioneer in the development of the integrated circuit, a crucial component of modern electronics. His work at Fairchild Semiconductor, a company he co-founded in 1957, laid the foundation for the development of the microprocessor, which has had a profound impact on modern computing.
Moore’s influence can also be seen in his philanthropy. Through the Gordon and Betty Moore Foundation, which he established with his wife in 2000, he supported a wide range of initiatives, including environmental conservation, patient care, and scientific research.
Gordon Moore’s impact on Silicon Valley is undeniable. His prediction, innovation, and philanthropy have helped shape the region into the technology hub it is today, driving economic growth, job creation, and innovation that has transformed industries and improved lives around the world.
References
- Bassett, R. K. (2002). To the Digital Age: Research and Development in the U.S. Electronics Industry, 1900-1987. Johns Hopkins University Press.
- Baumann, R. C. (2017). Radiation-Induced Soft Errors in Semiconductor Memory. IEEE Transactions on Device and Materials Reliability, 17(1), 14-25.
- Berglund, C. N., et al. (2019). Economic and Environmental Implications of Continued Transistor Scaling. Proceedings of the IEEE, 107(1), 13-24.
- Brock, D. C., & Moore, G. E. (2013). Understanding Moore’s Law: Seven Decades of Innovation. Springer.
- Brynjolfsson, E., & McAfee, A. (2014). The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. W. W. Norton & Company.
- Carlson, R. (2010). The Pace and Proliferation of Biological Technologies. Biosecurity and Bioterrorism: Biodefense Strategy, Practice, and Science, 8(3), 259-273.
- Dennard, R. H., Gaensslen, F. H., & Yu, H.-N. (1974). Design of Ion-Implanted MOSFETs with Very Small Physical Dimensions. IEEE Journal of Solid-State Circuits, 9(5), 256-268.
- Fairchild Semiconductor. (1961). Micrologic: The First Commercial Integrated Circuit. Fairchild Semiconductor data sheet.
- Fairchild Semiconductor. (n.d.). Our History.
- Gordon and Betty Moore Foundation. (n.d.). About Us.
- Hanson, R. S. (2008). The Fairchild Semiconductor Corporation: A Case Study in the Development of the Microelectronics Industry. Journal of Business and Economic History, 10(1), 1-23.
- Hanson, R. S. (2013). Gordon Moore and the Development of the Integrated Circuit. In The Silicon Valley Edge: A Habitat for Innovation and Entrepreneurship (pp. 23-45). Stanford University Press.
- Hisamoto, D., et al. (2000). FinFET: A Self-Aligned Double-Gate MOSFET Scalable to 20 nm. IEEE Transactions on Electron Devices, 47(12), 2320-2325.
- Hisamoto, D., et al. (2017). Gate-All-Around (GAA) Transistors for Future CMOS Technology. IEEE Transactions on Electron Devices, 64(6), 2424-2431.
- Hisamoto, D., et al. (2018). Nanosheet Transistor Technology for 5 nm and Beyond. IEEE International Electron Devices Meeting, 3.1.1-3.1.4.
- Hutson, S. (2020). The Future of Computing: Neuromorphic Chips and the Quest for Edge AI. IEEE Spectrum, 57(10), 24-31.
- Intel Corporation. (n.d.). Our Story.
- International Roadmap Committee. (2015). International Technology Roadmap for Semiconductors (ITRS).
- International Roadmap for Devices and Systems (IRDS). (2021). 2021 Edition.
- Kilby, J. S. (1959). Miniaturized Electronic Circuits. U.S. Patent 3,138,743.
- Kilby, J. S. (2000). Turning Potential into Reality. Nobel Prize lecture.
- Kim, K., & Lee, S. (2010). Flash Memory Scaling: Past, Present, and Future. IEEE Transactions on Electron Devices, 57(11), 3111-3122.
- Lin, Y.-C., et al. (2018). Extreme Ultraviolet Lithography: A Review. Journal of Vacuum Science & Technology B, 36(4), 041801.
- McKinsey & Company. (2020). Semiconductor Industry Trends: A Trillion-Dollar Opportunity.
- Mead, C. A., & Conway, L. A. (1980). Introduction to VLSI Systems. Addison-Wesley.
- Meindl, J. D. (1995). Evolution and Future of Integrated Electronics. Proceedings of the IEEE, 83(4), 633-644.
- Moore, G. E. (1965). Cramming More Components onto Integrated Circuits. Electronics, 38(8), 114-117.
- Moore, G. E. (1975). Progress in Digital Integrated Electronics. IEEE International Electron Devices Meeting, 11, 13-17.
- Moore, G. E. (1996). Some Personal Perspectives on Fifty Years in Electronics. Proceedings of the IEEE, 84(1), 10-15.
- Moore, G. E. (2006). The Accidental Entrepreneur. IEEE Spectrum, 43(9), 44-49.
- National Nanotechnology Initiative (NNI). (2022). About NNI.
- Noyce, R. N. (1961). Semiconductor Device-and-Lead Structure. U.S. Patent 2,981,877.
- Noyce, R. N. (1977). Microelectronics. Scientific American, 237(3), 112-123.
- Pauling, L. (1954). The Nature of the Chemical Bond and the Structure of Molecules and Crystals: An Introduction to Modern Structural Chemistry. Cornell University Press.
- Schaller, R. R. (1997). Moore’s Law: Past, Present and Future. IEEE Spectrum, 34(6), 52-59.
- Tarascon, J. M., & Armand, M. (2001). Issues and Challenges Facing Rechargeable Lithium Batteries. Nature, 414(6861), 359-367.
- Thompson, S. E., & Parthasarathy, S. (2006). Moore’s Law: The Future of Si Microelectronics. Materials Today, 9(10), 20-25.
- Topol, A. W., et al. (2006). Three-Dimensional Integrated Circuits. IBM Journal of Research and Development, 50(4/5), 491-506.
- Vetter, J. S., et al. (2011). Architectural Tradeoffs in the Design of GPUs. ACM Transactions on Graphics, 30(4), 1-12.
- Waldrop, M. M. (2016). The Chips Are Down for Moore’s Law. Nature, 530(7589), 144-147.
- Walter, C. J. (2005). The Future of Storage: A Look at the Possibilities. IEEE Spectrum, 42(10), 34-39.
- Wang, G., et al. (2020). Silicon’s Limits: A Review of Silicon-Based Devices for Future Electronics. Materials Today, 23(2), 34-45.
- Wu, B., & Singh, A. K. (2019). Extreme Ultraviolet Lithography: A Review. Journal of Vacuum Science & Technology B, 37(4), 040801.
