Moore’s Law refers to the observation that the number of transistors on a microchip doubles approximately every two years, a trend that has driven exponential improvements in computing power for six decades. This principle, articulated by Intel co-founder Gordon Moore in 1965, became the north star of the semiconductor industry—not because it was a law of physics, but because the industry collectively decided to make it happen. Now, at the 60-year mark, that consensus is fracturing.
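The doubling claim above is simple compound growth, and the arithmetic is worth seeing: 60 years at one doubling every two years is 30 doublings, roughly a billion-fold increase. The sketch below is illustrative only; the starting count is an arbitrary assumption, not a figure from the article.

```python
# Back-of-envelope sketch of Moore's Law as compound doubling.
# The starting count is an arbitrary illustrative assumption.

def moores_law_count(start_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count after `years`, doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# 60 years of doubling every two years is 30 doublings:
# about a billion-fold growth (2**30 is roughly 1.07e9).
growth = moores_law_count(1, 60)
```

The exponent, not the starting point, does all the work here, which is why the trend compounds so dramatically over decades.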
Key Takeaways
- Moore’s Law predicted that transistor density would double roughly every two years, a pace the industry sustained for some 60 years.
- The semiconductor industry used Moore’s Law as a roadmap, not a physical law.
- Physics limitations are making further miniaturization increasingly difficult and expensive.
- Chip makers are exploring alternative approaches beyond traditional scaling.
- The end of Moore’s Law does not mean the end of computing progress.
How Moore’s Law Became Industry Gospel
Moore’s Law was never physics—it was prophecy that became self-fulfilling. When Moore made his observation in 1965, he was describing a trend he had noticed in the early semiconductor industry. What made it remarkable was not its accuracy at that moment, but the fact that the entire chip manufacturing ecosystem decided to treat it as a binding commitment. Engineers designed roadmaps around it. Investors funded companies based on it. Entire supply chains optimized for it.
For nearly 50 years, the industry delivered. Transistor counts climbed from thousands to millions to billions. Chip sizes shrank. Power consumption per transistor fell. The cost per transistor plummeted. This wasn’t magic—it was relentless engineering, billions in R&D spending, and incremental breakthroughs in photolithography, materials science, and process control. Each generation required solving problems that seemed impossible until they were solved.
The result was the computing revolution we take for granted: the smartphone in your pocket contains more transistors than existed on Earth in 1990. That would not have happened without Moore’s Law as the organizing principle.
Where Moore’s Law Breaks Down
The problem is simple: you cannot shrink transistors forever. At some point, quantum effects take over. Electrons tunnel through barriers they should not be able to cross. Heat dissipation becomes the limiting factor. Manufacturing costs explode exponentially. We are not quite at the absolute physical limit yet, but we are close enough that the easy gains are gone.
The semiconductor industry has been pushing against these limits for the past decade. Transistor density improvements have slowed. The cost of building a new fabrication plant has crossed into the tens of billions of dollars. Smaller process nodes no longer deliver proportional performance improvements: a 3-nanometer chip is not, as the name might suggest, vastly faster than a 5-nanometer chip. It might be 15 to 20 percent faster, if you are lucky.
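The size of that shortfall is easy to quantify. The sketch below treats node names as if they were literal feature sizes, which modern marketing labels are not, purely to contrast the naive density expectation with the roughly 15 to 20 percent gain cited above; all figures are illustrative assumptions.

```python
# Illustrative contrast between naive node-name scaling and observed gains.
# Node names ("5nm", "3nm") are treated as literal feature sizes here,
# which modern labels are not -- this is back-of-envelope only.

def naive_density_gain(old_nm: float, new_nm: float) -> float:
    """Density increase if transistor area shrank with the square of the node name."""
    return (old_nm / new_nm) ** 2

naive = naive_density_gain(5.0, 3.0)   # ~2.78x more transistors, naively
observed_speedup = 1.15                # ~15 percent, per the figure cited above
shortfall = naive / observed_speedup   # how far reality lags the naive curve
```

Even granting the generous assumptions, the gap between the naive expectation and the observed gain illustrates why each new node buys less than the last.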
This is not a failure of engineering. It is the collision between an exponential curve and physical reality. Moore’s Law assumed you could keep doubling transistor density indefinitely. Physics says otherwise.
What Comes After Moore’s Law
The semiconductor industry is not waiting for Moore’s Law to formally die. It is already pivoting. One approach is to stop chasing pure density and instead focus on specialized chips designed for specific workloads—AI accelerators, graphics processors, neural engines. Another is to stack transistors vertically, adding layers rather than shrinking horizontally. A third is to explore entirely new materials and architectures: photonic chips, quantum processors, neuromorphic designs.
None of these approaches will deliver the predictable, exponential improvement that Moore’s Law provided. That is both the problem and the opportunity. Without a single unifying roadmap, the semiconductor industry will become more fragmented. Different chips will improve at different rates depending on their use case. A gaming GPU might improve faster than a general-purpose CPU. An AI chip might improve faster than both. This is messier than Moore’s Law, but it might be more innovative.
The end of Moore’s Law does not mean the end of computing progress. It means progress will look different: less predictable, more specialized, driven by application-specific needs rather than a universal scaling law.
Does Moore’s Law Still Apply Today?
Moore’s Law technically still holds at the highest level of abstraction—the industry continues to pack more capability into the same space, even if it is not always more transistors. But the spirit of Moore’s Law, the predictable doubling of transistor density every two years, is already dead for practical purposes. We are in the twilight of the era it defined.
What Happens to Computing Without Moore’s Law?
Computing does not stop. Progress does not stop. What changes is the mechanism. For 60 years, you could bet on Moore’s Law: buy a chip today, and in two years a new generation would be twice as dense, faster, and cheaper. That bet is no longer reliable. The future of computing will be driven by specialized architectures, heterogeneous designs, and domain-specific optimization rather than a universal scaling law. It will be harder to predict, but potentially more efficient.
Can Chip Makers Still Improve Performance?
Yes, but not through miniaturization alone. Chip makers can improve performance by redesigning architectures, adding specialized cores, using better materials, and optimizing for specific workloads. A chip designed for AI does not need to be a general-purpose processor—it can be optimized for matrix multiplication and other AI-specific operations. This approach delivers real performance gains without relying on Moore’s Law.
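The matrix-multiplication workload mentioned above can be sketched minimally. The function below is an illustrative, runnable rendering of the multiply-accumulate pattern that accelerators dedicate silicon to; it is not any real accelerator's API, and a hardware matmul unit would execute this pattern across thousands of lanes in parallel.

```python
# Minimal sketch of the matrix-multiply workload AI chips specialize in.
# The inner multiply-accumulate loop is the operation accelerators
# implement in dedicated hardware; this pure-Python version is illustrative.

def matmul(a: list[list[float]], b: list[list[float]]) -> list[list[float]]:
    """Naive matrix multiply: out[i][j] = sum over k of a[i][k] * b[k][j]."""
    rows, inner, cols = len(a), len(b), len(b[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for k in range(inner):      # accumulate over the shared dimension
            aik = a[i][k]
            for j in range(cols):
                out[i][j] += aik * b[k][j]
    return out
```

Because this one loop nest dominates neural-network workloads, a chip that does nothing but multiply-accumulate at scale can outrun a general-purpose processor on AI tasks without any help from smaller transistors.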
The 60-year run of Moore’s Law was extraordinary. It powered the digital revolution, made computing ubiquitous, and reshaped civilization. But it was always a trend, not a law. And trends, by definition, end. The semiconductor industry is already preparing for what comes next—a more fragmented, specialized, and harder-to-predict landscape. It will not be as elegant as Moore’s Law. But it might be more interesting.
This article was written with AI assistance and editorially reviewed.
Source: TechRadar


