According to the online edition of The New York Times, in 1960, at the International Solid-State Circuits Conference held at the University of Pennsylvania, a young computer engineer named Douglas Engelbart introduced a simple but groundbreaking concept: scaling.
Engelbart argued on theoretical grounds that as circuits shrink, their components switch faster while consuming less energy and costing less to manufacture. Engelbart went on to invent the computer mouse and other foundational personal computing technologies.
In the audience that day was Gordon Moore, who would later co-found Intel. In 1965, Moore quantified the scaling principle, predicting that the number of transistors on a chip would double every year for at least a decade, driving a dramatic increase in computing power.
His prediction, published in the April 1965 issue of the journal *Electronics*, later became known as Moore's Law. It was not a law of physics but an observation about a nascent industry, one that has held for the past half-century.
In the early 1960s, a single transistor, about the width of a cotton fiber, cost roughly $8 in today's dollars. Now a chip the size of a fingernail can integrate billions of transistors, and the price has fallen so far that a single cent buys a great many of them.
The development of computer chips has helped Silicon Valley bring astonishing progress to the world, including PCs, smartphones, and the internet. In recent years, however, the pace of chip development predicted by Moore's Law has slowed. About 10 years ago, chip speeds stopped rising, the interval between new generations of chips lengthened, and the cost per transistor stopped falling.
The New York Times reports that technology experts now expect next-generation chips to arrive more slowly, with the interval between generations stretching to 2.5 to 3 years. They worry that by the mid-2020s, transistors made up of just a handful of molecules will no longer function reliably. Unless new technological breakthroughs occur, the era of Moore's Law will come to an end.
Broadcom Chief Technology Officer Henry Samueli said of Moore's Law, "It's gray-haired and old. Moore's Law isn't dead, but it's about to retire."
In 1975, Moore revised the doubling period for transistor counts to two years. He considered it remarkable that Moore's Law had held for so long, and at a recent conference marking its 50th anniversary he said, "Initially, we considered it to be effective for 10 years, and I think that's long enough."
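The two-year doubling period is simple compound growth. A minimal sketch, not from the article, illustrates the arithmetic; the 1971 starting point of roughly 2,300 transistors (the commonly cited figure for Intel's first microprocessor) is used here only as an assumption for illustration:

```python
def transistors(year, base_year=1971, base_count=2300, doubling_period=2):
    """Predicted transistor count if counts double every `doubling_period` years."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

# Two-year doubling takes a few thousand transistors in 1971
# into the billions by the mid-2010s.
for y in (1971, 1985, 2000, 2015):
    print(y, f"{transistors(y):,.0f}")
```

Under these assumptions the count reaches the billions by 2015, consistent with the fingernail-sized chips described above.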
But one question remains: what happens when this combination of ever-increasing speed, ever-decreasing energy consumption, and lower prices becomes unsustainable?
Robert P. Colwell, a former Intel electronics engineer who led the design of the Pentium chip, says the impact of this situation extends far beyond the computer industry.
"Take the automotive industry as an example," Colwell said. "Moore's Law has driven innovation in the automotive industry over the past 30 years." Most innovations in the automotive industry, such as engine controllers, anti-lock braking systems, navigation, entertainment, and safety systems, have come from increasingly cheaper semiconductors.
Silicon Valley, however, does not share this concern. For over 30 years, the computing industry has promised that computers would get faster, hold more, and cost less. That promise has been credited with enabling the Internet age, and has even fed talk of the singularity, the point at which the processing power of computers surpasses human intelligence.
Physical limits
A chip is composed of metal interconnects and transistors based on semiconductor materials. The width of the most advanced transistors and interconnects is smaller than the wavelength of light, and the size of the most advanced electronic switches is smaller than a biological virus.
Chips are manufactured using photolithography. Since its invention in the late 1950s, photolithography has been continuously evolving. Currently, chip photolithography has progressed to the point of using ultraviolet lasers.
With components and interconnects now shrunk to the size of just a few molecules, engineers are employing computer simulation techniques in chip design. "It's playing with physics," says Walden C. Rhines, CEO of design automation software company Mentor Graphics.
If the "shrinkage" first described by Engelbart cannot continue, how should large chip manufacturers respond? The New York Times suggests that, first, they could turn to software or new chip designs to squeeze more computing power out of the same number of transistors.
In addition, the chip industry is pinning its hopes on new materials. Alex Lidow, a physicist and the CEO of the specialty chipmaker Efficient Power Conversion Corporation, said that other materials could replace silicon, enabling smaller transistors, new types of memory devices, and optical communication equipment.
There are also many new technologies, such as quantum computing, which, if made practical, could greatly increase computing speed, and spintronics, which could push future computing into the era of atomic-scale devices.
Recently, the industry has been very optimistic about a technology called extreme ultraviolet lithography. If successful, it will allow chip manufacturers to use more advanced processes to produce chips, while simplifying the chip manufacturing process. However, this technology has not yet been proven in commercial production.
Earlier this year, Dutch lithography machine manufacturer ASML announced that it had secured a huge order for extreme ultraviolet lithography machines from a US customer. Most industry insiders believe that the customer is Intel, which means that Intel will be one step ahead of other chip manufacturers in terms of manufacturing processes.
Unlike executives at its main competitors, such as Samsung and TSMC, Intel's leadership firmly believes the company can keep reducing chip manufacturing costs for the foreseeable future, and it rejects the view that transistor prices have stabilized.
Nevertheless, Intel cannot completely disregard physics. In July, Intel announced it would delay the adoption of its 10-nanometer process technology until 2017. This disrupts Intel's previous product release cycle of switching to a new manufacturing process one year and adopting a new chip architecture the following year.
Intel CEO Brian Krzanich said in an analyst call, "The two most recent technology transitions have shown that our cycle time for adopting new processes is closer to two and a half years rather than two years."
No more hitching a ride
The New York Times points out that an optimistic view of these issues is that the slowdown in chip development will lead to more intense competition and greater creativity. Many semiconductor manufacturers do not have the advanced manufacturing facilities of the Big Four chipmakers: GlobalFoundries, Intel, Samsung, and TSMC.
Harvard Business School professor David B. Yoffie said that the slowdown in chip manufacturing process development may allow slightly lagging manufacturers to compete in markets that do not require the most advanced performance.
Even if shrinking transistor sizes don't make chips faster or cheaper, they will reduce chip power consumption. Ultra-low-power computer chips, expected to be available in the late 2010s, may not even require battery power in some cases, instead being powered by solar energy, vibration, radio waves, or even sweat.
What kind of products will these chips give rise to? Nobody knows. But product designers will be forced to think differently about what they build, rather than waiting for more powerful chips. Moore's Law has made computers ever smaller, but design itself has seen few breakthroughs. "In the past, designers were lazy," said Tony Fadell, a former Apple executive.
"In the past, we were basically hitching a ride, which was really stupid, but it worked," said physicist Carver Mead.
Perhaps Moore's Law will hold for another 10 years. If not, humanity will simply have to be far more creative.