These powerful machines are remarkably capable, but their appetite for electricity is staggering. By one widely cited estimate, training a single large AI model produces carbon emissions comparable to the lifetime emissions of five cars. GPT-4, trained on a vast swath of the internet's text, reportedly cost over $100 million to train, and it still doesn't speak all that well.
Training isn't the only cost. One data scientist estimated that in January of this year alone, large language models consumed as much electricity as 175,000 people. Training is typically the most energy-intensive phase of an AI model's life cycle, but surging public demand for these services means the electricity bills could stay exorbitant indefinitely.
As demand for artificial intelligence services surges, the environmental toll is becoming too serious to ignore. We are facing an energy crisis, and we clearly need to change course.
Are we going to use artificial intelligence to destroy the Earth? Before answering, it is worth looking at how nature handles computation.
Nature computes constantly, quietly and elegantly, and with remarkable energy efficiency. From trees converting sunlight into food to the human brain processing complex information, nature's computations are both intricate and sustainable. If nature can do it, why can't our machines? Clearly, something about our current approach to artificial intelligence is fundamentally flawed.
Fortunately, there is a glimmer of hope in quantum computing. This emerging field harnesses the principles of quantum mechanics to perform certain complex calculations far faster than traditional computers can. Just as nature appears to exploit quantum effects in photosynthesis, we may be able to leverage quantum computing to run artificial intelligence systems at a fraction of today's energy cost.
Power consumption of high-performance computers
As the overall number and usage of artificial intelligence models continue to grow, it is worth considering the energy needed to power the machines that run these algorithms.
Frontier, currently the world's most powerful supercomputer, draws 21.1 megawatts of power, and its annual electricity bill comes to a staggering $23 million. When engineers at Oak Ridge National Laboratory in Tennessee built Frontier, the office space surrounding the computer had to be converted into a substation to supply enough power. Even when idle, Frontier draws 8 megawatts. For scale, one megawatt can typically power about 1,000 European homes.
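As a sanity check, the reported bill squares with a quick back-of-the-envelope calculation; the electricity rate below is an assumption chosen for illustration, not a published figure:

```python
# Rough annual energy and cost for a machine drawing 21.1 MW around the clock.
power_mw = 21.1
hours_per_year = 24 * 365                   # 8,760 hours
energy_mwh = power_mw * hours_per_year      # about 184,800 MWh per year
assumed_rate_per_mwh = 125                  # assumed ~$0.125/kWh, illustrative only
annual_cost_usd = energy_mwh * assumed_rate_per_mwh
print(f"{energy_mwh:,.0f} MWh/year -> ${annual_cost_usd / 1e6:.1f}M")
```

At that assumed rate, the arithmetic lands almost exactly on the reported $23 million.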
Beyond their sheer energy consumption, these supercomputers have a major environmental impact in the form of emissions. In 2022, China hosted the most supercomputers with 172, followed by the United States with 128. Coal is by far the most common energy source in Asia and is expected to remain so for the next decade. In the United States, fossil fuels still account for roughly 60% of electricity generation. In Europe, solar and wind power together reached parity with nuclear power for the first time in 2022, but Europe hosts only 71 supercomputers. All of this supercomputing contributes to the greenhouse gas emissions that disrupt weather patterns and drive global warming.
Even research scientists trying to reduce their carbon footprint find that supercomputer use puts the goal out of reach. A recent study calculated the carbon footprint of astronomers at an Australian university: on average, each astronomer generated 15 tonnes of emissions from supercomputer use alone, far exceeding the single-digit tonnages from air travel and observatory work.
Improving the efficiency of artificial intelligence using quantum computing
Just as the world is shifting from gasoline-powered cars to electric vehicles, businesses, universities, and governments could turn to quantum computing to cut the carbon footprint of supercomputing. It is a promising route to making artificial intelligence not only smarter but also greener. Today's petascale and exascale supercomputers typically require around 15 to 25 megawatts to operate, while a typical quantum computer consumes about 25 kilowatts, a difference of roughly three orders of magnitude.
We have also seen the emergence of quantum-inspired computing: algorithms that mimic quantum processes but run on classical machines, and that can consume significantly less power than conventional AI systems.
For example, such techniques can be used to improve the memory efficiency of neural networks, easing the burden of deep learning workloads such as convolutional networks, Transformers, and other computationally demanding architectures.
Computers use neural networks to learn tasks by analyzing training examples. A network consists of thousands of densely interconnected processing nodes, organized into layers, with each node assigned its own weights and a threshold. If a node's output exceeds its threshold, the node activates and passes data along to the next layer.
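A minimal sketch of that description in Python follows; the layer sizes, weights, and thresholds are invented for illustration and do not come from any real model:

```python
import numpy as np

rng = np.random.default_rng(0)

def threshold_layer(inputs, weights, threshold):
    """Each node sums its weighted inputs and fires (outputs 1)
    only if that sum exceeds the node's threshold."""
    weighted_sums = inputs @ weights
    return (weighted_sums > threshold).astype(float)

x = rng.random(8)                    # one training example with 8 features
w_hidden = rng.normal(size=(8, 4))   # weights into a 4-node hidden layer
w_out = rng.normal(size=(4, 2))      # weights into a 2-node output layer

hidden = threshold_layer(x, w_hidden, threshold=0.5)
output = threshold_layer(hidden, w_out, threshold=0.0)
print(hidden, output)
```

Training amounts to adjusting those weights and thresholds until the final layer's outputs match the labels on the training examples.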
Modern CPUs and GPUs can support networks with up to 50 layers. Once trained to sufficient accuracy, these networks can classify and cluster data at high speed, performing tasks such as handwriting analysis, speech-to-text transcription, and weather forecasting.
Once quantum computers reach the fault-tolerant era, researchers could use operations on qubits as the artificial neurons of a neural network.
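One way this could work, simulated classically below as a rough sketch, is to encode a neuron's weighted input as a qubit rotation angle, so that the probability of measuring the qubit in state |1> serves as a nonlinear activation. The encoding choice, weights, and inputs here are all illustrative assumptions, not any specific hardware's interface:

```python
import numpy as np

def qubit_neuron(x, w, b):
    """Encode the weighted sum as a rotation angle theta, apply RY(2*theta)
    to |0>, and return the probability of measuring |1>, i.e. sin^2(theta)."""
    theta = np.dot(w, x) + b
    state = np.array([np.cos(theta), np.sin(theta)])  # RY(2*theta)|0>
    return abs(state[1]) ** 2   # nonlinear activation in [0, 1]

x = np.array([0.2, 0.7, 0.1])    # illustrative inputs
w = np.array([0.9, -0.4, 0.3])   # illustrative weights
print(qubit_neuron(x, w, b=0.1))
```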
In the meantime, quantum-inspired techniques already allow companies to run networks with large numbers of neurons per layer at minimal energy cost, significantly reducing overall consumption.
One reason high-performance computing centers are interested in quantum computing is its potential to reduce overall power consumption. As traditional supercomputers grow more powerful, their power consumption climbs almost exponentially. Quantum computers, by contrast, promise computing power that grows exponentially with the number of qubits while their power consumption grows only roughly linearly.
There is still debate over whether quantum computers will actually consume less energy than traditional computers. The supporting infrastructure, notably the cryogenic cooling many designs require, draws significant electricity, and the hardware itself imposes demands of its own.
The Quantum Energy Initiative brings together some 300 participants from 46 countries, spanning fundamental quantum physics to applied technology, hardware to software, and research to industry, with the aim of tracking the energy use of quantum computing as its power grows. The organization's goals include defining energy-based metrics for all quantum technologies and finding ways to minimize the energy cost of quantum processes.
Quantum computing and quantum-inspired computing are already helping to tackle computational energy challenges. Energy companies grapple with intractable optimization and machine learning problems, such as energy market optimization and production forecasting. Early results suggest that quantum approaches can be not only more environmentally friendly but also better suited to these problems than conventional, energy-hungry computing.
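To make the kind of problem concrete, here is a toy on/off scheduling task expressed as a QUBO (quadratic unconstrained binary optimization) and solved with simulated annealing, a classical heuristic in the same family as quantum-inspired solvers; every number in it is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy QUBO: diagonal entries are per-generator costs (negative = worth running),
# off-diagonal entries penalize redundant pairs. All values are invented.
Q = np.array([
    [-3.0,  2.0,  0.0,  1.0],
    [ 2.0, -2.0,  1.0,  0.0],
    [ 0.0,  1.0, -4.0,  2.0],
    [ 1.0,  0.0,  2.0, -1.0],
])

def cost(x):
    return x @ Q @ x

x = rng.integers(0, 2, size=4)  # random initial on/off assignment
temperature = 2.0
for _ in range(2000):
    candidate = x.copy()
    candidate[rng.integers(4)] ^= 1          # flip one generator's state
    delta = cost(candidate) - cost(x)
    if delta < 0 or rng.random() < np.exp(-delta / temperature):
        x = candidate                        # accept improving or lucky moves
    temperature *= 0.999                     # cool slowly

print("assignment:", x, "cost:", cost(x))
```

This toy instance is trivial, but realistic instances with thousands of binary variables are where quantum and quantum-inspired hardware promise both speed and energy savings.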
Quantum computing and quantum-inspired computing are not merely replacements for traditional computing, but necessities. The road to a quantum-driven, energy-efficient artificial intelligence revolution is long and challenging. We are battling global warming and exponentially increasing energy demands. But every step forward in quantum technology brings us closer to the dream of sustainable and intelligent artificial intelligence.