Quantum in Context: Study of Quantum Computing Energy Efficiency

The News: The French National Centre for Scientific Research (CNRS), electric utility company EDF, and quantum computing companies Quandela and Alice & Bob are collaborating to study and optimize the energy use of quantum computing systems, particularly compared to High-Performance Computing systems. See the press release for more details.

Analyst Take: Along with somewhat arcane quantum computing metrics such as coherence time, two-qubit gate fidelity, and qubit count, a new one is becoming significant as quantum computing enters its middle phase: energy consumption. The question is simple: how much energy will it take to execute a particular algorithm on a problem of a given size? Answering it is not easy, and we must make assumptions. While I don’t usually like to cover announcements rather than results, this work must accelerate now. Extrapolating sustainability results into the future is tricky and may involve too many variables to provide accurate forecasts.
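To make the question concrete, here is a minimal back-of-envelope sketch in Python. Every figure in it is a hypothetical placeholder rather than a measured value; the point is only that an energy comparison reduces to average system power multiplied by wall-clock runtime, and that all the difficulty hides in estimating those two inputs for real workloads.

```python
# Back-of-envelope energy comparison. All figures are hypothetical
# placeholders, not measurements of any real system.

def energy_kwh(avg_power_kw: float, runtime_hours: float) -> float:
    """Energy consumed = average system power x wall-clock runtime."""
    return avg_power_kw * runtime_hours

# Hypothetical QPU: dilution refrigerator plus control electronics drawing
# ~25 kW continuously, with a 2-hour end-to-end workload.
quantum_kwh = energy_kwh(avg_power_kw=25.0, runtime_hours=2.0)

# Hypothetical HPC comparison: a 500 kW partition taking 6 hours on the
# same problem.
hpc_kwh = energy_kwh(avg_power_kw=500.0, runtime_hours=6.0)

print(f"Quantum (hypothetical): {quantum_kwh:,.0f} kWh")
print(f"HPC (hypothetical):     {hpc_kwh:,.0f} kWh")
```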

Quantum Computing and Sustainability

We don’t need quantum computers for small problems; for those, they are inefficient and much too slow. Instead, for a given problem or algorithm, there is a size threshold above which quantum computing becomes the better choice. Compared to a classical approach, the quantum solution may be significantly

  • faster,
  • more memory-efficient,
  • less expensive,
  • more accurate, or
  • more energy-efficient.

Note that I said “or” and not “and.” Suppose we fast-forward some safe distance into the future when quantum computing systems provide significant advantages over classical systems. At that time, we may need to trade off some of these factors versus the others. Will you be willing to use half as much energy if the cost or calculation time is twice as high?
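As a toy illustration of the size threshold and the time trade-off, the sketch below pits a hypothetical exponentially scaling classical solver against a hypothetical polynomially scaling quantum solver that carries a large fixed overhead. Both scaling laws and every constant are invented for illustration; they come from no real benchmark.

```python
# Toy crossover illustration: a hypothetical classical solver scaling as
# 2**(n/4) seconds versus a hypothetical quantum solver scaling as n**3
# seconds plus a large fixed overhead. Neither scaling law is real data.

def classical_seconds(n: int) -> float:
    return 2 ** (n / 4)

def quantum_seconds(n: int) -> float:
    return n ** 3 + 1_000_000  # large constant overhead (calibration, I/O)

# Find the smallest problem size where the quantum solver wins.
crossover = next(n for n in range(1, 1000)
                 if quantum_seconds(n) < classical_seconds(n))
print(f"Toy crossover at problem size n = {crossover}")
```

Below the crossover point, the fixed overhead makes the quantum solver the worse choice no matter how favorably it scales, which is the sense in which quantum computers are wasted on small problems.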

If a quantum computing system can do something impossible for a classical system, the question may not be about trade-offs but rather, “Do we want to compute this at all?”

Sustainability is an immense consideration today for AI data centers and their components. All major vendors examine energy use, heat generation, and cooling requirements. We must expect the same to happen for quantum computing.

Vendors are already using energy consumption in their competitive positioning. The usual targets are providers of superconducting systems with large external refrigerators. Note, though, that those who claim to provide “room temperature quantum computing” rarely define the term. The refrigerator in my kitchen works at room temperature, at least on the outside.

Why Compare to High-Performance Computing?

The press release states:

The project, named “Energetic Optimisation of Quantum Circuits” (OECQ) will, in the first phase, compare the energy requirements of high-performance computing (HPC) systems with those of quantum computers.

On the surface, this is reasonable. We use HPC systems to solve computationally intensive problems and believe we will be able to use future quantum computers to do that too. However, there are two problems:

  1. We cannot use the same algorithms on HPC and quantum computing systems.
  2. Today’s HPC systems are large and powerful; quantum systems are small, non-error-corrected, and don’t solve any practical problems we care about.

Just because a problem is hard for classical computers does not mean a quantum computer can do better. From theoretical computer science, we know there are only a few classes of problems where we expect quantum to outperform classical approaches. So, the first order of business for the study should be to define the problem areas common to both architectures and the corresponding algorithms. Remember, classical techniques are constantly improving, and AI data centers are becoming more powerful for general computation.

Within the HPC solutions, there may be particular bottlenecks where quantum can help. It’s not all HPC or all quantum; an integrated mixture of the two will likely provide the best balance of the tradeoffs I list above. (Some authors write “hybrid” instead of “integrated,” but I think the former word is vague and overused.)
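As a minimal sketch of what such an integrated workflow can look like, the Python below keeps the outer optimization loop classical and offloads only one bottleneck, a cost-function evaluation, to a quantum processor. The function submit_to_qpu() is a hypothetical stand-in for a vendor job-submission API, stubbed out here so the sketch actually runs.

```python
# A minimal sketch of an "integrated" HPC + quantum workflow: the classical
# side owns the outer loop and post-processing, and only one bottleneck
# (here, a cost-function evaluation) is offloaded to a QPU. submit_to_qpu()
# is a hypothetical stand-in for a vendor job API, stubbed so this runs.

import random

def submit_to_qpu(params: list[float]) -> float:
    """Hypothetical QPU call; the stub returns a noisy quadratic cost."""
    return sum(p * p for p in params) + random.gauss(0.0, 0.01)

def classical_step(params: list[float], cost: float) -> list[float]:
    """Toy update rule standing in for the HPC-side optimizer."""
    return [p * (1.0 - 0.1 * cost) for p in params]

params = [0.8, -0.3, 0.5]
for _ in range(20):
    cost = submit_to_qpu(params)            # quantum bottleneck
    params = classical_step(params, cost)   # classical outer loop

print(f"Final cost (toy): {submit_to_qpu(params):.4f}")
```

The energy accounting for such a workflow must cover both sides, including the idle power each system draws while waiting for the other.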

Is It Possible to Extrapolate to Solutions with Practical Quantum Advantage?

In the quantum industry, it’s common to hear statements like, “Theoretically, with the new error-correcting codes, we’ll need far fewer qubits than we previously thought.” Note the “theoretically.” Even if we need “only” 100,000 physical qubits, the largest working experimental quantum processing unit (QPU) is the 1,121-qubit IBM Condor, an impressive achievement. Most vendors have fewer than 100 qubits in their QPUs. However, I believe the future is a modular approach using external quantum connections among QPUs. Although several vendors are doing serious work on this front, the modular approach is not yet common.

Can we really extrapolate the energy requirements for a few dozen qubits to 100,000 quantum-networked ones?
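The mechanics of such an extrapolation are deceptively simple, which is part of the danger. In the toy Python model below, a linear power model happily produces a figure for 100,000 qubits, yet nothing fitted at small qubit counts captures the interconnect, cryogenic, and error-correction overheads that would dominate at that scale. Both coefficients are invented for illustration.

```python
# A toy linear power model P(n) = base + per_qubit * n, with both
# coefficients invented for illustration. It produces a number for
# 100,000 qubits, but nothing in a fit on dozens of qubits captures
# the interconnect, cryogenic, and error-correction overheads at scale.

BASE_KW = 15.0       # hypothetical fixed overhead (cryostat, control racks)
PER_QUBIT_KW = 0.02  # hypothetical marginal power per physical qubit

def projected_power_kw(n_qubits: int) -> float:
    return BASE_KW + PER_QUBIT_KW * n_qubits

for n in (50, 1_121, 100_000):
    print(f"{n:>7,} qubits -> {projected_power_kw(n):,.0f} kW (toy model)")
```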

Although I have highlighted several difficulties in comparing today’s quantum systems to HPC and in projecting to future systems, I hope this study lays a comprehensive foundation on which to build better models of energy use. Several efforts are already underway. See, for example, the paper “Are quantum computers really energy efficient?” in Nature Computational Science.

Alice & Bob use superconducting “cat” qubits, while Quandela uses photonic qubits. I expect the study to focus on those modalities and consider the usual variations. I hope the project provides precise details so other vendors can measure their systems and perform extrapolations. It may turn out that neither initial modality is the most energy-efficient. A third party may compute such sustainability ratings across vendors in the future.

Key Takeaway

Sustainability is a prime consideration for all forms of computing today. While quantum computing is nascent, we should study and model the projected energy use of these systems for the problems we most care about. The intentions of this project are good, but there are many practical challenges ahead. The participants must ultimately present the results in a marketing-free and vendor-neutral manner to be accepted by the quantum, IT, and sustainability communities.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author is a former employee of IBM and holds an equity position in the company. The author does not hold an equity position in any other company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

Quantum in Context: A Qubit Primer

Quantum QuickTake: Qubit News from Alice & Bob, Diraq, and Quantinuum

Quantum in Context: The Case for On-Premises Quantum Computers

Author Information

Dr. Bob Sutor

Dr. Bob Sutor is a Consulting Analyst for Futurum and an expert in quantum technologies with 40+ years of experience. Bob is dedicated to evolving quantum to help solve society's critical computational problems. For Futurum, he helps clients understand sophisticated technologies and how to make the best use of them for success in their organizations and industries.

He’s the author of the quantum computing book Dancing with Qubits, published in 2019, with the Second Edition released in March 2024, and of the 2021 book Dancing with Python, an introduction to Python coding for classical and quantum computing. Areas in which he’s worked: quantum computing, AI, blockchain, mathematics and mathematical software, Linux, open source, standards management, product management and marketing, computer algebra, and web standards.
