ACM Transactions on Quantum Computing

Volume 5, Issue 3

September 2024

ACM Transactions on Quantum Computing publishes high-impact, original research papers and selected surveys on topics in quantum computing and quantum information science. The journal targets the quantum computer science community with a focus on the theory and practice of quantum computing including but not limited to: models of quantum computing, quantum algorithms and complexity, quantum computing architecture, principles and methods of fault-tolerant quantum computation, design automation for quantum computing, quantum programming languages and systems, distributed quantum computing, quantum networking, quantum security and privacy, and applications (e.g. in machine learning and AI) of quantum computing.

Announcements

Editors’ Suggestion

Learning Quantum Processes and Hamiltonians via the Pauli Transfer Matrix, published in Volume 5, Issue 2, reports new results for efficiently learning the complex processes that characterize quantum systems. The article shows how process learning is possible with linearly many copies of the corresponding Choi state when a quantum memory is available, whereas the case without quantum memory requires exponentially many queries to the unknown process.
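
For context, the Pauli transfer matrix that gives the article its title is the standard representation of an n-qubit channel Λ in the Pauli basis; in LaTeX notation:

    % Pauli transfer matrix (PTM) of an n-qubit channel \Lambda,
    % with P_1, ..., P_{4^n} ranging over the n-qubit Pauli strings:
    \[
      (T_\Lambda)_{ij} = \frac{1}{2^n}\,\mathrm{Tr}\!\bigl[P_i\,\Lambda(P_j)\bigr],
      \qquad i, j \in \{1, \dots, 4^n\}.
    \]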

TQC Accepted for Scopus and Clarivate ESCI Coverage

ACM Transactions on Quantum Computing (TQC) has been accepted for coverage in Elsevier's Scopus and Clarivate's ESCI. Similar to Web of Science, Scopus is an extensive yet selective abstract and citation database that provides comprehensive coverage of peer-reviewed journals, books, conference abstracts, and patents across the natural sciences, social sciences, arts, and humanities. With its content indexed in Scopus, TQC will be discoverable at 7,000 of the world's top research institutions.

ACM Updates Its Peer Review Policy

ACM is pleased to announce that its Publications Board has approved an updated Peer Review Policy. The associated FAQ addresses topics such as confidentiality, the use of large language models in the peer review process, conflicts of interest, and several other relevant concerns. If you have questions that the FAQ does not address, please contact ACM's Director of Publications, Scott Delman.

New ACM Policy on Authorship

ACM has a new Policy on Authorship covering a range of key topics, including the use of generative AI tools. Please familiarize yourself with the new policy and the associated list of Frequently Asked Questions.

Volume 5, Issue 3 (September 2024), Issue-in-Progress. EISSN: 2643-6817

Optimization Applications as Quantum Performance Benchmarks

Quantum Circuits Inc., New Haven, United States

Advanced Network Science Initiative, Los Alamos National Laboratory, Los Alamos, United States

D-Wave Systems Inc, Burnaby, Canada

Department of Physics and Astronomy, University of California Los Angeles, Los Angeles, United States; Theoretical Division (T-4), Los Alamos National Laboratory, Los Alamos, United States; and Research Institute of Advanced Computer Science, Universities Space Research Association, Mountain View, United States

Applied Physics Laboratory, Johns Hopkins University, Baltimore, United States

Research Institute of Advanced Computer Science, Universities Space Research Association, Mountain View, United States; Quantum Artificial Intelligence Laboratory, NASA Ames Research Center, Mountain View, United States; and Purdue University System, West Lafayette, United States

An Algorithm for Reversible Logic Circuit Synthesis Based on Tensor Decomposition

The Affiliated Institute of ETRI, Daejeon, Republic of Korea

Archives of Quantum Computing: Research Progress and Challenges

  • Review article
  • Published: 12 July 2023
  • Volume 31, pages 73–91 (2024)

Vaishali Sood & Rishi Pal Chauhan

Quantum computing is a revolutionary concept among the emerging technologies envisioned by researchers. The interdisciplinary nature of quantum computing drives a cross-pollination of ideas, techniques, and methodologies. To that end, a comprehensive analysis of the literature is conducted to gain insight into the progression of quantum computing research. Our study maps the intellectual landscape of the major research domains in quantum computing, including fiducial quantum state initialization, quantum superposition, quantum coherence, fault tolerance, and quantum algorithms. It assesses the prominence of the field through co-citation network analysis and burst reference analysis to unveil research trends that can be interwoven for the realization of quantum computers. The findings reveal that photons, SQUIDs, nuclear magnetic resonance, semiconductor quantum dots, cryogenic temperatures, quantum machine learning, and support vector machines are the core research areas. Further, a meta-literature analysis of the research domain is carried out to extract evolutionary pathways for future research.
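
To make the method concrete, the following is a minimal sketch of how a co-citation network of the kind the review analyzes can be built with the open-source networkx package; the paper list and reference keys are hypothetical.

    import itertools
    import networkx as nx

    # Each citing paper maps to the references in its bibliography
    # (hypothetical keys, for illustration only).
    bibliographies = {
        "paper_A": ["shor1994", "grover1997", "preskill2018"],
        "paper_B": ["shor1994", "preskill2018"],
        "paper_C": ["grover1997", "preskill2018"],
    }

    # Two references are co-cited whenever they appear together in one
    # bibliography; edge weights count how often that happens.
    G = nx.Graph()
    for refs in bibliographies.values():
        for u, v in itertools.combinations(sorted(set(refs)), 2):
            if G.has_edge(u, v):
                G[u][v]["weight"] += 1
            else:
                G.add_edge(u, v, weight=1)

    # Heavily weighted pairs point to the intellectual core of the field.
    print(sorted(G.edges(data="weight"), key=lambda e: -e[2]))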


Author information

Authors and Affiliations

Department of Physics, National Institute of Technology, Kurukshetra, Haryana, 136119, India

Vaishali Sood & Rishi Pal Chauhan

Corresponding author

Correspondence to Vaishali Sood.

Ethics declarations

Conflict of Interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Sood, V., Chauhan, R.P. Archives of Quantum Computing: Research Progress and Challenges. Arch Computat Methods Eng 31, 73–91 (2024). https://doi.org/10.1007/s11831-023-09973-2

Received: 19 April 2023

Accepted: 26 June 2023

Published: 12 July 2023

Issue Date: January 2024

DOI: https://doi.org/10.1007/s11831-023-09973-2

Quantum Computing

Quantum Computing merges two great scientific revolutions of the 20th century: computer science and quantum physics. Quantum physics is the theoretical basis of the transistor, the laser, and other technologies that enabled the computing revolution. But on the algorithmic level, today's computing machinery still operates on "classical" Boolean logic. Quantum Computing is the design of hardware and software that replaces Boolean logic with quantum law at the algorithmic level. For certain computations such as optimization, sampling, search, or quantum simulation, this promises dramatic speedups. We are particularly interested in applying quantum computing to artificial intelligence and machine learning, because many tasks in these areas rely on solving hard optimization problems or performing efficient sampling.

What’s Next in Quantum is quantum-centric supercomputing

A key factor in classical supercomputing is the intersection of communication and computation. The same holds true for quantum. Quantum-centric supercomputing utilizes a modular architecture to enable scaling. It combines quantum communication and computation to increase system capacity, and uses a hybrid cloud middleware to seamlessly integrate quantum and classical workflows. To realize this next wave in quantum, we are building a new system called Quantum System Two. See our quantum development roadmap.

Program real quantum systems

IBM offers cloud access to the most advanced quantum computers available. Learn, develop, and run programs with our quantum applications and systems.

Qiskit: Open-Source Quantum Development

Qiskit is an open-source SDK for working with quantum computers at the level of pulses, circuits, and application modules.
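
As a taste of the circuit level, here is a minimal sketch of a two-qubit Bell-state program, assuming Qiskit 1.x with the qiskit-aer simulator installed:

    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2)
    qc.h(0)          # put qubit 0 into an equal superposition
    qc.cx(0, 1)      # entangle qubit 1 with qubit 0
    qc.measure_all()

    # Run 1,024 shots; expect the counts split between '00' and '11'.
    counts = AerSimulator().run(qc, shots=1024).result().get_counts()
    print(counts)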

Collaborate with us

IBM Quantum Network is a community of Fortune 500 companies, academic institutions, startups and national research labs working with IBM to advance quantum computing.

Toward a code-breaking quantum computer

The most recent email you sent was likely encrypted using a tried-and-true method that relies on the idea that even the fastest computer would be unable to efficiently break a gigantic number into factors.

Quantum computers, on the other hand, promise to rapidly crack complex cryptographic systems that a classical computer might never be able to unravel. This promise is based on a quantum factoring algorithm proposed in 1994 by Peter Shor, who is now a professor at MIT.

But while researchers have taken great strides in the last 30 years, scientists have yet to build a quantum computer powerful enough to run Shor’s algorithm.

As some researchers work to build larger quantum computers, others have been trying to improve Shor’s algorithm so it could run on a smaller quantum circuit. About a year ago, New York University computer scientist Oded Regev proposed a major theoretical improvement. His algorithm could run faster, but the circuit would require more memory.

Building off those results, MIT researchers have proposed a best-of-both-worlds approach that combines the speed of Regev’s algorithm with the memory-efficiency of Shor’s. This new algorithm is as fast as Regev’s, requires fewer quantum building blocks known as qubits, and has a higher tolerance to quantum noise, which could make it more feasible to implement in practice.

In the long run, this new algorithm could inform the development of novel encryption methods that can withstand the code-breaking power of quantum computers.

“If large-scale quantum computers ever get built, then factoring is toast and we have to find something else to use for cryptography. But how real is this threat? Can we make quantum factoring practical? Our work could potentially bring us one step closer to a practical implementation,” says Vinod Vaikuntanathan, the Ford Foundation Professor of Engineering, a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL), and senior author of a paper describing the algorithm.

The paper’s lead author is Seyoon Ragavan, a graduate student in the MIT Department of Electrical Engineering and Computer Science. The research will be presented at the 2024 International Cryptology Conference.

Cracking cryptography

To securely transmit messages over the internet, service providers like email clients and messaging apps typically rely on RSA, an encryption scheme invented by MIT researchers Ron Rivest, Adi Shamir, and Leonard Adleman in the 1970s (hence the name “RSA”). The system is based on the idea that factoring a 2,048-bit integer (a number with 617 digits) is too hard for a computer to do in a reasonable amount of time.
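
As a toy illustration of the idea (deliberately tiny primes; real deployments use 2,048-bit moduli, which is exactly what makes classical factoring infeasible):

    # Toy RSA walk-through; illustrative only, not secure.
    p, q = 61, 53
    n = p * q                     # public modulus; factoring n breaks RSA
    phi = (p - 1) * (q - 1)
    e = 17                        # public exponent, coprime with phi
    d = pow(e, -1, phi)           # private exponent (requires Python 3.8+)

    message = 42
    ciphertext = pow(message, e, n)    # encrypt: m**e mod n
    recovered = pow(ciphertext, d, n)  # decrypt: c**d mod n
    assert recovered == message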

That idea was flipped on its head in 1994 when Shor, then working at Bell Labs, introduced an algorithm which proved that a quantum computer could factor quickly enough to break RSA cryptography.

“That was a turning point. But in 1994, nobody knew how to build a large enough quantum computer. And we’re still pretty far from there. Some people wonder if they will ever be built,” says Vaikuntanathan.

It is estimated that a quantum computer would need about 20 million qubits to run Shor’s algorithm. Right now, the largest quantum computers have around 1,100 qubits.

A quantum computer performs computations using quantum circuits, just like a classical computer uses classical circuits. Each quantum circuit is composed of a series of operations known as quantum gates. These quantum gates utilize qubits, which are the smallest building blocks of a quantum computer, to perform calculations.

But quantum gates introduce noise, so having fewer gates would improve a machine’s performance. Researchers have been striving to enhance Shor’s algorithm so it could be run on a smaller circuit with fewer quantum gates.

That is precisely what Regev did with the circuit he proposed a year ago.

“That was big news because it was the first real improvement to Shor’s circuit from 1994,” Vaikuntanathan says.

The quantum circuit Shor proposed has a size proportional to the square of the size, in bits, of the number being factored. For a 2,048-bit integer, that works out to millions of gates (roughly 2,048², or about 4.2 million).

Regev’s circuit requires significantly fewer quantum gates, but it needs many more qubits to provide enough memory. This presents a new problem.

“In a sense, some types of qubits are like apples or oranges. If you keep them around, they decay over time. You want to minimize the number of qubits you need to keep around,” explains Vaikuntanathan.

He heard Regev speak about his results at a workshop last August. At the end of his talk, Regev posed a question: Could someone improve his circuit so it needs fewer qubits? Vaikuntanathan and Ragavan took up that question.

Quantum ping-pong

To factor a very large number, a quantum circuit would need to run many times, performing operations that involve computing powers, like 2 to the power of 100.

But computing such large powers is costly and difficult to perform on a quantum computer, since quantum computers can only perform reversible operations. Squaring a number is not a reversible operation, so each time a number is squared, more quantum memory must be added to compute the next square.

The MIT researchers found a clever way to compute exponents using a series of Fibonacci numbers that requires simple multiplication, which is reversible, rather than squaring. Their method needs just two quantum memory units to compute any exponent.

“It is kind of like a ping-pong game, where we start with a number and then bounce back and forth, multiplying between two quantum memory registers,” Vaikuntanathan adds.
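
Here is a classical sketch of that arithmetic idea (the actual construction is a reversible quantum circuit; the function below is only an illustration): two registers hold consecutive Fibonacci powers of x, and each step multiplies them, so no irreversible squaring is ever needed.

    def fib_power(x, k, N):
        """Return x**F_k mod N (F_k is the k-th Fibonacci number, k >= 1),
        using only register-to-register multiplications and mimicking the
        'ping-pong' update (a, b) -> (b, a*b)."""
        a, b = 1, x                # a = x**F_0, b = x**F_1
        for _ in range(k - 1):
            a, b = b, (a * b) % N  # one multiplication per step
        return b

    # Example: F_10 = 55, so this matches ordinary modular exponentiation.
    assert fib_power(3, 10, 1000) == pow(3, 55, 1000)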

They also tackled the challenge of error correction. The circuits proposed by Shor and Regev require every quantum operation to be correct for their algorithm to work, Vaikuntanathan says. But error-free quantum gates would be infeasible on a real machine.

They overcame this problem using a technique to filter out corrupt results and only process the right ones.

The end result is a circuit that is significantly more memory-efficient. In addition, their error-correction technique would make the algorithm more practical to deploy.

“The authors resolve the two most important bottlenecks in the earlier quantum factoring algorithm. Although still not immediately practical, their work brings quantum factoring algorithms closer to reality,” adds Regev.

In the future, the researchers hope to make their algorithm even more efficient and, someday, use it to test factoring on a real quantum circuit.

“The elephant-in-the-room question after this work is: Does it actually bring us closer to breaking RSA cryptography? That is not clear just yet; these improvements currently only kick in when the integers are much larger than 2,048 bits. Can we push this algorithm and make it more feasible than Shor’s even for 2,048-bit integers?” says Ragavan.

This work is funded by an Akamai Presidential Fellowship, the U.S. Defense Advanced Research Projects Agency, the National Science Foundation, the MIT-IBM Watson AI Lab, a Thornton Family Faculty Research Innovation Fellowship, and a Simons Investigator Award.

Preparing for Quantum Computing

By Giordana Verrengia

While quantum computers remain prototypes today, a security measure called post-quantum cryptography (PQC) is already in use — some notable examples being the Google Chrome browser and the internet giant Cloudflare.

Researchers from Carnegie Mellon University, Graz University of Technology in Austria, and Tallinn University of Technology in Estonia have collaborated to identify vulnerabilities in PQC. Their work — which looks at Dilithium, an electronic signature algorithm — is part of a concerted effort among industry professionals to beat the clock and develop a reliable PQC algorithm before quantum computers become readily available, which is expected to be at least 10 years away.

Sam Pagliarini, a special professor of electrical and computer engineering, says there are key differences between applications of classical and quantum computers. The classical devices we use now, like laptops and desktops, will not be replaced. Quantum computers — which are designed to excel at complex calculations — will be used almost exclusively for research purposes in higher education and government settings to solve problems related to mathematics, physics, and chemistry.

Given that quantum devices will be hard to access, why is post-quantum cryptography so important, and why is it currently in use?

Because of a tactic called “store now, decrypt later”: Hackers harvest encrypted data in hopes of acquiring the necessary decryption tools later. Data can be swiped from a classical device and decrypted later with a quantum computer, underscoring the need for industry and government figures to work ahead and introduce a standardized PQC algorithm well before the devices are built.

“PQC isn’t science fiction. It’s serious in the sense that the US government has a mandate in place for every federal agency to switch to a form of communication that is secure against quantum computers. For some, the deadline is as soon as 2025,” Pagliarini says.

One way to test whether PQC algorithms are up to the challenge involves ethical hacking. Pagliarini and his fellow researchers created an algorithm called REPQC to identify security vulnerabilities when Dilithium is implemented as a computer chip. Dilithium's lattice-based structure is important to probe because the National Institute of Standards and Technology chose it for standardization as experts work to advance PQC. Using reverse engineering, the team located where sensitive data was stored on the hardware accelerator and inserted a hardware trojan horse (HTH): additional circuitry that leaked a secret key, which decrypts data and could be used to forge signatures.

“My entire motivation is to find weak spots and bring attention to them,” says Pagliarini. “This research is mostly about protection against a new class of devices, quantum computers, while not losing sight of threats that exist today, such as reverse engineering.”

The multi-university team’s paper, “REPQC: Reverse Engineering and Backdooring Hardware Accelerators for Post-quantum Cryptography,” was accepted to the prestigious 19th ACM ASIA Conference on Computer and Communications Security taking place in Singapore from July 1-5, 2024.

ORNL: Study Seeks to Unite HPC and Quantum Computing for Science

Aug. 29, 2024 — A study by more than a dozen scientists at the Department of Energy’s Oak Ridge National Laboratory examines potential strategies to integrate quantum computing with the world’s most powerful supercomputing systems in the pursuit of science.

The study, published in Future Generation Computing Systems, takes a big-picture look at the states of quantum computing and classical high-performance computing, or HPC, and describes a potential framework for boosting traditional scientific HPC by leveraging the quantum approach.

“It’s kind of a manifesto for how we propose to dive as a laboratory into this new era of computing,” said co-author Rafael Ferreira da Silva, a senior research scientist for ORNL’s National Center for Computational Sciences, or NCCS. “Our approach won’t be the only right way, but we think it will be a useful one that builds on ORNL’s legacy as a leader in supercomputing and that we can adapt as technology evolves and the next generation of computing takes shape.”

ORNL serves as home to the Oak Ridge Leadership Computing Facility, or OLCF, which houses Frontier, the world’s fastest supercomputer, and to the OLCF Quantum Computing User Program, which awards time on privately owned quantum processors around the country to support independent quantum study. The laboratory also leads the DOE’s Quantum Science Center, a national Quantum Information Science Research Center, which combines resources and expertise from national laboratories, universities and industry partners to investigate quantum computing, quantum sensing and quantum materials.

“We have a vast amount of experience here at ORNL in standing up classical supercomputers, dating back more than 20 years,” said Tom Beck, the study’s lead author, who oversees the NCCS Science Engagement Section. “How can we apply that experience and maintain that momentum as we explore this new quantum domain?”

Classical computers store information in bits equal to either 0 or 1. In other words, a classical bit, like a light switch, exists in one of two states: on or off. That binary dynamic doesn’t necessarily fit some complex scientific problems.

“We encounter certain problems in science in which electrons, for example, are coupled between atoms in ways that grow exponentially when we try to model them on a classical computer,” Beck said. “We can adjust formulas and try to tackle those problems in an abbreviated fashion, but we can’t even begin to hope to solve them on a classical computer. The necessary equations and computations are just too complex.”

Quantum computing uses the laws of quantum mechanics to store information in qubits, the quantum equivalent of bits. Qubits can exist in more than one state simultaneously via quantum superposition, which allows qubits to carry more information than classical bits.

Quantum superposition allows a qubit to exist in two possible states at the same time, similar to a spinning coin — neither heads nor tails for the coin, neither one value nor the other for the qubit. Measuring the qubit collapses it to one of the two values, with probabilities determined by the superposition, similar to stopping the coin on heads or tails. That dynamic allows for a wider range of possible values, more like a dial with precise settings than a binary on-off switch.
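
In standard notation, the state and measurement rule the article paraphrases are:

    % A single-qubit state and its Born-rule probabilities; n qubits span
    % a 2^n-dimensional state space, which is why classical simulation of
    % strongly coupled systems blows up exponentially.
    \[
      \lvert\psi\rangle = \alpha\,\lvert 0\rangle + \beta\,\lvert 1\rangle,
      \qquad \lvert\alpha\rvert^2 + \lvert\beta\rvert^2 = 1,
    \]
    \[
      \Pr[\text{measure } 0] = \lvert\alpha\rvert^2,
      \qquad \Pr[\text{measure } 1] = \lvert\beta\rvert^2.
    \]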

“The quantum aspect allows us to represent the problem in a more efficient way and potentially opens up a new way to solve problems that we couldn’t before,” Beck said.

Scientists haven’t yet settled on the most effective technology for encoding qubits, and high error rates remain an obstacle to harnessing quantum computing’s potential. The study proposes developing quantum test beds to explore the various technologies and coupling those test beds with classical machines.

“We don’t want to tie ourselves to any single technology yet because we don’t know what approach will emerge as the best,” Beck said. “But while we’re in this early stage, we need to begin incorporating quantum elements into our computing infrastructure with an eye toward potential breakthroughs. Ultimately, we want to connect these two vastly different types of computers in a seamless way to run the machines together — similar to the hybrid architecture of graphics processing units, or GPUs, and central processing units, or CPUs, that accelerates current leadership-class supercomputers.”

That hybrid architecture, used by supercomputers such as Frontier, integrates the two kinds of processors on each node for the fastest possible computing — GPUs for the repetitive calculations that make up the backbone of most simulations and CPUs for higher-level tasks such as retrieving information and executing other instructions. The technology needed for classical and quantum processors to share space on a node doesn’t yet exist.

The study recommends a high-speed network as the best way to connect classical HPC resources with quantum computers for now.

“There are degrees of integration, and we won’t achieve the ideal right away,” said ORNL’s Sarp Oral, who oversees the NCCS Advanced Technologies Section. “To achieve that ideal, we need to identify which algorithms and applications can take advantage of quantum computing. Our job is to provide better ways to conduct science, and quantum computing can be a tool that serves that purpose.”
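
In practice, the loose network coupling the study recommends tends to look like a classical driver loop that ships each quantum task to a remote QPU service and folds the result back into the classical workflow. The sketch below fakes the quantum call — submit_to_qpu is a hypothetical stand-in whose return value is simulated — purely to show the division of labor.

    import numpy as np

    def submit_to_qpu(theta):
        """Hypothetical network call to a remote quantum backend; the
        'expectation value' is simulated here so the loop is runnable."""
        return float(np.cos(theta))

    theta, lr, eps = 3.0, 0.2, 1e-3
    for _ in range(100):
        e0 = submit_to_qpu(theta)                       # quantum side (remote)
        grad = (submit_to_qpu(theta + eps) - e0) / eps  # finite difference
        theta -= lr * grad                              # classical update (HPC side)
    print(f"theta ~ {theta:.3f}, energy ~ {submit_to_qpu(theta):.3f}")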

Support for this research came from the DOE Office of Science’s Advanced Scientific Computing Research program, the Quantum Science Center, the OLCF, the OLCF’s Quantum Computing User Program and the Department of Defense’s Defense Advanced Research Projects Agency. The OLCF is an Office of Science user facility.

UT-Battelle manages ORNL for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, visit energy.gov/science.

Source: Matt Lakin, ORNL

Leading Solution Providers

Altair

Off The Wire

Industry headlines, august 30, 2024.

  • NSF Awards $1M Grant to New Mexico Universities for Quantum Photonics Computer Research
  • E4 Computer Engineering Joins QuEra Quantum Alliance as Founding Member
  • NVIDIA Announces Financial Results for 2nd Quarter Fiscal 2025
  • Scala to Enhance Hyperscale Data Centers’ Renewable Energy Supply with Wind Power from Brazilian Farms, Starting 2025
  • GlobalFoundries and Efficient Collaborate to Advance Energy-Efficient Computing for Edge Devices
  • US AI Safety Institute Collaborates with Anthropic and OpenAI on AI Safety Research
  • Intel and IBM Deliver Enterprise AI in the Cloud
  • HPE Delivers Energy-Efficient Iridis 6 HPC System to University of Southampton
  • NSF Invests $39M in Quantum Research to Expand Capacity Across US Institutions
  • Supermicro Previews New Max Performance Intel-based X14 Servers for AI and HPC Workloads
  • AWS Parallel Computing Service Now Available for Scalable HPC Workloads

August 28, 2024

  • New MLPerf Inference v4.1 Benchmark Results Highlight Rapid Innovations in GenAI Systems
  • AMD Achieves Strong MLPerf Inference Results with Instinct MI300X GPUs
  • Intel Xeon 6 Demonstrates Enhanced AI Inference Capabilities in MLPerf Testing
  • CIQ Empowers Researchers to Innovate Faster with Fuzzball
  • NVIDIA Blackwell Joins MLPerf Inference with Upgraded Performance Metrics
  • CoreWeave Leads Cloud Market with Launch of NVIDIA H200 Tensor Core GPUs
  • Untether AI Reports High Throughput for PCIe Cards in MLPerf Benchmarks
  • Fujitsu Partners with Osaka University to Advance Quantum Computing Capabilities

More Off The Wire

Off The Wire Industry Headlines

Subscribe to hpcwire's weekly update.

Be the most informed person in the room! Stay ahead of the tech trends with industry updates delivered to you every week!

  • Editor’s Picks
  • Most Popular

Shutterstock 1622080153

AWS Perfects Cloud Service for Supercomputing Customers

Amazon's AWS believes it has finally created a cloud service that will break through with HPC and supercomputing customers. The cloud provider announced the commercial availability of Parallel Computing S Read more…

Shutterstock 1453953692

AI Delivers Swifter Simulations for Modern Science

(Note: The original story is reproduced  here with permission from Sandia Labs.) A good machine-learning algorithm is a powerful research accelerator. Pair it with a computer simulation and it can sniff out mathemati Read more…

research paper about quantum computers

Cricket and HPC: Far, Yet Close

Once the Paris Olympics ended, I turned my attention to the list of sports in the 2028 Los Angeles Olympics. It included cricket. I am a passive cricket enthusiast, but I also thought, "Satya Nadella, Sundar Pichai, a Read more…

research paper about quantum computers

HPC Debrief: James Walker CEO of NANO Nuclear Energy on Powering Datacenters

August 27, 2024

Welcome to The HPC Debrief where we interview industry leaders that are shaping the future of HPC. As the growth of AI continues, finding power for data centers is going to be a major challenge. Nuclear energy will almos Read more…

research paper about quantum computers

CEO Q&A: Acceleration is Quantinuum’s New Mantra for Success

At the Quantum World Congress (QWC) in mid-September, trapped ion quantum computing pioneer Quantinuum will unveil more about its expanding roadmap. Its current state of the art system, H2-1, has 56 qubits, and is no lon Read more…

Shutterstock_2206622211

AMD’s AI Plan: The Nvidia Killer or a Wasted Effort?

August 26, 2024

An AMD call to discuss its $4.9 billion acquisition of ZT Systems provided an inside look into how Lisa Su is building her AI empire. She laid down an AMD AI landscape that is polar opposite to Nvidia's proprietary appro Read more…

Amazon's AWS believes it has finally created a cloud service that will break through with HPC and supercomputing customers. The cloud provider a Read more…

Welcome to The HPC Debrief where we interview industry leaders that are shaping the future of HPC. As the growth of AI continues, finding power for data centers Read more…

At the Quantum World Congress (QWC) in mid-September, trapped ion quantum computing pioneer Quantinuum will unveil more about its expanding roadmap. Its current Read more…

An AMD call to discuss its $4.9 billion acquisition of ZT Systems provided an inside look into how Lisa Su is building her AI empire. She laid down an AMD AI la Read more…

research paper about quantum computers

Breaking Down Global Government Spending on AI

Governments are scrambling to stay ahead of the AI tsunami, and for good reason. Like any other useful technology, AI presents a gigantic economic opportunity f Read more…

research paper about quantum computers

Under the Wire: Nearly HPC News (Aug 22)

August 22, 2024

There has been a bit of delay after our initial launch of Under the Wire. We decided to create some cutting-edge graphics, produce a concurrent video blog, and Read more…

research paper about quantum computers

OpenFold Advances Protein Modeling with AI and Supercomputing Power

Proteins, life’s building blocks, perform a wide range of functions based on their unique shapes. The molecules fold into specific forms and shapes that defin Read more…

research paper about quantum computers

Is the GenAI Bubble Finally Popping?

August 21, 2024

Doubt is creeping into discussion over generative AI, as industry analysts begin to publicly question whether the huge investments in GenAI will ever pay off. T Read more…

research paper about quantum computers

Everyone Except Nvidia Forms Ultra Accelerator Link (UALink) Consortium

May 30, 2024

Consider the GPU. An island of SIMD greatness that makes light work of matrix math. Originally designed to rapidly paint dots on a computer monitor, it was then Read more…

research paper about quantum computers

Atos Outlines Plans to Get Acquired, and a Path Forward

May 21, 2024

Atos – via its subsidiary Eviden – is the second major supercomputer maker outside of HPE, while others have largely dropped out. The lack of integrators and Atos' financial turmoil have the HPC market worried. If Atos goes under, HPE will be the only major option for building large-scale systems. Read more…

research paper about quantum computers

AMD Clears Up Messy GPU Roadmap, Upgrades Chips Annually

June 3, 2024

In the world of AI, there's a desperate search for an alternative to Nvidia's GPUs, and AMD is stepping up to the plate. AMD detailed its updated GPU roadmap, w Read more…

research paper about quantum computers

Nvidia Shipped 3.76 Million Data-center GPUs in 2023, According to Study

June 10, 2024

Nvidia had an explosive 2023 in data-center GPU shipments, which totaled roughly 3.76 million units, according to a study conducted by semiconductor analyst fir Read more…

research paper about quantum computers

Comparing NVIDIA A100 and NVIDIA L40S: Which GPU is Ideal for AI and Graphics-Intensive Workloads?

October 30, 2023

With long lead times for the NVIDIA H100 and A100 GPUs, many organizations are looking at the new NVIDIA L40S GPU, which it’s a new GPU optimized for AI and g Read more…

Shutterstock_1687123447

Nvidia Economics: Make $5-$7 for Every $1 Spent on GPUs

June 30, 2024

Nvidia is saying that companies could make $5 to $7 for every $1 invested in GPUs over a four-year period. Customers are investing billions in new Nvidia hardwa Read more…

research paper about quantum computers

Google Announces Sixth-generation AI Chip, a TPU Called Trillium

May 17, 2024

On Tuesday May 14th, Google announced its sixth-generation TPU (tensor processing unit) called Trillium.  The chip, essentially a TPU v6, is the company's l Read more…

research paper about quantum computers

Intel’s Next-gen Falcon Shores Coming Out in Late 2025 

April 30, 2024

It's a long wait for customers hanging on for Intel's next-generation GPU, Falcon Shores, which will be released in late 2025.  "Then we have a rich, a very Read more…

Contributors

Tiffany Trader

Tiffany Trader

Editorial director.

Douglas Eadline

Douglas Eadline

Managing editor.

John Russell

John Russell

Senior editor.

Kevin Jackson

Kevin Jackson

Contributing editor.

Ali Azhar

Alex Woodie

Addison Snell

Addison Snell

Drew Jolly

Assistant Editor

research paper about quantum computers

IonQ Plots Path to Commercial (Quantum) Advantage

July 2, 2024

IonQ, the trapped ion quantum computing specialist, delivered a progress report last week firming up 2024/25 product goals and reviewing its technology roadmap. Read more…

research paper about quantum computers

Some Reasons Why Aurora Didn’t Take First Place in the Top500 List

May 15, 2024

The makers of the Aurora supercomputer, which is housed at the Argonne National Laboratory, gave some reasons why the system didn't make the top spot on the Top Read more…

research paper about quantum computers

The NASA Black Hole Plunge

May 7, 2024

We have all thought about it. No one has done it, but now, thanks to HPC, we see what it looks like. Hold on to your feet because NASA has released videos of wh Read more…

research paper about quantum computers

How the Chip Industry is Helping a Battery Company

May 8, 2024

Chip companies, once seen as engineering pure plays, are now at the center of geopolitical intrigue. Chip manufacturing firms, especially TSMC and Intel, have b Read more…

research paper about quantum computers

Illinois Considers $20 Billion Quantum Manhattan Project Says Report

There are multiple reports that Illinois governor Jay Robert Pritzker is considering a $20 billion Quantum Manhattan-like project for the Chicago area. Accordin Read more…

research paper about quantum computers

Nvidia H100: Are 550,000 GPUs Enough for This Year?

August 17, 2023

The GPU Squeeze continues to place a premium on Nvidia H100 GPUs. In a recent Financial Times article, Nvidia reports that it expects to ship 550,000 of its lat Read more…

research paper about quantum computers

Top 500: Aurora Breaks into Exascale, but Can’t Get to the Frontier of HPC

May 13, 2024

The 63rd installment of the TOP500 list is available today in coordination with the kickoff of ISC 2024 in Hamburg, Germany. Once again, the Frontier system at Read more…


Spelunking the HPC and AI GPU Software Stacks

June 21, 2024

As AI continues to reach into every domain of life, the question remains as to what kind of software these tools will run on. The choice of software stacks…



UNM Secures $1M NSF Grant to Pave the Way for Room-Temperature Photonic Quantum Computing


The University of New Mexico (UNM) and New Mexico State University have been awarded a $1 million grant from the National Science Foundation (NSF) to develop a photonic quantum computer capable of operating at room temperature. This project, part of the NSF's National Quantum Virtual Laboratory (NQVL) initiative, is one of five nationwide to receive funding. The research team, led by UNM's Marek Osinski and several other distinguished professors, aims to create a fully integrated quantum computing chip utilizing Gaussian boson sampling and photonics.

Traditional quantum computers require extremely low temperatures to function, but this research explores the potential of photonic quantum computers to operate at room temperature, making the technology more accessible and practical. The first year of research will focus on key components, such as electrically pumped quantum dots and dynamically biased avalanche photodiodes, to advance the development of the computer.

The project also emphasizes workforce development and educational opportunities, including the creation of a Quantum Science and Engineering graduate program at UNM and quantum education for local community colleges. The initiative aims to foster local business development in Albuquerque and establish the region as a hub for photonic quantum computing.

For additional information, you can access the press release provided by UNM here.

August 29, 2024



Duke-led Team Building Next-Gen 256-Qubit Quantum Computer Funded by $1M Grant from NSF


Cierra Choucair

August 30, 2024



Insider Brief:

  • Researchers at Duke received a $1 million NSF grant to begin building a 256-qubit quantum computer, with construction planned to start in 2026.
  • The project builds on previous quantum research, including the Software-Tailored Architectures for Quantum co-design (STAQ) and Enabling Practical-scale Quantum Computing (EPiQC) initiatives.
  • This is one of five pilot projects funded under the NSF's National Quantum Virtual Laboratory program, which aims to accelerate quantum technology development and encourage workforce growth.

PRESS RELEASE — Researchers at the Duke Quantum Center are embarking on an ambitious effort to engineer the most powerful quantum computer in the world. If all goes to plan, the collaboration could become the first to demonstrate a quantum system that can outperform a classical computer for a range of scientific applications.

The effort was recently set in motion by a one-year, $1 million grant from the National Science Foundation’s National Quantum Virtual Laboratory (NQVL) program. Along with the pilot program headed by Duke, NQVL is funding four other pioneering projects designed to enable faster discovery and development of use-inspired quantum technologies. 

“The NSF National Quantum Virtual Laboratory represents a new approach NSF is taking to facilitate the complex and multistep process of translating new scientific ideas into fully developed technologies that benefit society,” says acting NSF Assistant Director for Mathematical and Physical Sciences Denise Caldwell.

Called the Quantum Advantage-Class Trapped Ion system (QACTI), the quantum computer being pursued by the Duke-led group builds on foundations laid through the Software-Tailored Architectures for Quantum co-design (STAQ) project. Originally funded with $15 million by the NSF in 2018, STAQ was recently renewed in 2024 with an additional $17 million in funding through 2029. It also builds on work conducted through the University of Chicago’s Enabling Practical-scale Quantum Computing (EPiQC) Expedition, of which Duke is also a part.

Responsive Image

These grants have supported researchers working to build quantum computers with qubits made from trapped ions. Each qubit is analogous to a single bit in a classical computer, but with much more dexterity and power. Because qubits can be both a 1 and a 0 at the same time, they have the potential to solve complex problems that classical computers never could.

“The STAQ program team has already tested many ion trap designs and control systems, achieving control of a chain of 23 qubits with plans to soon reach more than 50,” said Ken Brown, the Michael J. Fitzpatrick Distinguished Professor of Engineering at Duke and the leader of the new QACTI program. “This new funding will allow us to work with the broader community to identify technical challenges and potential solutions with the goal of starting construction of a 256-qubit machine in 2026.”

Joining Duke in this quest are original STAQ collaborators from the University of Chicago, North Carolina State University and Tufts University. The NQVL project also includes new collaborations with North Carolina Agricultural and Technical State University to strengthen their expertise on control systems and to help shape the group’s workforce development plans.

While the NQVL grant may seem small compared to previous quantum-related grants, it holds the potential to grow into a much larger effort than STAQ. After the first phase is complete, participants have the opportunity to proceed to future funding rounds focused on design and implementation, with a potential budget increase of up to $10 million per year.

As these five projects mature alongside another five expected to be announced later this year, those chosen for additional funding will grow together and serve as a federated resource, bringing together assets that will enable a diversity of quantum-focused research and development.

NQVL will broaden access to specialized research infrastructure by functioning as a geographically distributed national resource. NQVL will grow and adapt to seize emerging opportunities and accelerate the translation of fundamental science and engineering into practical applications codesigned by a broad and diverse user community that spans computing, networking and sensing.

“U.S. competitiveness hinges on accelerating the translation of technological innovations into the market and society, as well as training the American workforce for the jobs of tomorrow,” said Erwin Gianchandani, NSF assistant director for Technology, Innovation and Partnerships. “Through NQVL, NSF will invest in resources that will allow for research and experimentation of novel quantum technologies, opening new opportunities across a range of disciplines from new material discovery to health care interventions, all while providing critical workforce development opportunities to fill the quantum jobs anticipated over the next decade.”

SOURCE: https://pratt.duke.edu/news/duke-nqvl-grant/





Perspective
Published: 15 September 2022

Challenges and opportunities in quantum machine learning

M. Cerezo, Guillaume Verdon, Hsin-Yuan Huang, Lukasz Cincio & Patrick J. Coles

Nature Computational Science 2, 567–576 (2022)


Subjects: Computational science; Information theory and computation; Quantum information

At the intersection of machine learning and quantum computing, quantum machine learning has the potential to accelerate data analysis, especially for quantum data, with applications for quantum materials, biochemistry and high-energy physics. Nevertheless, challenges remain regarding the trainability of quantum machine learning models. Here we review current methods and applications for quantum machine learning. We highlight differences between quantum and classical machine learning, with a focus on quantum neural networks and quantum deep learning. Finally, we discuss opportunities for quantum advantage with quantum machine learning.



The recognition that the world is quantum mechanical has allowed researchers to embed well-established but classical theories into the framework of quantum Hilbert spaces. Shannon's information theory, the basis of communication technology, has been generalized to quantum Shannon theory (or quantum information theory), opening up the possibility that quantum effects could make information transmission more efficient 1. The field of biology has been extended to quantum biology to allow a deeper understanding of biological processes such as photosynthesis, smell and enzyme catalysis 2. Turing's theory of universal computation has been extended to universal quantum computation 3, potentially leading to exponentially faster simulations of physical systems.

One of the most successful technologies of this century is machine learning (ML), which aims to classify, cluster and recognize patterns in large datasets. Learning theory has been developed alongside ML technology to understand and improve upon its success. Concepts such as support vector machines, neural networks and generative adversarial networks have impacted science and technology in profound ways. ML is now so ingrained in society that any fundamental improvement to it yields tremendous economic benefit.

Similarly to other classical theories, ML and learning theory can in fact be embedded into the quantum-mechanical formalism. Formally speaking, this embedding leads to the field known as quantum machine learning (QML) 4,5,6, which aims to understand the ultimate limits of data analysis allowed by the laws of physics. Practically speaking, the advent of quantum computers, with the hope of achieving a so-called quantum advantage (as defined below) for data analysis, is what has made QML so exciting. Quantum computing exploits entanglement, superposition and interference to perform certain tasks with substantial speedups over classical computing, sometimes even exponentially faster. While such a speedup has already been observed for a contrived problem 7, achieving it for data science remains uncertain even at the theoretical level; doing so is one of the main goals of QML.

In practice, QML is a broad term that encompasses all of the tasks shown in Fig. 1 . For example, ML can be applied to quantum applications such as discovering quantum algorithms 8 or optimizing quantum experiments 9 , 10 , or a quantum neural network (QNN) can be used to process either classical or quantum information 11 . Even classical tasks can be viewed as QML when they are quantum inspired 12 . We note that the focus of this Perspective will be on QNNs, quantum deep learning and quantum kernels, even though the field of QML is quite broad and goes beyond these topics.

Figure 1: QML is usually considered for four main tasks, in which the data are either classical or quantum and the algorithm is either classical or quantum. Top left: tensor networks are quantum-inspired classical methods that can analyze classical data. Top right: unitary time-evolution data U from a quantum system can be classically compiled into a quantum circuit. Bottom left: handwritten digits can be mapped to quantum states for classification on a quantum computer. Bottom right: molecular ground-state data can be classified directly on a quantum computer; the figure shows the dependence of the ground-state energy E on the distance d between the atoms.

After the invention of the laser, it was called a solution in search of a problem. To some degree, the situation with QML is similar: the full range of applications of QML is not yet known. Nevertheless, it is possible to speculate that all the areas shown in Fig. 2 will be impacted by QML. For example, QML will likely benefit chemistry, materials science, sensing and metrology, classical data analysis, quantum error correction and quantum algorithm design. Some of these applications produce data that are inherently quantum mechanical, and hence it is natural to apply QML (rather than classical ML) to them.

Figure 2: QML has been envisioned to bring a computational advantage in many applications. QML can enhance quantum simulation for chemistry (for example, molecular ground states 110, equilibrium states 47 and time evolution 112) and materials science (for example, quantum phase recognition 11 and generative design with a target property in mind 130). QML can enhance quantum computing by learning quantum error correction codes 11,109 and syndrome decoders, performing quantum control, learning to mitigate errors and compiling and optimizing quantum circuits. QML can enhance sensing and metrology 46,104,105,106,107 and extract hidden parameters from quantum systems. Finally, QML may speed up classical data analysis, including clustering and classification.

While there are similarities between classical and quantum ML, there are also some differences. Because QML employs quantum computers, noise from these computers can be a major issue. This includes hardware noise such as decoherence as well as statistical noise (that is, shot noise) that arises from measurements on quantum states. Both of these noise sources can complicate the QML training process. Moreover, nonlinear operations (for example, neural activation functions) that are natural in classical ML require more careful design of QML models due to the linearity of quantum transformations.

For the field of QML, the immediate goal for the near future is demonstrating quantum advantage, that is, outperforming classical methods, in a data science application. Achieving this goal will require keeping an open mind about which applications will benefit most from QML (for example, it may be an application that is inherently quantum mechanical). Understanding how QML methods scale to large problem sizes will also be required, including analysis of trainability (gradient scaling) and prediction error. The availability of high-quality quantum hardware 13 , 14 will also be crucial.

Finally, we note that QML provides a new way of thinking about established fields, such as quantum information theory, quantum error correction and quantum foundations. Viewing such applications from a data science perspective will likely lead to new breakthroughs.

As shown in Fig. 3 , QML can be used to learn from either classical or quantum data, and thus we begin by contrasting these two types of data. Classical data are ultimately encoded in bits, each of which can be in a 0 or 1 state. This includes images, texts, graphs, medical records, stock prices, properties of molecules, outcomes from biological experiments and collision traces from high-energy physics experiments. Quantum data are encoded in quantum bits, called qubits, or higher-dimensional analogs. A qubit can be represented by the states |0〉, |1〉 or any normalized complex linear superposition of these two. Here, the states contain information obtained from some physical process such as quantum sensing 15 , quantum metrology 16 , quantum networks 17 , quantum control 18 or even quantum analog–digital transduction 19 . Moreover, quantum data can also be the solution to problems obtained on a quantum computer: for example, the preparation of various Hamiltonians’ ground states.

Figure 3: a, The classical data x (here images of cats and dogs) are encoded into a Hilbert space via some map x → |ψ(x)⟩. Ideally, data from different classes (represented by dots and stars) are mapped to different regions of the Hilbert space. b, Quantum data |ψ⟩ can be directly analyzed on a quantum device; here the dataset is composed of states representing metallic or superconducting systems. c, The dataset is used to train a QML model. Two common paradigms in QML are QNNs and quantum kernels, both of which allow for classification of either classical or quantum data; in kernel methods we fit a decision hyperplane that separates the classes. d, Once the model is trained, it can be used to make predictions.

In principle, all classical data can be efficiently encoded in systems of qubits: a classical bitstring of length n can be easily encoded onto n qubits. However, the same cannot be said for the converse, since one cannot efficiently encode quantum data in bit systems; that is, the state of a general n-qubit system requires \(2^n - 1\) complex numbers to be specified. Hence, systems of qubits (and more generally the quantum Hilbert space) constitute the ultimate data representation medium, as they can encode not only classical information but also quantum information obtained from physical processes.
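
To make this asymmetry concrete, here is a minimal Python sketch (not from the paper; plain NumPy, with all names chosen for illustration) contrasting the cheap encoding of a classical bitstring as a computational basis state with the exponential amplitude count of a general n-qubit state:

```python
import numpy as np

def basis_state(bits):
    """Encode a classical bitstring as the corresponding computational
    basis state, represented as a dense statevector of 2**n amplitudes."""
    n = len(bits)
    index = int("".join(str(b) for b in bits), 2)  # binary string -> integer index
    state = np.zeros(2 ** n, dtype=complex)
    state[index] = 1.0  # a single nonzero amplitude: encoding bits is cheap
    return state

print(basis_state([1, 0, 1]))  # 8-dimensional vector with a 1 at index 5

# The converse direction is not cheap: a general n-qubit state needs
# on the order of 2**n complex amplitudes to be written down classically.
n = 30
print(f"Amplitudes for a general {n}-qubit state: {2 ** n:,}")
```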

In a QML setting, the term quantum data refers to data that are naturally already embedded in a Hilbert space \(\mathcal{H}\). When the data are quantum, they are already in the form of a set of quantum states {|ψ_j⟩} or a set of unitaries {U_j} that could prepare these states on a quantum device (via the relation |ψ_j⟩ = U_j|0⟩). On the other hand, when the data x are classical, they first need to be encoded in a quantum system through some embedding map x_j → |ψ(x_j)⟩, with |ψ(x_j)⟩ in \(\mathcal{H}\). In this case, the hope is that the QML model can solve the learning task by accessing the exponentially large dimension of the Hilbert space 20,21,22,23.
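
As a concrete illustration of such an embedding map, the following sketch uses PennyLane, one of several circuit-level frameworks (the Perspective does not prescribe any particular library), to realize a simple angle encoding x → |ψ(x)⟩; the qubit count and feature values are arbitrary:

```python
import numpy as np
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def embed(x):
    """Map a classical feature vector x to a quantum state |psi(x)>
    using a simple angle embedding: one single-qubit rotation per feature."""
    qml.AngleEmbedding(x, wires=range(n_qubits), rotation="Y")
    return qml.state()

x = np.array([0.1, 0.5, 0.9, 1.3])
psi = embed(x)               # 2**4 = 16 complex amplitudes
print(np.linalg.norm(psi))   # 1.0: the embedding produces a normalized state
```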

One of the most important and reasonable conjectures to make is that the availability of quantum data will substantially increase in the near future. As researchers use the quantum computers that are now available, more quantum problems will be solved and more quantum simulations performed. These computations will produce quantum datasets, and hence it is reasonable to expect a rapid rise in quantum data. Note that, in the near term, these quantum data will be stored on classical devices in the form of efficient descriptions of the quantum circuits that prepare the datasets.

Finally, as our level of control over quantum technologies progresses, coherent transduction of quantum information from the physical world to digital quantum computing platforms may be achieved 19 . This would quantum mechanically mimic the main information acquisition mechanism for classical data from the physical world, this being analog–digital conversion. Moreover, we can expect that the eventual advent of practical quantum error correction 24 and quantum memories 25 will allow us to store quantum data on quantum computers themselves.

Analyzing and learning from data requires a parameterized model, and many different models have been proposed for QML applications. Classical models such as neural networks and tensor networks (as shown in Fig. 1 ) are often useful for analyzing data from quantum experiments. However, due to their novelty, we will focus our discussion on quantum models using quantum algorithms, where one applies the learning methodology directly at the quantum level.

Similarly to classical ML, there exist several different QML paradigms: supervised learning (task based) 26,27,28, unsupervised learning (data based) 29,30 and reinforcement learning (reward based) 31,32. While each of these fields is exciting and thriving in itself, supervised learning has recently received considerable attention for its potential to achieve quantum advantage 26,33, its resilience to noise 34 and its good generalization properties 35,36,37, which make it a strong candidate for near-term applications. In what follows we discuss two popular QML models, QNNs and quantum kernels, shown in Fig. 3, with a particular emphasis on QNNs, as these are the primary ingredient of several supervised, unsupervised and reinforcement learning schemes.

Quantum neural networks

The most basic and key ingredient in QML models is the parameterized quantum circuit (PQC). PQCs involve a sequence of unitary gates acting on the quantum data states |ψ_j⟩, some of which have free parameters θ that are trained to solve the problem at hand 38. PQCs are conceptually analogous to neural networks, and indeed this analogy can be made precise: classical neural networks can be formally embedded into PQCs 39.

This has led researchers to refer to certain kinds of PQC as QNNs. In practice, the term QNN is used whenever a PQC is employed for a data science application, and hence we will use the term QNN in what follows. QNNs are employed in all three QML paradigms mentioned above. For instance, in a supervised classification task, the goal of the QNN is to map the states in different classes to distinguishable regions of the Hilbert space 26. Moreover, in the unsupervised learning scenario of ref. 29, a clustering task is mapped onto a MAXCUT problem and solved by training a QNN to maximize the distance between classes. Finally, in the reinforcement learning task of ref. 32, a QNN can be used as the Q-function approximator, which determines the best action for a learning agent given its current state.
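
A minimal sketch of a PQC used as a QNN, again in PennyLane and with an arbitrary choice of embedding, ansatz and readout (the built-in StronglyEntanglingLayers template stands in for a generic trainable layer structure), might look as follows:

```python
import pennylane as qml
from pennylane import numpy as np  # PennyLane's autograd-aware NumPy

n_qubits, n_layers = 4, 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnn(x, theta):
    """A PQC used as a QNN: data embedding followed by trainable
    entangling layers, read out via a single-qubit expectation value."""
    qml.AngleEmbedding(x, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(theta, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))  # score in [-1, 1] for binary classification

shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
theta = np.random.uniform(0, 2 * np.pi, size=shape, requires_grad=True)
x = np.array([0.1, 0.5, 0.9, 1.3], requires_grad=False)
print(qnn(x, theta))
```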

Figure 4 gives examples of three distinct QNN architectures in which, at each layer, the number of qubits in the model is increased, preserved or decreased. In Fig. 4a we show a dissipative QNN 40, which generalizes the classical feedforward network. Here, each node corresponds to a qubit, while lines connecting qubits are unitary operations. The term dissipative arises from the fact that qubits in a layer are discarded after the information forward-propagates to the (new) qubits in the next layer. Figure 4b shows a standard QNN, where quantum data states are sent through a quantum circuit, at the end of which some or all of the qubits are measured; no qubits are discarded or added as we go deeper into the QNN. Finally, Fig. 4c depicts a convolutional QNN 11, where in each layer qubits are measured to reduce the dimension of the data while preserving its relevant features. Many other QNNs have been proposed 41,42,43,44,45, and constructing QNN architectures is an active area of research.

Figure 4: a, A classical feedforward neural network has input, hidden and output layers. This can be generalized to the quantum setting with a dissipative QNN, where some qubits are discarded and replaced by new qubits during the algorithm. Here we show a quantum circuit representation of the dissipative QNN. In a circuit diagram each horizontal line represents a qubit, and the logical operations, or quantum gates, are represented by boxes connecting the qubit lines. Circuits are read from left to right. For instance, here the circuit is initialized in a product state \(|\psi_j\rangle \otimes |0\rangle^{\otimes (N_h + N_o)}\), where |ψ_j⟩ encodes the input data and N_h (N_o) is the number of ancilla qubits in the hidden (output) layer, which are initialized to the fiducial state |0⟩. As logical operations are performed, the information forward-propagates through the circuit into the ancillary qubits. b, Another possible QNN strategy is to keep the qubits fixed, without discarding or replacing them. The circuit shows consecutive application of two-qubit gates U_j and controlled-NOT (CNOT) gates. c, QCNNs measure and discard qubits during the algorithm. The QCNN circuit considered here is built from two-qubit gates U_j and is initialized in |ψ_j⟩.

To further accommodate the limitation of near-term quantum computers, we can also employ a hybrid approach with models that have both classical and quantum neural networks 46 . Here, QNNs act coherently on quantum states while deep classical neural networks alleviate the need for higher-complexity quantum processing. Such hybridization distributes the representational capacity and computational complexity across both quantum and classical computers. Moreover, since quantum states generally have a mixture of classical correlations and quantum correlations, hybrid quantum–classical models allow for the use of quantum computers as an additive resource to increase the ability of classical models to represent quantum-correlated distributions. Applications of hybrid models include generating 47 or learning and distilling information 46 from multipartite-entangled distributions.

Quantum kernels

As an alternative to QNNs, researchers have proposed quantum versions of kernel methods 26,28. A kernel method maps each input to a vector in a high-dimensional vector space, known as the reproducing kernel Hilbert space, and then learns a linear function in that space. The dimension of the reproducing kernel Hilbert space can be infinite, which makes kernel methods very powerful in terms of expressiveness. To learn a linear function in a potentially infinite-dimensional space, the kernel trick 48 is employed, which only requires efficient computation of the inner product between these high-dimensional vectors; the inner product is also known as the kernel 48. Quantum kernel methods compute kernel functions using quantum computers, and there are many possible implementations. For example, refs. 26,28 considered a reproducing kernel Hilbert space equal to the quantum state space, which is finite dimensional. Another approach 13 is to study an infinite-dimensional reproducing kernel Hilbert space that is equivalent to transforming a classical vector using a quantum computer and mapping the transformed classical vectors to infinite-dimensional vectors.
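
One common way to estimate such a kernel on a quantum computer (a generic fidelity-style construction, not necessarily the specific one of refs. 26,28) is to embed one input, apply the inverse embedding of the other and read out the probability of the all-zeros outcome. A hedged PennyLane sketch:

```python
import numpy as np
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def kernel_circuit(x1, x2):
    """Embed x1, then apply the inverse embedding of x2; the probability of
    measuring |0...0> equals the fidelity kernel |<psi(x2)|psi(x1)>|^2."""
    qml.AngleEmbedding(x1, wires=range(n_qubits))
    qml.adjoint(qml.AngleEmbedding)(x2, wires=range(n_qubits))
    return qml.probs(wires=range(n_qubits))

def kernel(x1, x2):
    return kernel_circuit(x1, x2)[0]  # probability of the all-zeros outcome

X = np.random.uniform(0, np.pi, size=(5, n_qubits))   # 5 toy data points
K = np.array([[kernel(a, b) for b in X] for a in X])  # Gram matrix
print(np.round(K, 3))  # symmetric, with ones on the diagonal
```

The resulting Gram matrix K can then be handed to a purely classical kernel machine, for instance a support vector machine configured with a precomputed kernel.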

Inductive bias

For both QNNs and quantum kernels, an important design criterion is their inductive bias. This bias refers to the fact that any model represents only a subset of functions and is naturally biased towards certain types of function (that is, functions relating the input features to the output prediction). One aspect of achieving quantum advantage with QML is to aim for QML models with an inductive bias that is inefficient to simulate with a classical model. Indeed, it was recently shown 49 that quantum kernels with this property can be constructed, albeit with some subtleties regarding their trainability.

Generally speaking, inductive bias encompasses any assumption made in the design of the model or the optimization method that biases the search towards a subset of the set of all possible models. In the language of Bayesian probability theory, we usually call these assumptions our prior. Having a certain parameterization of potential models, such as QNNs, or choosing a particular embedding for quantum kernel methods 13,14,26 is itself a restriction of the search space, and hence a prior. Adding a regularization term to the optimizer or modulating the learning rate to keep searches geometrically local also inherently adds a prior and focuses the search, and thus provides inductive bias.

Ultimately, inductive biases from the design of the ML model, combined with a choice of training process, are what make or break an ML model. The main advantage of QML will then be to have the ability to sample from and learn models that are (at least partially) natively quantum mechanical. As such, they have inductive biases that classical models do not have. This discussion assumes that the dataset to be represented is quantum mechanical in nature, and is one of the reasons why researchers typically believe that QML has greater promise for quantum rather than classical data.

Training and generalization

The ultimate goal of ML (classical or quantum) is to train a model to solve a given task. Thus, understanding the training process of QML models is fundamental for their success.

Consider the training process, whereby we aim to find the set of parameters θ that leads to the best performance. This can be accomplished, for instance, by minimizing a loss function \(\mathcal{L}(\theta)\) encoding the task at hand. Some methods for training QML models are borrowed from classical ML, such as stochastic gradient descent. However, shot noise, hardware noise and unique landscape features often make off-the-shelf classical optimization methods perform poorly for QML training. (This is because extracting information from a quantum state requires computing the expectation values of some observable, which in practice must be estimated via measurements on a noisy quantum computer. Given a finite number of shots (measurement repetitions), these can only be resolved up to some additive error; moreover, the expectation values are corrupted by hardware noise.) This realization led to the development of quantum-aware optimizers, which account for the quantum idiosyncrasies of the QML training process. For example, shot-frugal optimizers 50,51,52,53 can employ stochastic gradient descent while adapting the number of shots (or measurements) used at each iteration, so as not to waste shots during the optimization. Quantum natural gradient 54,55 adjusts the step size according to the local geometry of the landscape (on the basis of the quantum Fisher information metric). These and other quantum-aware optimizers often outperform standard classical optimization methods in QML training tasks.
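
The sketch below illustrates the basic training loop under shot noise, using PennyLane's plain gradient-descent optimizer on a toy two-qubit model; the circuit, loss and shot count are arbitrary choices for illustration, not one of the quantum-aware optimizers of refs. 50,51,52,53:

```python
import pennylane as qml
from pennylane import numpy as np

# Finite shots make every expectation value statistically noisy,
# mimicking one of the training difficulties described above.
dev = qml.device("default.qubit", wires=2, shots=1000)

@qml.qnode(dev)
def model(theta):
    qml.RY(theta[0], wires=0)
    qml.CNOT(wires=[0, 1])
    qml.RY(theta[1], wires=1)
    return qml.expval(qml.PauliZ(1))

def loss(theta):
    return (model(theta) - (-1.0)) ** 2  # drive the readout toward -1

opt = qml.GradientDescentOptimizer(stepsize=0.2)
theta = np.array([0.1, 0.2], requires_grad=True)
for step in range(50):
    theta = opt.step(loss, theta)  # gradients estimated via the parameter-shift rule
print(theta, loss(theta))
```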

For the case of supervised learning, we are interested not only in learning from a training dataset but also in making accurate predictions on (generalizing to) previously unseen data. This translates into achieving small training and prediction errors, with the second usually hinging on the first. Thus, let us now consider prediction error, also known as generalization error, which has been studied only very recently for QML 13,14,35,37,56,57. Formally speaking, this error measures the extent to which a trained QML model performs well on unseen data. Prediction error depends on the training error as well as the complexity of the trained model. If the training error is large, the prediction error is also typically large. If the training error is small but the complexity of the trained model is large, then the prediction error is likely still large. The prediction error is small only if the training error is itself small and the complexity of the trained model is moderate (that is, sufficiently smaller than the training data size) 14,35. The notion of complexity depends on the QML model. We have a good understanding of the complexity of quantum kernel methods 13,14, while more research is needed on QNN complexity. Recent theoretical analysis of QNNs shows that their prediction performance is closely linked to the number of independent parameters in the QNN, with good generalization obtained when the number of training data is roughly equal to the number of parameters 35. This gives the exciting prospect of using only a small number of training data to obtain good generalization.

Challenges in QML

Heuristic fields can face periods of stagnation (or 'winters') due to unforeseen technical challenges. Indeed, in classical ML there was a gap between the introduction of the single perceptron 58 and the multilayer perceptron 59 (that is, the neural network), and another gap between early attempts to train multiple layers and the introduction of the backpropagation method 60.

Naturally, we would like to avoid such stagnations or winters for QML. The obvious strategy is to try to identify all of the challenges as quickly as possible and to focus research effort on addressing them. Fortunately, QML researchers have adopted this strategy. Figure 5 showcases some of the different elements of QML models, as well as the challenges associated with them. In this section we detail various QML challenges and how they could potentially be avoided.

Figure 5: a, Several ingredients and priors are needed to build a QML model: a dataset (and an encoding scheme for classical data), the choice of parameterized model, a loss function and a classical optimizer. The diagram shows some of the challenges associated with the different components of the model. b–d, The success of a QML model hinges on accurate and efficient training of the parameters. However, certain phenomena can hinder QML trainability, including an abundance of low-quality local-minimum solutions (b) and the barren plateau phenomenon (c). When a QML architecture exhibits a barren plateau, the landscape becomes exponentially flat (on average) as the number of qubits increases (seen as a transition from dashed to solid line). The presence of hardware noise has been shown to erase features in the landscape and potentially shift the positions of the minima; the dashed (solid) line corresponds to the noiseless (noisy) landscape shown in d.

Embedding schemes and quantum datasets

Access to high-quality, standardized datasets has played a key role in advancing classical ML. Hence, one could conjecture that such datasets will be crucial for QML as well.

Currently, most QML architectures are benchmarked using classical datasets (such as MNIST, Dogs vs Cats and Iris). While using classical datasets is natural due to their accessibility, it is still unclear how to best encode classical information onto quantum states. Several embedding schemes have been proposed 22,26,61, and there are some properties they must possess. One such property is that the inner product between output states of the embedding is classically hard to simulate (otherwise the quantum kernel would be classically simulable). In addition, the embedding should be practically useful: that is, in a classification task, the states should be in distinguishable regions of the Hilbert space. Unfortunately, embeddings that satisfy one of these properties do not necessarily satisfy the other 62. Thus, developing encoding schemes is an active area of research, especially schemes equipped with an inductive bias containing information about the dataset 49.

Furthermore, some recent results suggest that achieving a quantum advantage with classical data might not be straightforward 49. On the other hand, QML models with quantum data have a more promising route towards a quantum advantage 63,64,65,66. Despite this fact, there is still a dearth of truly quantum datasets for QML, with just a few recently proposed 67,68. Hence, the field needs standardized quantum datasets with easily preparable quantum states, as these can be used to benchmark QML models on true quantum data.

Quantum landscapes

Training the parameters of a QML model corresponds, in a wide array of cases, to minimizing a loss function and navigating a (usually non-convex) loss landscape in search of its global minimum. Technically speaking, the loss function defines a map from the model's parameter space to the real numbers. The loss function value can quantify, for instance, the model's error on a given task, so our goal is to find the set of parameters that minimizes this error. Quantum landscape theory 69 aims to understand the properties of QML landscapes and how to engineer them. Local minima and barren plateaus have received substantial attention in quantum landscape theory.

Local minima in quantum landscapes

As schematically shown in Fig. 5b, similarly to classical ML, the quantum loss landscape can have many local minima. Ultimately, this can lead to the overall non-convex optimization being NP-hard 70, which is again similar to the classical case. Some methods have been proposed to address local minima. For example, variable-structure QNNs 71,72, which grow and contract throughout the optimization, adaptively change the model's prior and allow some local minima to be turned into saddle points. Moreover, evidence of the overparametrization phenomenon has been seen for QML 73,74. Here, the optimization undergoes a computational phase transition, due to spurious local minima disappearing, whenever the number of parameters exceeds a critical value.

Overview of barren plateaus

Local minima are not the only issue facing QML: quantum landscapes can also exhibit a property known as a barren plateau 57,75,76,77,78,79,80,81,82,83,84,85,86,87. As depicted in Fig. 5c, in a barren plateau the loss landscape becomes, on average, exponentially flat with the problem size. When this occurs, the valley containing the global minimum also shrinks exponentially with problem size, leading to a so-called narrow gorge 69. As a consequence, exponential resources (for example, numbers of shots) are required to navigate the landscape. This impacts the complexity of the QML algorithm and can even destroy any quantum speedup, since quantum algorithms typically aim to avoid the exponential complexity normally associated with classical algorithms.
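
A barren plateau can be probed numerically by sampling gradients of a deep, unstructured ansatz at random parameter points and watching the variance shrink as the qubit count grows. The sketch below is a rough illustration of this diagnostic (the ansatz template, depth schedule, observable and sample count are all arbitrary choices), not a reproduction of any experiment in the cited works:

```python
import pennylane as qml
from pennylane import numpy as np

def grad_variance(n_qubits, n_samples=50):
    """Sample the gradient of a deep, unstructured ansatz at random parameter
    points; an exponentially shrinking variance signals a barren plateau."""
    dev = qml.device("default.qubit", wires=n_qubits)

    @qml.qnode(dev)
    def cost(theta):
        qml.BasicEntanglerLayers(theta, wires=range(n_qubits))
        return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

    shape = qml.BasicEntanglerLayers.shape(n_layers=5 * n_qubits, n_wires=n_qubits)
    grads = []
    for _ in range(n_samples):
        theta = np.random.uniform(0, 2 * np.pi, size=shape, requires_grad=True)
        grads.append(qml.grad(cost)(theta)[0, 0])  # d<cost>/d(theta_{0,0})
    return np.var(grads)

for n in (2, 4, 6):
    print(n, grad_variance(n))  # the variance drops quickly as n grows
```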

Barren plateaus from ignorance or insufficient inductive bias

The barren plateau phenomenon has been studied in deep hardware-efficient QNNs 75 , where they arise due to the high expressivity of the model 79 . By making no assumptions about the underlying data, deep hardware-efficient architectures aim to solve a problem by being able to prepare a wide range of unitary evolutions. In other words, the prior over hypothesis space is relatively uninformed. Barren plateaus in this unsharp prior are caused by ignorance or the lack of sufficient inductive bias, and therefore a means to avoid them is to input knowledge into the construction of the QNN—making the design of QNNs with good inductive biases for the problem at hand a key solution.

Fortunately various strategies have been developed to address these barren plateaus, such as clever initialization 88 , pretraining and parameter correlation 80 , 81 . These are all examples of adding a sharper prior to the search over the overexpressive parameterizations of hardware-efficient QNNs. Below we further discuss how QNN architectures can be designed to further introduce inductive bias.

Barren plateaus from global observables

Other mechanisms have been linked to barren plateaus. Simply defining a loss function based on a global observable (that is, an observable measuring all qubits) leads to barren plateaus even for shallow circuits with sharp priors 76, whereas local observables (those comparing quantum states at the single-qubit level) avoid this issue 76,85. The latter is due not to bad inductive biases but to the fact that comparing objects in exponentially large Hilbert spaces requires exponential precision, as their overlap is usually exponentially small.
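
The contrast can be made concrete with a toy cost in which the same shallow circuit is scored either against a projector on all n qubits (global) or against single-qubit projectors averaged over qubits (local); the circuit itself is a placeholder, and the construction below is only a schematic illustration of the distinction:

```python
import numpy as np
import pennylane as qml

n = 4
dev = qml.device("default.qubit", wires=n)

def layer(theta):
    for w in range(n):
        qml.RY(theta[w], wires=w)

@qml.qnode(dev)
def global_term(theta):
    """Global observable: projector onto |0...0> across all n qubits."""
    layer(theta)
    return qml.expval(qml.Projector(np.zeros(n, dtype=int), wires=range(n)))

@qml.qnode(dev)
def local_terms(theta):
    """Local observables: a single-qubit projector onto |0> per qubit."""
    layer(theta)
    return [qml.expval(qml.Projector([0], wires=[w])) for w in range(n)]

theta = np.random.uniform(0, 0.1, size=n)
# Both losses vanish at theta = 0, but the global version concentrates
# (becomes exponentially flat) as n grows, while the local one does not.
print("global loss:", 1 - global_term(theta))
print("local loss :", 1 - np.mean(local_terms(theta)))
```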

Barren plateaus from entanglement

While entanglement is one of the most important quantum resources for information-processing tasks on quantum computers, it can also be detrimental to QML models. QNNs (or embedding schemes) that generate too much entanglement also lead to barren plateaus 82,84,86. Here, the issue arises when the visible qubits of the QNN (those measured at the QNN's output) are entangled with a large number of qubits in the hidden layers. Due to entanglement, the information of the state is stored in non-local correlations across all qubits, and hence the reduced state of the visible qubits concentrates around the maximally mixed state. This type of barren plateau can be addressed by taming the entanglement generated across the QNN.
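
This concentration can be observed directly in simulation. In the following sketch (the entangler template, depth schedule and qubit count are arbitrary choices), the reduced density matrix of one designated "visible" qubit drifts toward the maximally mixed state I/2 as random entangling layers accumulate:

```python
import numpy as np
import pennylane as qml

n = 8  # total number of qubits; wire 0 plays the role of the visible qubit
dev = qml.device("default.qubit", wires=n)

max_layers = 50
shape = qml.BasicEntanglerLayers.shape(n_layers=max_layers, n_wires=n)
theta = np.random.uniform(0, 2 * np.pi, size=shape)

@qml.qnode(dev)
def visible_state(depth):
    """Reduced density matrix of the visible qubit after `depth`
    random entangling layers acting on all n qubits."""
    qml.BasicEntanglerLayers(theta[:depth], wires=range(n))
    return qml.density_matrix(wires=0)

for depth in (1, 5, 50):
    rho = visible_state(depth)
    # Distance to the maximally mixed state I/2 typically shrinks
    # as entanglement with the other qubits builds up.
    print(depth, np.linalg.norm(rho - np.eye(2) / 2))
```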

QNN architecture design

One of the most active research areas is the development of QNN architectures with sharp priors. Since QNNs are a fundamental ingredient not only in supervised learning (deep learning, kernel methods) but also in unsupervised and reinforcement learning, developing good QNN architectures is crucial for the field.

For instance, it has been shown that QNNs with sharp priors can avoid issues such as barren plateaus altogether. One such example is quantum convolutional neural networks (QCNNs) 11. QCNNs possess an inductive bias from having a prior over the space of architectures that is much sharper than that of deep hardware-efficient architectures, as QCNNs are restricted to be hierarchically structured and translationally invariant. The notable reduction in expressivity and parameter-space dimension from this translational-invariance assumption yields greater trainability 80.

The idea of embedding knowledge about the problem and dataset into our models (to achieve a helpful inductive bias) will be key to improving the trainability of QML models. Recent proposals use quantum graph neural networks 89 for scenarios where quantum subsystems live on a graph, and potentially have further symmetries. For instance, the underlying graph-permutation symmetries of a quantum communication dataset were taken into account by a quantum graph convolutional network. Similarly, a quantum recurrent neural network has been used in scenarios where temporal recurrence of parameters occurs, for example in the quantum dynamics of a stationary (time-independent) quantum dynamical process.

To better understand how to go beyond the aforementioned inductive biases from temporal and/or translational invariance in grids and graphs, we can take inspiration from recent advances in the theory of classical deep learning. In classical ML, the study of the group theory behind graph neural networks, namely the concepts of invariance and equivariance to various group actions on the input space, has led to a unifying theory of deep learning architectures based on group theory, called geometric deep learning theory 90 .

To have a prescription for creating arbitrary architectures and inductive biases suitable for a given set of quantum physical data, a theory of quantum geometric deep learning could be key to designing architectures with the right prior over the transformation space and inductive biases that ensure trainability and generalization. As the study of physics is often about the identification of inherent or emergent symmetries in particular systems, there is great potential for a future unifying theory of quantum geometric deep learning to provide consistent methods for creating QML model architectures with inductive biases encoding knowledge of the basic symmetries and principles of the quantum physical system underlying given quantum datasets. This approach has recently been explored in refs. 91,92,93. Moreover, refs. 74,94 have also shown that the Lie algebra obtained from the generators of the QNN can be linked to properties of the QML landscape, such as the presence of barren plateaus or the overparametrization phenomenon.

Effect of quantum noise

The presence of hardware noise during quantum computations is one of the defining characteristics of noisy intermediate-scale quantum (NISQ) computing. Despite this fact, most QML research neglects noise in the analytical calculations and numerical simulations while still promising that the methods are near-term compatible. Accounting for the effects of hardware noise should be a crucial aspect of QML analysis if we wish to pursue a quantum advantage with currently available hardware.

Noise corrupts information as it forward-propagates through a quantum circuit, meaning that deeper circuits with longer run-times are particularly affected. As such, noise affects all aspects of the model that make use of quantum computers, including the dataset-preparation scheme and the circuits used to compute quantum kernels. Moreover, when using QNNs, noise can hinder their trainability, as it leads to noise-induced barren plateaus 87,95. Here, the relevant features of the landscape become exponentially suppressed by noise as the depth of the circuit increases (Fig. 5d). Ultimately, the effects of noise translate into a deformation of the model's inductive bias away from its original form, and an effective reduction of the dimension of the quantum feature space. Despite the critical impact of quantum noise, its effects are still largely unexplored, particularly its impact on the classical simulability of the QML model 96,97.
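
A density-matrix simulation makes this depth dependence visible. The sketch below (a single qubit with a depolarizing channel after every rotation; the noise strength and depths are arbitrary) shows the readout signal, and with it any landscape feature built from it, decaying toward zero as the circuit deepens:

```python
import pennylane as qml

dev = qml.device("default.mixed", wires=1)  # density-matrix simulator

@qml.qnode(dev)
def noisy_circuit(theta, p, depth):
    """Interleave a depolarizing channel with each rotation layer; the
    readout signal decays toward zero as the circuit gets deeper."""
    for _ in range(depth):
        qml.RY(theta, wires=0)
        qml.DepolarizingChannel(p, wires=0)
    return qml.expval(qml.PauliZ(0))

theta, p = 0.3, 0.05
for depth in (1, 5, 20, 50):
    # Landscape features built from this expectation value flatten with
    # depth, in the spirit of noise-induced barren plateaus.
    print(depth, noisy_circuit(theta, p, depth))
```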

Addressing noise-induced issues will likely require (1) reductions in hardware error rates, (2) partial quantum error correction 98 or (3) QNNs that are relatively shallow (that is, whose depth grows sublinearly in the problem size) 87, such as QCNNs. Error mitigation techniques 99,100,101 can also improve the performance of QML models in the presence of noise, although they may not solve noise-induced trainability issues 95. A different approach to dealing with noise is to engineer QML models with noise-resilient properties 34,102,103 (such as the positions of the minima not changing due to noise).

Potential for quantum advantage

The first quantum advantages in QML will likely arise from hidden parameter extraction from quantum data. This can be for quantum sensing or quantum state classification/regression. Fundamentally, we know from the theory of optimal measurement that non-local quantum measurements can extract hidden parameters using fewer samples. Using QML, one can form and search over a parameterization of hypotheses for such measurements.

This is particularly useful when such optimal measurements are not known a priori—for example, identifying the measurement that extracts an order parameter or identifies a particular phase of matter. As the information about this classical parameter is embedded in the structure of quantum correlations between subsystems, it is natural that a trained QML model with good inductive biases can exhibit an advantage over local measurements and classical representations.

Another area where classical parameter extraction may yield an advantage is quantum machine perception 46,63,104,105,106,107, that is, quantum sensing, metrology and beyond. Here, a variational search over multipartite-entangled input states exposed to a quantum signal, combined with optimization of the control and/or post-processing schemes, can find optimal measurements for estimating hidden parameters in the incoming signal. In particular, the variational approach may be able to find the optimal entanglement, exposure and measurement scheme that filters signal from noise 108, akin to variationally learning a quantum error-correcting code that filters signal from noise, but applied to quantum metrology.

Beyond classical parameter extraction embedded in quantum data, there may be an advantage for the discovery of quantum error correcting codes 109. Quantum error correcting codes fundamentally encode data (typically) non-locally into a subsystem or subspace of the Hilbert space. As deep learning is fundamentally about the discovery of submanifolds of data space, identifying and decoding subspaces/subsystems of a Hilbert space that correspond to a quantum error correction subspace/subsystem is a natural area where differentiable quantum computing may yield an advantage. This is barely explored, mainly due to the difficulty of gaining insights from small-scale numerical simulations. Fundamentally, it is akin to a quantum-data version of the classical parameter embedding/extraction advantage.

Finally, a quantum advantage for generative modeling may be achieved when ground states 110 , equilibrium states 47 , 111 or quantum dynamics 112 can be generated using models incorporating QNNs, in a situation where the distribution cannot be sampled classically, and yields more accurate predictions or more extensive generalization compared with classical ML approaches. The nearest-term possibility for demonstrating such an advantage would likely be from variational optimization at the continuous time optimal control level on analog quantum simulators.

What will quantum advantage look like?

When the data originate from quantum-mechanical processes, such as experiments in chemistry, materials science, biology and physics, an exponential quantum advantage in ML is more likely. The quantum advantage could be in sample complexity or time complexity. An exponential advantage in sample complexity always implies an exponential advantage in time complexity, but the reverse is not generally true. It was recently shown 63,65,113,114 that there is an exponential quantum advantage in sample complexity when we can use a quantum sensor, quantum memory and quantum computer to retrieve, store and process quantum information from experiments. Such a sample-complexity advantage can be proven rigorously without the possibility of being dequantized 12,64,115 in the future; that is, it is impossible to find improved classical algorithms that erase the exponential advantage. This substantial quantum advantage has recently been demonstrated on the Sycamore processor 63, raising the hope of achieving quantum advantage using NISQ devices 116.

The situation for an advantage in time complexity is more subtle. Classical simulation of quantum processes is intractable in many cases, so an exponential advantage in time complexity might be expected to be prevalent. However, we should be cautious about the availability of data in ML tasks, which makes classical ML algorithms computationally more powerful 13,117. For instance, ref. 117 shows that, in the worst case, there is no exponential quantum advantage in predicting ground-state properties of geometrically local gapped Hamiltonians. Furthermore, the emergence of effective classical theories for quantum-mechanical processes could enable classical machines to provide accurate predictions. For example, density functional theory 118,119 allows accurate prediction of molecular properties when we have an accurate approximation to the exchange–correlation functionals obtained by conducting real-world experiments. It is still likely that an exponential advantage is possible in physical systems of practical interest, but there are no rigorous proofs yet.

When the data are of a purely classical origin, such as in applications for recommending products to customers 12, performing portfolio optimization 120,121 and processing human languages 122 and everyday images 123, there is no known exponential advantage 115. However, it is still reasonable to expect a polynomial advantage. Furthermore, a quadratic advantage can be rigorously proven 124,125 for purely classical problems. Therefore, we likely have a potential impact in the long term once we have fault-tolerant quantum computers, albeit with the speedup notably dampened by the overheads of quantum error correction 126 for currently known fault-tolerant quantum computing schemes.

Transition to the fault-tolerant era and beyond

While QML has been proposed as a candidate for achieving a quantum advantage in the near term using NISQ devices, we can still ask about its usability further in the future. Researchers envision two chronological eras post-NISQ. In the first, which we can refer to as 'partially error corrected', quantum computers will have enough physical qubits (a few hundred), with sufficiently small error rates, to support a small number of fully error-corrected logical qubits. Since one logical qubit comprises multiple physical qubits, in this era we will have the freedom to trade off and split the qubits in a device into a subset of error-corrected qubits and a subset of non-error-corrected qubits. The next era, the 'fault-tolerant' era, will arrive when quantum hardware has a large number of error-corrected qubits.

Indeed, we can easily envision QML being useful in both of these post-NISQ eras. First, in the partial error-corrected era, QML models will be able to execute high-fidelity circuits and thus have an improved performance. This will naturally enhance the trainability of the models by mitigating noise-induced barren plateaus, and also reduce noise-induced classification errors in QML models. Most importantly, QML will likely see its most widespread and critical use during the fault-tolerant era. Here, quantum algorithms such as those for quantum simulation 127 , 128 will be able to accurately prepare quantum data, and to faithfully store it in quantum memories 129 . Therefore QML will be the natural model to learn, infer and make predictions from quantum data, as here the quantum computer will learn from the data themselves directly.

On the longer-term horizon, we anticipate that it will become possible to capture quantum data from nature directly, via transduction from their natural analog form to a quantum digital one (for example, via quantum analog–digital interconversion 19). These data could then be shuttled across quantum networks for distributed and/or centralized processing with QML models, using fault-tolerant quantum computation and error-corrected quantum communication. At that point, QML will have reached a stage similar to where classical ML is today, with edge sensors capturing data, the data being relayed to a central cloud, and ML models being trained on the aggregated data. As the modern advent of widespread classical ML arose at precisely this point of abundant data, one can anticipate that ubiquitous access to quantum data in the fault-tolerant era could similarly propel QML to even greater widespread use.

Nielsen, M. A. & Chuang, I. L. Quantum Computation and Quantum Information (Cambridge Univ. Press, 2000).

Brookes, J. C. Quantum effects in biology: golden rule in enzymes, olfaction, photosynthesis and magnetodetection. Proc. R. Soc. A 473, 20160822 (2017).

Deutsch, D. Quantum theory, the Church–Turing principle and the universal quantum computer. Proc. R. Soc. A 400, 97–117 (1985).

Wiebe, N., Kapoor, A. & Svore, K. M. Quantum deep learning. Preprint at https://arxiv.org/abs/1412.3489 (2014).

Schuld, M., Sinayskiy, I. & Petruccione, F. An introduction to quantum machine learning. Contemp. Phys. 56, 172–185 (2015).

Biamonte, J. et al. Quantum machine learning. Nature 549, 195–202 (2017).

Arute, F. et al. Quantum supremacy using a programmable superconducting processor. Nature 574, 505–510 (2019).

Cincio, L., Subaşı, Y., Sornborger, A. T. & Coles, P. J. Learning the quantum algorithm for state overlap. New J. Phys. 20, 113022 (2018).

Tranter, A. D. et al. Multiparameter optimisation of a magneto-optical trap using deep learning. Nat. Commun. 9, 4360 (2018).

Kaubruegger, R., Vasilyev, D. V., Schulte, M., Hammerer, K. & Zoller, P. Quantum variational optimization of Ramsey interferometry and atomic clocks. Phys. Rev. X 11, 041045 (2021).

Cong, I., Choi, S. & Lukin, M. D. Quantum convolutional neural networks. Nat. Phys. 15, 1273–1278 (2019).

Tang, E. A quantum-inspired classical algorithm for recommendation systems. In Proc. 51st Annual ACM SIGACT Symposium on Theory of Computing 217–228 (Association for Computing Machinery, 2019).

Huang, H.-Y. et al. Power of data in quantum machine learning. Nat. Commun. 12, 2631 (2021).

Banchi, L., Pereira, J. & Pirandola, S. Generalization in quantum machine learning: a quantum information standpoint. PRX Quantum 2, 040321 (2021).

Degen, C. L., Reinhard, F. & Cappellaro, P. Quantum sensing. Rev. Mod. Phys. 89, 035002 (2017).

Giovannetti, V., Lloyd, S. & Maccone, L. Advances in quantum metrology. Nat. Photon. 5, 222–229 (2011).

Chiribella, G., D’Ariano, G. M. & Perinotti, P. Theoretical framework for quantum networks. Phys. Rev. A 80, 022339 (2009).

D’Alessandro, D. Introduction to Quantum Control and Dynamics (Chapman & Hall/CRC Applied Mathematics & Nonlinear Science, Taylor & Francis, 2007).

Verdon-Akzam, G. Quantum analog–digital interconversion for encoding and decoding quantum signals. US patent application 17,063,595 (2020).

Rebentrost, P., Mohseni, M. & Lloyd, S. Quantum support vector machine for big data classification. Phys. Rev. Lett. 113, 130503 (2014).

Schuld, M. & Killoran, N. Quantum machine learning in feature Hilbert spaces. Phys. Rev. Lett. 122, 040504 (2019).

Lloyd, S., Schuld, M., Ijaz, A., Izaac, J. & Killoran, N. Quantum embeddings for machine learning. Preprint at https://arxiv.org/abs/2001.03622 (2020).

Schuld, M., Sweke, R. & Meyer, J. J. Effect of data encoding on the expressive power of variational quantum-machine-learning models. Phys. Rev. A 103, 032430 (2021).

Roffe, J. Quantum error correction: an introductory guide. Contemp. Phys. 60, 226–245 (2019).

Shor, P. W. Scheme for reducing decoherence in quantum computer memory. Phys. Rev. A 52, R2493 (1995).

Havlíček, V. et al. Supervised learning with quantum-enhanced feature spaces. Nature 567, 209–212 (2019).

Liu, Y., Arunachalam, S. & Temme, K. A rigorous and robust quantum speed-up in supervised machine learning. Nat. Phys. 17, 1013–1017 (2021).

Schuld, M. Supervised quantum machine learning models are kernel methods. Preprint at https://arxiv.org/abs/2101.11020 (2021).

Otterbach, J. S. et al. Unsupervised machine learning on a hybrid quantum computer. Preprint at https://arxiv.org/abs/1712.05771 (2017).

Kerenidis, I., Landman, J., Luongo, A. & Prakash, A. q-means: a quantum algorithm for unsupervised machine learning. In Advances in Neural Information Processing Systems Vol. 32 (eds Wallach, H. M. et al.) 4136–4146 (Curran, 2019).

Saggio, V. et al. Experimental quantum speed-up in reinforcement learning agents. Nature 591, 229–233 (2021).

Skolik, A., Jerbi, S. & Dunjko, V. Quantum agents in the gym: a variational quantum algorithm for deep q-learning. Quantum 6, 720 (2022).

Huang, H.-Y. et al. Quantum advantage in learning from experiments. Science 376, 1182–1186 (2022).

LaRose, R. & Coyle, B. Robust data encodings for quantum classifiers. Phys. Rev. A 102, 032420 (2020).

Caro, M. C. et al. Generalization in quantum machine learning from few training data. Nat. Commun. 13, 4919 (2022).

Caro, M. C. et al. Out-of-distribution generalization for learning quantum dynamics. Preprint at https://arxiv.org/abs/2204.10268 (2022).

Caro, M. C., Gil-Fuster, E., Meyer, J. J., Eisert, J. & Sweke, R. Encoding-dependent generalization bounds for parametrized quantum circuits. Quantum 5, 582 (2021).

Cerezo, M. et al. Variational quantum algorithms. Nat. Rev. Phys. 3, 625–644 (2021).

Wan, K. H., Dahlsten, O., Kristjánsson, H., Gardner, R. & Kim, M. S. Quantum generalisation of feedforward neural networks. npj Quantum Inf. 3, 36 (2017).

Beer, K. et al. Training deep quantum neural networks. Nat. Commun. 11, 808 (2020).

Schuld, M., Sinayskiy, I. & Petruccione, F. The quest for a quantum neural network. Quantum Inf. Process. 13, 2567–2586 (2014).

Dallaire-Demers, P.-L. & Killoran, N. Quantum generative adversarial networks. Phys. Rev. A 98, 012324 (2018).

Farhi, E. & Neven, H. Classification with quantum neural networks on near term processors. Preprint at https://arxiv.org/abs/1802.06002 (2018).

Killoran, N. et al. Continuous-variable quantum neural networks. Phys. Rev. Res. 1, 033063 (2019).

Bausch, J. Recurrent quantum neural networks. Adv. Neural. Inf. Process. Syst. 33, 1368–1379 (2020).

Broughton, M. et al. TensorFlow Quantum: a software framework for quantum machine learning. Preprint at https://arxiv.org/abs/2003.02989 (2020).

Verdon, G., Marks, J., Nanda, S., Leichenauer, S. & Hidary, J. Quantum Hamiltonian-based models and the variational quantum thermalizer algorithm. Preprint at https://arxiv.org/abs/1910.02071 (2019).

Cortes, C. & Vapnik, V. Support-vector networks. Mach. Learn. 20, 273–297 (1995).

Kübler, J. M., Buchholz, S. & Schölkopf, B. The inductive bias of quantum kernels. Adv. Neural. Inf. Process. Syst. 34, 12661–12673 (2021).

Kübler, J. M., Arrasmith, A., Cincio, L. & Coles, P. J. An adaptive optimizer for measurement-frugal variational algorithms. Quantum 4, 263 (2020).

Arrasmith, A., Cincio, L., Somma, R. D. & Coles, P. J. Operator sampling for shot-frugal optimization in variational algorithms. Preprint at https://arxiv.org/abs/2004.06252 (2020).

Gu, A., Lowe, A., Dub, P. A., Coles, P. J. & Arrasmith, A. Adaptive shot allocation for fast convergence in variational quantum algorithms. Preprint at https://arxiv.org/abs/2108.10434 (2021).

Sweke, R. et al. Stochastic gradient descent for hybrid quantum–classical optimization. Quantum 4, 314 (2020).

Stokes, J., Izaac, J., Killoran, N. & Carleo, G. Quantum natural gradient. Quantum 4, 269 (2020).

Koczor, B. & Benjamin, S. C. Quantum natural gradient generalised to non-unitary circuits. Preprint at https://arxiv.org/abs/1912.08660 (2019).

Sharma, K. et al. Reformulation of the no-free-lunch theorem for entangled data sets. Phys. Rev. Lett. 128, 070501 (2022).

Abbas, A. et al. The power of quantum neural networks. Nat. Comput. Sci. 1, 403–409 (2021).

Rosenblatt, F. The Perceptron, a Perceiving and Recognizing Automaton (Project PARA) Report No. 85-460-1 (Cornell Aeronautical Laboratory, 1957).

Haykin, S. Neural Networks: a Comprehensive Foundation (Prentice Hall, 1994).

Rumelhart, D. E., Hinton, G. E. & Williams, R. J. Learning representations by back-propagating errors. Nature 323, 533–536 (1986).

Hubregtsen, T. et al. Training quantum embedding kernels on near-term quantum computers. Preprint at https://arxiv.org/abs/2105.02276 (2021).

Thanasilp, S., Wang, S., Nghiem, N. A., Coles, P. J. & Cerezo, M. Subtleties in the trainability of quantum machine learning models. Preprint at https://arxiv.org/abs/2110.14753 (2021).

Huang, H.-Y. et al. Quantum advantage in learning from experiments. Science 376, 1182–1186 (2022).

Cotler, J., Huang, H.-Y. & McClean, J. R. Revisiting dequantization and quantum advantage in learning tasks. Preprint at https://arxiv.org/abs/2112.00811 (2021).

Chen, S., Cotler, J., Huang, H.-Y. & Li, J. A hierarchy for replica quantum advantage. Preprint at https://arxiv.org/abs/2111.05874 (2021).

Chen, S., Cotler, J., Huang, H.-Y. & Li, J. Exponential separations between learning with and without quantum memory. In 2021 IEEE 62nd Annual Symp. on Foundations of Computer Science (FOCS) 574–585 (IEEE, 2022).

Perrier, E., Youssry, A. & Ferrie, C. QDataSet: quantum datasets for machine learning. Preprint at https://arxiv.org/abs/2108.06661 (2021).

Schatzki, L., Arrasmith, A., Coles, P. J. & Cerezo, M. Entangled datasets for quantum machine learning. Preprint at https://arxiv.org/abs/2109.03400 (2021).

Arrasmith, A., Holmes, Z., Cerezo, M. & Coles, P. J. Equivalence of quantum barren plateaus to cost concentration and narrow gorges. Quantum Sci. Technol. 7, 045015 (2022).

Bittel, L. & Kliesch, M. Training variational quantum algorithms is NP-hard. Phys. Rev. Lett. 127, 120502 (2021).

Bilkis, M., Cerezo, M., Verdon, G., Coles, P. J. & Cincio, L. A semi-agnostic ansatz with variable structure for quantum machine learning. Preprint at https://arxiv.org/abs/2103.06712 (2021).

LaRose, R., Tikku, A., O’Neel-Judy, É., Cincio, L. & Coles, P. J. Variational quantum state diagonalization. npj Quantum Inf. 5, 57 (2019).

Kiani, B. T., Lloyd, S. & Maity, R. Learning unitaries by gradient descent. Preprint at https://arxiv.org/abs/2001.11897 (2020).

Larocca, M., Ju, N., García-Martín, D., Coles, P. J. & Cerezo, M. Theory of overparametrization in quantum neural networks. Preprint at https://arxiv.org/abs/2109.11676 (2021).

McClean, J. R., Boixo, S., Smelyanskiy, V. N., Babbush, R. & Neven, H. Barren plateaus in quantum neural network training landscapes. Nat. Commun. 9, 4812 (2018).

Cerezo, M., Sone, A., Volkoff, T., Cincio, L. & Coles, P. J. Cost function dependent barren plateaus in shallow parametrized quantum circuits. Nat. Commun. 12, 1791 (2021).

Cerezo, M. & Coles, P. J. Higher order derivatives of quantum neural networks with barren plateaus. Quantum Sci. Technol. 6, 035006 (2021).

Arrasmith, A., Cerezo, M., Czarnik, P., Cincio, L. & Coles, P. J. Effect of barren plateaus on gradient-free optimization. Quantum 5, 558 (2021).

Holmes, Z., Sharma, K., Cerezo, M. & Coles, P. J. Connecting ansatz expressibility to gradient magnitudes and barren plateaus. PRX Quantum 3, 010313 (2022).

Pesah, A. et al. Absence of barren plateaus in quantum convolutional neural networks. Phys. Rev. X 11, 041011 (2021).

Volkoff, T. & Coles, P. J. Large gradients via correlation in random parameterized quantum circuits. Quantum Sci. Technol. 6, 025008 (2021).

Sharma, K., Cerezo, M., Cincio, L. & Coles, P. J. Trainability of dissipative perceptron-based quantum neural networks. Phys. Rev. Lett. 128, 180505 (2022).

Holmes, Z. et al. Barren plateaus preclude learning scramblers. Phys. Rev. Lett. 126, 190501 (2021).

Marrero, C. O., Kieferova, M. & Wiebe, N. Entanglement induced barren plateaus. PRX Quantum 2, 040316 (2021).

Uvarov, A. V. & Biamonte, J. D. On barren plateaus and cost function locality in variational quantum algorithms. J. Phys. A 54, 245301 (2021).

Patti, T. L., Najafi, K., Gao, X. & Yelin, S. F. Entanglement devised barren plateau mitigation. Phys. Rev. Res. 3, 033090 (2021).

Wang, S. et al. Noise-induced barren plateaus in variational quantum algorithms. Nat. Commun. 12, 6961 (2021).

Verdon, G. et al. Learning to learn with quantum neural networks via classical neural networks. Preprint at https://arxiv.org/abs/1907.05415 (2019).

Verdon, G. et al. Quantum graph neural networks. Preprint at https://arxiv.org/abs/1909.12264 (2019).

Bronstein, M. M., Bruna, J., Cohen, T. & Veličković, P. Geometric deep learning: grids, groups, graphs, geodesics, and gauges. Preprint at https://arxiv.org/abs/2104.13478 (2021).

Larocca, M. et al. Group-invariant quantum machine learning. Preprint at https://arxiv.org/abs/2205.02261 (2022).

Skolik, A., Cattelan, M., Yarkoni, S., Bäck, T. & Dunjko, V. Equivariant quantum circuits for learning on weighted graphs. Preprint at https://arxiv.org/abs/2205.06109 (2022).

Meyer, J. J. et al. Exploiting symmetry in variational quantum machine learning. Preprint at https://arxiv.org/abs/2205.06217 (2022).

Larocca, M. et al. Diagnosing barren plateaus with tools from quantum optimal control. Preprint at https://arxiv.org/abs/2105.14377 (2021).

Wang, S. et al. Can error mitigation improve trainability of noisy variational quantum algorithms? Preprint at https://arxiv.org/abs/2109.01051 (2021).

Deshpande, A. et al. Tight bounds on the convergence of noisy random circuits to the uniform distribution. Preprint at https://arxiv.org/abs/2112.00716 (2021).

Hakkaku, S., Tashima, Y., Mitarai, K., Mizukami, W. & Fujii, K. Quantifying fermionic nonlinearity of quantum circuits. Preprint at https://arxiv.org/abs/2111.14599 (2021).

Bultrini, D. et al. The battle of clean and dirty qubits in the era of partial error correction. Preprint at https://arxiv.org/abs/2205.13454 (2022).

Temme, K., Bravyi, S. & Gambetta, J. M. Error mitigation for short-depth quantum circuits. Phys. Rev. Lett. 119, 180509 (2017).

Czarnik, P., Arrasmith, A., Coles, P. J. & Cincio, L. Error mitigation with Clifford quantum-circuit data. Quantum 5, 592 (2021).

Endo, S., Cai, Z., Benjamin, S. C. & Yuan, X. Hybrid quantum–classical algorithms and quantum error mitigation. J. Phys. Soc. Jpn 90, 032001 (2021).

Sharma, K., Khatri, S., Cerezo, M. & Coles, P. J. Noise resilience of variational quantum compiling. New J. Phys. 22, 043006 (2020).

Cincio, L., Rudinger, K., Sarovar, M. & Coles, P. J. Machine learning of noise-resilient quantum circuits. PRX Quantum 2, 010324 (2021).

Ho, A., Verdon, G. & Mohseni, M. Quantum machine perception. US patent application 17,019,564 (2020).

Meyer, J. J., Borregaard, J. & Eisert, J. A variational toolbox for quantum multi-parameter estimation. npj Quantum Inf. 7, 89 (2021).

Beckey, J. L., Cerezo, M., Sone, A. & Coles, P. J. Variational quantum algorithm for estimating the quantum Fisher information. Phys. Rev. Res. 4, 013083 (2022).

Wang, J. et al. Experimental quantum Hamiltonian learning. Nat. Phys. 13, 551–555 (2017).

Layden, D. & Cappellaro, P. Spatial noise filtering through error correction for quantum sensing. npj Quantum Inf. 4, 30 (2018).

Johnson, P. D., Romero, J., Olson, J., Cao, Y. & Aspuru-Guzik, A. QVECTOR: an algorithm for device-tailored quantum error correction. Preprint at https://arxiv.org/abs/1711.02249 (2017).

Peruzzo, A. et al. A variational eigenvalue solver on a photonic quantum processor. Nat. Commun. 5, 4213 (2014).

McArdle, S. et al. Variational ansatz-based quantum simulation of imaginary time evolution. npj Quantum Inf. 5, 75 (2019).

Cirstoiu, C. et al. Variational fast forwarding for quantum simulation beyond the coherence time. npj Quantum Inf. 6, 82 (2020).

Huang, H.-Y., Kueng, R. & Preskill, J. Information-theoretic bounds on quantum advantage in machine learning. Phys. Rev. Lett. 126, 190505 (2021).

Aharonov, D., Cotler, J. & Qi, X.-L. Quantum algorithmic measurement. Nat. Commun. 13, 887 (2022).

Chia, N.-H. et al. Sampling-based sublinear low-rank matrix arithmetic framework for dequantizing quantum machine learning. In Proc. 52nd Annual ACM SIGACT Symposium on Theory of Computing 387–400 (Association for Computing Machinery, 2020).

Preskill, J. Quantum computing in the NISQ era and beyond. Quantum 2, 79 (2018).

Huang, H.-Y., Kueng, R., Torlai, G., Albert, V. V. & Preskill, J. Provably efficient machine learning for quantum many-body problems. Preprint at https://arxiv.org/abs/2106.12627 (2021).

Hohenberg, P. & Kohn, W. Inhomogeneous electron gas. Phys. Rev. 136, B864–B871 (1964).

Kohn, W. Nobel lecture: Electronic structure of matter—wave functions and density functionals. Rev. Mod. Phys. 71, 1253–1266 (1999).

Alcazar, J., Leyton-Ortega, V. & Perdomo-Ortiz, A. Classical versus quantum models in machine learning: insights from a finance application. Mach. Learn. Sci. Technol. 1, 035003 (2020).

Bouland, A., van Dam, W., Joorati, H., Kerenidis, I. & Prakash, A. Prospects and challenges of quantum finance. Preprint at https://arxiv.org/abs/2011.06492 (2020).

Manning, C. & Schütze, H. Foundations of Statistical Natural Language Processing (MIT Press, 1999).

Russ, J. C. The Image Processing Handbook (CRC Press, 2006).

Grover, L. K. A fast quantum mechanical algorithm for database search. In Proc. 28th Annual ACM Symposium on Theory of Computing 212–219 (Association for Computing Machinery, 1996).

Bernstein, E. & Vazirani, U. Quantum complexity theory. SIAM J. Comput. 26, 1411–1473 (1997).

Babbush, R. et al. Focus beyond quadratic speedups for error-corrected quantum advantage. PRX Quantum 2, 010103 (2021).

Georgescu, I. M., Ashhab, S. & Nori, F. Quantum simulation. Rev. Mod. Phys. 86, 153–185 (2014).

Berry, D. W., Childs, A. M., Cleve, R., Kothari, R. & Somma, R. D. Simulating Hamiltonian dynamics with a truncated Taylor series. Phys. Rev. Lett. 114, 090502 (2015).

Lvovsky, A. I., Sanders, B. C. & Tittel, W. Optical quantum memory. Nat. Photon. 3, 706–714 (2009).

Sanchez-Lengeling, B. & Aspuru-Guzik, A. Inverse molecular design using machine learning: generative models for matter engineering. Science 361, 360–365 (2018).

Acknowledgements

M.C. acknowledges support from the Los Alamos National Laboratory (LANL) LDRD program under project number 20210116DR. M.C. was also supported by the Center for Nonlinear Studies at LANL. L.C. and P.J.C. were supported by the US Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, under the Accelerated Research in Quantum Computing program. L.C. also acknowledges support from US Department of Energy, Office of Science, National Quantum Information Science Research Centers, Quantum Science Center. P.J.C. was also supported by the NNSA’s Advanced Simulation and Computing Beyond Moore’s Law Program at LANL. G.V. thanks F. Sbahi, A. J. Martinez and P. Velickovic for useful discussions. X, formerly known as Google[x], is part of the Alphabet family of companies, which includes Google, Verily, Waymo and others ( www.x.company ). H.-Y.H. is supported by a Google PhD fellowship.

Author information

Authors and affiliations

Information Sciences, Los Alamos National Laboratory, Los Alamos, NM, USA

Center for Nonlinear Studies, Los Alamos National Laboratory, Los Alamos, NM, USA

Quantum Science Center, Oak Ridge, TN, USA

M. Cerezo, Lukasz Cincio & Patrick J. Coles

X, Mountain View, CA, USA

Guillaume Verdon

Institute for Quantum Computing, University of Waterloo, Waterloo, Ontario, Canada

Department of Applied Mathematics, University of Waterloo, Waterloo, Ontario, Canada

Institute for Quantum Information and Matter, California Institute of Technology, Pasadena, CA, USA

Hsin-Yuan Huang

Department of Computing and Mathematical Sciences, California Institute of Technology, Pasadena, CA, USA

Theoretical Division, Los Alamos National Laboratory, Los Alamos, NM, USA

Lukasz Cincio & Patrick J. Coles

Contributions

P.J.C. drafted the manuscript structure. The manuscript was written and revised by M.C., G.V., H.-Y.H., L.C. and P.J.C.

Corresponding author

Correspondence to Patrick J. Coles.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Computational Science thanks Dan Browne and Mile Gu for their contribution to the peer review of this work. Primary Handling Editor: Jie Pan, in collaboration with the Nature Computational Science team.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Cerezo, M., Verdon, G., Huang, H.-Y. et al. Challenges and opportunities in quantum machine learning. Nat. Comput. Sci. 2, 567–576 (2022). https://doi.org/10.1038/s43588-022-00311-3

Received: 05 January 2022

Accepted: 04 August 2022

Published: 15 September 2022

Issue Date: September 2022

DOI: https://doi.org/10.1038/s43588-022-00311-3
