If you watch the technology headlines you might think something called quantum computing is the Next Big Thing. In January, USA Today declared IBM’s new quantum computer one of the four most “wow worthy” announcements at CES, the annual gadget fest in Las Vegas. Gartner also listed quantum computing as one of the top technology trends for 2019, joining fan favorites like blockchain and virtual reality.
I’ve spent more than 25 years as a physicist researching quantum computers — machines that store and process information on individual atoms or particles, like photons — and I’ve started a company that is building them. I am convinced quantum computing is in fact a breakthrough technology that offers the only known way to attack some of the world’s hardest problems in medicine, transportation, computer security, and other areas we haven’t yet foreseen.
We must be clear, however, about what is and isn't happening next. The big quantum computing discoveries that will most impact society are still years away. In the meantime, we will see breathless announcements of records broken as the technology rapidly develops. These incremental advances matter for government, which has a role in encouraging this research, and for industries that need to start developing ways to use quantum computers as the machines become more powerful. But too much hype risks a disillusionment that could slow that very progress.
The first thing to know about quantum computers is that they are not a faster, better version of the computers we have now. You’ll never trade in your laptop or smartphone for a quantum version. Quantum computers almost certainly won’t run social networks, animate Pixar movies, or keep track of airline reservations. They solve different problems in different ways.
Quantum computers were proposed in 1982 by Richard Feynman, the Nobel Prize-winning physicist, who worried that conventional computers could never tackle problems in quantum mechanics, the well-established theory that predicts the behavior of small isolated particles such as atoms or electrons. Today, we do use conventional computers to simulate quantum models of materials and chemical processes, but these simulations grind to a halt when faced with all the possible arrangements of electrons in even a small molecule or chunk of material.
Feynman’s idea was simple: build a computer that stores information on individual particles — later named qubits — that already follow the very rules of quantum mechanics that seem to perplex conventional computers.
What’s the difference? Ordinary computers think in certainties, digitizing every aspect of the world into well-defined numbers. Quantum computers probe all possibilities, constantly updating the probabilities of multiple scenarios. Add more qubits, and they can consider exponentially more scenarios. A quantum program steers all of these possibilities, narrowing them down to just a few, so that when the output is finally measured, it carries information about every scenario considered. Crucially, a quantum computer must not be measured or observed while it works through this vast space of possibilities. In that sense, qubits are like senators before a controversial vote: they shouldn’t reveal their position until they are forced to.
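To make the exponential scaling concrete, here is a toy classical simulation of the idea, not a real quantum computer: with n qubits there are 2^n possible outcomes to keep track of at once, and a measurement forces a single definite answer. (This sketch only mimics the bookkeeping; an actual quantum computer manipulates these possibilities physically, with interference between them.)

```python
import random

def uniform_superposition(n_qubits):
    """Equal probabilities over all 2**n bit-string outcomes.

    Toy stand-in for a quantum register: adding one qubit doubles
    the number of scenarios being tracked.
    """
    n_states = 2 ** n_qubits
    return {format(i, f"0{n_qubits}b"): 1.0 / n_states for i in range(n_states)}

def measure(probabilities):
    """Measurement picks one definite outcome, weighted by probability --
    the moment the 'senator' is forced to reveal a position."""
    outcomes = list(probabilities)
    weights = [probabilities[o] for o in outcomes]
    return random.choices(outcomes, weights=weights)[0]

state = uniform_superposition(3)
print(len(state))        # 8 scenarios for just 3 qubits
print(measure(state))    # one definite bit string, e.g. "101"
```

Note how quickly the bookkeeping explodes: 50 qubits already correspond to about 10^15 scenarios, which is why conventional simulations grind to a halt.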
Our world is filled with uncertainty, and quantum computers can be very helpful in weighing the best of several options. A bank wouldn’t use a quantum computer to track checking accounts: when you look at your balance, you want a single answer you can count on. But the bank might use a quantum computer to estimate how much money you will have in your account a year from now, based on the probability that you will get a raise or get fired, whether your teenager will crash the car, whether the stock market will crash, and how these factors interact.
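The bank's question is the kind of probabilistic estimate the text describes. A conventional computer can already attack it by brute-force sampling, as in the Monte Carlo sketch below; every probability and dollar amount here is a made-up illustration, not from the article. The hoped-for quantum advantage would come on problems where the space of interacting scenarios grows too large to sample this way.

```python
import random

def simulate_balance(start=10_000.0, trials=100_000, seed=1):
    """Average projected balance after one year of uncertain events.

    All event probabilities and amounts below are invented for
    illustration only.
    """
    random.seed(seed)
    total = 0.0
    for _ in range(trials):
        balance = start
        if random.random() < 0.30:   # chance of a raise
            balance += 5_000
        if random.random() < 0.05:   # chance of losing the job
            balance -= 8_000
        if random.random() < 0.10:   # chance the teenager crashes the car
            balance -= 3_000
        total += balance
    return total / trials

print(round(simulate_balance()))  # close to 10,800: 10,000 + 1,500 - 400 - 300
```

Each trial rolls the dice once per uncertain event and averages the results; the estimate converges toward the expected value of 10,800 as the number of trials grows.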
To be clear, nobody has yet written a program that makes financial projections on a quantum computer. One reason is that, until now, there haven’t been any quantum computers to try them out on. But after a lot of work, that’s changed. Over the last few years, corporate, academic, and government groups have built machines that can isolate and manipulate particles or other types of qubits well enough to handle basic programs.
It takes exacting precision and extreme conditions to isolate and control qubits. Some quantum computers freeze solid-state circuits to close to absolute zero. Others use electric fields to levitate atoms in a vacuum purer than deep space, while using lasers to manipulate them with an accuracy of 1/10,000 the width of a human hair. These atomic qubits in particular can scale to much larger systems because they are all the same isolated atomic element, perfectly replicable, and they are so well isolated that they never reveal their qubit states until forced to.
In 3-5 years, these machines will perform certain calculations that would not be possible using ordinary computers. But it may be 5-10 years before any of these machines have the capacity and accuracy to solve useful problems. Along the way, I worry that some who read about quantum computing being the next big thing will feel let down and lose interest. We can’t let that happen. Government needs to continue to support basic research, as Congress did in passing the National Quantum Initiative Act last year. And the industrial community needs to start working with the current generation of quantum computers so they can develop the know-how and the software that will give them an edge as the technology improves.
Even then, you won’t have a quantum computer on your desk or in your pocket. But you may start to see better drugs, more flexible materials, and organizations running more efficiently. All that will definitely be wow worthy.
Christopher Monroe is the Bice Zorn Professor of Physics and Distinguished Professor at the University of Maryland and co-founder and CEO of IonQ, a quantum computing startup.