
Basic Quantum Research Will Transform Science and Industry

All most people hear about is quantum computing, but that's hardly the whole story

This article was published in Scientific American's former blog network and reflects the views of the author, not necessarily those of Scientific American.


The promise of quantum computing seems limitless—faster internet searching, lightning-quick financial data analysis, shorter commutes, better weather prediction, more effective cancer drugs, revolutionary new materials, and more. But we’re not there yet. Focusing on narrow benchmarks, such as how many quantum bits, or qubits, the latest computers have (not many), creates a myopic snapshot of a vast technical landscape. The goal goes beyond faster computers to encompass innovations spread broadly across quantum information science, materials, and technologies, such as quantum sensors—a wide field indeed.

Focusing narrowly on computing won’t accelerate the arrival of quantum supremacy—the tantalizing promise of a future when quantum computers surpass classical computers in computational tasks of practical importance. That will come only from wide-spectrum research and development spanning fundamental quantum mechanics, information science, materials science, computer science, and computer engineering, among other fields.

The best approach puts science first. Solving basic science problems in quantum science across all its complexities will lay the foundation for an array of future technologies and enable transformative scientific and industrial progress. And make no mistake, those technologies will be major drivers of scientific advancement, the economy, and even national security.




Some day.

To say quantum computing is in its infancy is an overstatement. It’s still in the womb. The field is sorting out basic questions about the architectures and technologies for creating and controlling qubits. Qubits are the fundamental processing units of quantum computers, and regardless of the method used to make them, they still don’t maintain their “quantumness” long enough to perform tasks much beyond proof-of-concept computations.

The challenge is built into the inherent weirdness of quantum physics itself, which has puzzled the world’s most brilliant minds for more than 100 years. Basic questions about how particles behave in the subatomic realm—behavior that enables quantum computing—remain unanswered. The eventual answers will fill in huge blank spots in our understanding of the most fundamental workings of the universe.

That’s also what makes quantum science so exciting.

Lacking answers, scientists still debate what makes a quantum computer quantum. Are the conditions of entanglement, superposition, and interference all required? Entanglement is touted as the key ingredient, but that has not been proven. It appears indispensable in some cases, but not in others.

Entanglement occurs when multiple particles can only be described by a global, not an individual, state. This is analogous to reading a book in which the individual pages make no sense, but information emerges once you've read them all. Classical computers struggle mightily to represent entanglement, which severely limits their ability to simulate quantum systems such as drug molecules or superconducting materials. This is one reason we need quantum computers.
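
As a minimal mathematical sketch (standard textbook notation, not drawn from the article itself), a two-qubit Bell state shows what "global, not individual" means: the pair has a definite joint state even though neither qubit alone does.

```latex
% A Bell state: the joint state is definite, but it cannot be
% factored into a state for qubit A times a state for qubit B.
\[
  |\Phi^{+}\rangle \;=\; \frac{1}{\sqrt{2}}\bigl(|0\rangle_A |0\rangle_B + |1\rangle_A |1\rangle_B\bigr)
\]
% No single-qubit states |\psi\rangle_A and |\phi\rangle_B satisfy
% |\Phi^{+}\rangle = |\psi\rangle_A \otimes |\phi\rangle_B, which is the
% formal sense in which the "pages" only make sense read together.
```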

Superposition stems from the wave-particle duality of quantum objects such as electrons, photons, ions, and atoms. Each is described by a wave function of probabilities for its observable properties, such as position, spin, polarization (for a photon), or angular momentum. A particle, or qubit, can occupy many states at once. A qubit can be "read" much like a classical bit, yielding a 0 or a 1, but before readout it holds a continuum of possible superpositions of 0 and 1, and that extra structure is what quantum algorithms exploit to speed up computation.
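
In standard notation (again a textbook illustration, not taken from the article), a single qubit's state is a weighted combination of the two classical readouts:

```latex
\[
  |\psi\rangle = \alpha|0\rangle + \beta|1\rangle,
  \qquad |\alpha|^{2} + |\beta|^{2} = 1
\]
% Measurement yields 0 with probability |\alpha|^2 and 1 with
% probability |\beta|^2; the continuum of possible (\alpha, \beta)
% pairs is the extra structure quantum algorithms exploit.
```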

Decoherence, the nemesis of quantum computing, strikes when environmental factors break down the quantum state. Loosely described as "noise," those factors include entanglement with the external environment or heat. Measuring the value of a qubit also collapses the wave function, and the qubit has to be set up again, like pressing the "clear" button on a calculator.
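
A short classical simulation can make the "collapse and reset" picture concrete. This is a toy sketch in plain NumPy, my illustration rather than anything from the article: measuring samples a 0 or 1 from the qubit's amplitudes, after which the stored state is replaced by the measured basis state.

```python
import numpy as np

def measure(state: np.ndarray) -> tuple[int, np.ndarray]:
    """Simulate measuring a single qubit.

    `state` is a length-2 complex vector (alpha, beta).
    Returns the observed bit and the post-measurement state,
    which has "collapsed" to |0> or |1>.
    """
    probs = np.abs(state) ** 2
    outcome = np.random.choice([0, 1], p=probs / probs.sum())
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0  # wave function collapses to the observed basis state
    return outcome, collapsed

# Equal superposition: |psi> = (|0> + |1>) / sqrt(2)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
bit, psi_after = measure(psi)
print(bit, psi_after)  # e.g. 1 [0.+0.j 1.+0.j]; the superposition is gone
```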

Because we still do not fully understand how all this works, large-scale quantum computing will remain elusive until science delves deeper into the quantum world. Science is rooted in theory, which is then tested by experiment, which in turn refines theory and generates more experiments. As results solidify, practical applications emerge in technology.

In quantum research, for example, we have already seen that scenario play out with the no-cloning theorem. Formulated in the early 1980s by Wojciech Zurek of Los Alamos National Laboratory and William Wootters, formerly of Williams College, the theorem states that an unknown quantum state cannot be exactly copied. In recent years, Los Alamos has developed a quantum-key-distribution device based on this principle for creating hack-proof communications, a major step forward in cybersecurity.
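
The theorem's core argument is short enough to sketch here (the standard linearity argument, not quoted from the article): a universal copying machine would have to be linear, and linearity forbids it.

```latex
% Suppose a unitary U copied arbitrary states onto a blank register:
%   U(|\psi\rangle|0\rangle) = |\psi\rangle|\psi\rangle  for every |\psi\rangle.
% Applied to a superposition |\psi\rangle = \alpha|0\rangle + \beta|1\rangle,
% linearity gives
\[
  U\bigl((\alpha|0\rangle + \beta|1\rangle)|0\rangle\bigr)
  = \alpha|0\rangle|0\rangle + \beta|1\rangle|1\rangle,
\]
% whereas a true copy would be
\[
  (\alpha|0\rangle + \beta|1\rangle)\,(\alpha|0\rangle + \beta|1\rangle)
  = \alpha^{2}|0\rangle|0\rangle + \alpha\beta|0\rangle|1\rangle
  + \alpha\beta|1\rangle|0\rangle + \beta^{2}|1\rangle|1\rangle.
\]
% The two agree only when \alpha or \beta is zero, so no such U exists.
```

This is exactly why quantum key distribution is so attractive for cybersecurity: an eavesdropper cannot copy the key material in transit without disturbing it.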

That is one example of how basic science research ultimately spawns technology. Zurek continues his theoretical work in quantum mechanics and is currently studying the breakdown of quantum coherence of spacetime near a black hole. Such pure-science work does not address a technology challenge, but it might someday shed light on why decoherence cripples the particles that make up qubits in quantum computers.

Closer to home, even today’s limited quantum computers are a great place to test theory by simulating quantum physics, since they establish exactly the conditions we wish to study. That capability will advance physics in areas stalled by the limits of classical computing and help answer fundamental physics questions.

So, for example, one project at Los Alamos using a cloud-based, publicly available quantum computer seeks to observe how classical behavior—the reliable determinism we observe in our everyday world—emerges from quantum probabilities. The quantum-classical transition remains one of the great unsolved mysteries in science. It has direct bearing on why quantum computers lose coherence and how we can create durable qubits that maintain coherence long enough for extended calculations, paving the way for large-scale quantum computers.

On another front, research into quantum materials is vital to developing robust quantum computers and a constellation of other technologies. Across the board, the ultimate goal is to create particular, controllable quantum states that we can manipulate. That requires isolating qubits from their environment to prevent unwanted entanglement.

The various architectures being explored for quantum computing depend on different ways of creating qubits. Some computers use the states of super-cold trapped ions, while others use superconducting loops. New research is exploring qubits in defects, or voids, on the surface of solids, such as diamond crystals or atomically thin semiconductors. The work seeks to precisely place these defects where they are needed, offering a path toward controllable quantum states, robust qubits, and circuits of qubits in solid materials at room temperature—a challenging but seemingly attainable goal.

Other research exploits superposition to create quantum "atomtronic" (versus electronic) sensors that precisely detect rotations, acceleration, electromagnetic fields, and the like. The next step is exploiting entanglement. Then a particle needs to hit only one atom of the detector to collapse the superposition and—click!—record a measurement. Related research has "painted" matter-wave guides that work like fiber-optic circuits but are much more sensitive. They could be used to create an "atomtronic" gyroscope, which might one day enable navigation independent of GPS.

While theory and basic research march forward and tech giants press ahead exploring the fundamental architecture of quantum computers, the key to extracting their full potential will be algorithms, the sets of instructions that tell the computer what to do. They must exploit the unique features of quantum computers without succumbing to the inherent tiger traps of the quantum world. While the evidence is highly suggestive, we do not yet know which classes of algorithms may be uniquely enabled by quantum computing.

In work typical of the synergy among quantum theory, quantum information science, and computing, scientists at Los Alamos have adapted algorithms from condensed-matter physics to a new purpose: discovering and developing robust algorithms for the noisy, problematic small-scale quantum computers available today. Related research applies quantum machine learning, an exotic strategy in which the quantum computer itself learns to adapt its own algorithms, to perform accurate calculations despite that noise.
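
To give a flavor of what such hybrid quantum-classical algorithms look like, here is a deliberately simplified sketch in plain NumPy that simulates the "quantum" part classically; it is my illustration, not code from Los Alamos. A classical optimizer tunes a circuit parameter so that a noisy measured expectation value is driven to its minimum, which is the basic loop behind variational and quantum-machine-learning methods.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_expectation(theta: float, shots: int = 200) -> float:
    """Estimate <Z> for the one-qubit state Ry(theta)|0> from noisy "shots".

    The exact value is cos(theta); we emulate hardware noise by
    sampling +/-1 measurement outcomes instead of computing it directly.
    """
    p0 = np.cos(theta / 2) ** 2            # probability of measuring 0
    samples = rng.choice([1, -1], size=shots, p=[p0, 1 - p0])
    return samples.mean()

# Classical optimizer: crude finite-difference gradient descent on theta.
theta, lr, eps = 0.3, 0.4, 0.1
for step in range(50):
    grad = (noisy_expectation(theta + eps) - noisy_expectation(theta - eps)) / (2 * eps)
    theta -= lr * grad

print(theta)  # should wander toward pi, where <Z> = cos(theta) = -1 is minimal
```

The point of the sketch is the division of labor: the "quantum computer" only ever returns noisy measurement statistics, and the classical loop must learn a good parameter setting in spite of them.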

Quantum computing gets the lion’s share of media attention because of the extravagant claims for its potential: it will render current cybersecurity systems obsolete, it will process vast streams of data in the blink of an eye, it will enable artificial intelligence to surpass human intelligence. Some of that might come true, some might not. Only a firm commitment to broad-based quantum research will tell. In any case, that research will lead to unexpected insights, new solutions to old challenges, and surprising benefits to national economic competitiveness, national security, and everyday life.