
Quantum of science: Exploring little particles with big promise

D-Wave's 2000Q quantum computer. Image credit: D-Wave Systems

You could say that the quantum science revolution began with a lightbulb, or at least the challenge of making a more efficient one.

The 19th century was closing down, Thomas Edison had patented a longer-lasting incandescent bulb, and each of the world’s industrial powers wanted to be at the forefront of this rapidly expanding technology.

Unfortunately for them, they weren’t quite sure how a lightbulb worked. They knew, of course, that any object will glow if it gets hot enough—whether it be a lightbulb filament or a fire iron—and will cycle through a well-known sequence of colors as its temperature rises, from red to orange to yellow and, eventually, blue-white. But they could not divine the relationship between temperature and color, especially at higher frequencies. As a result, scientists and inventors were hard-pressed to design a bulb that maximized the light it emitted while minimizing the amount of electricity it required.

Planck starts the quantum ball rolling

The answer came from German physicist Max Planck. In 1900, Planck developed a theoretical foundation for the relationship between temperature and color using what he assumed was nothing more than a mathematical trick. It was commonly understood at the time that light traveled in waves rather than particles, yet to come up with his answer, Planck was forced to assume that light was emitted in little packets.

With this approach, he launched the field of quantum mechanics, though he didn’t realize it at the time. Instead, he and his colleagues saw his approach as a stopgap that would soon yield to the real answer.

Einstein jumps in

The next step in the evolution of quantum science came from Albert Einstein. In 1905—the same year he introduced both special relativity and science’s most famous equation, E=mc²—Einstein published a paper suggesting that Planck’s insight was more than mathematical sleight of hand and that light was indeed absorbed and emitted in little packets, or quanta. These quanta would come to be known as photons.

It took several years, but the scientific community eventually accepted Einstein’s light quanta and the implications that came with them. As a result, Planck, Einstein and colleagues would uncover a new set of rules for matter at the scale of atoms and smaller.

At this scale—the nanoscale—the world becomes very strange:

  • Particles behave like waves.
  • It’s impossible to know both the exact position of a particle and its momentum at any given time.
  • A particle can be in two independent states at the same time.
  • Particles can become entangled so that a change to one will affect others, even if they’re hundreds of miles apart.

As one of the field’s pioneers, Niels Bohr, put it, “Anyone who is not shocked by quantum theory has not understood it.”

Quantum mechanics in the 21st century

For all its strangeness—or perhaps because of it—quantum mechanics holds enormous technological promise. Computers built on the unique rules of quantum mechanics have the potential to solve problems that are literally unsolvable on even the most powerful traditional computers. Quantum key distribution seems poised to bolster information security, and materials built on the interactions of quantum particles form the basis of extremely sensitive detectors.

Governments are recognizing the value of quantum science as well. On Dec. 21, 2018, President Trump signed the National Quantum Initiative Act, which calls for a 10-year plan and $1.25 billion to beef up the country’s efforts in quantum information science and technology.

Introducing the qubit

Quantum computers are built on qubits (pronounced CUE-bits), which are at once analogous to the bits that control your home computer and fundamentally different. While a bit can have only one of two values—typically expressed as 0 or 1—a qubit can be 0, 1, or a superposition of both. So while a bit is limited to two values, a qubit can occupy any of essentially an infinite number of states.
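The superposition idea can be made concrete with a toy simulation. The sketch below is plain Python written for illustration—the function is mine, not part of any real quantum library. It treats a qubit as a pair of amplitudes for 0 and 1; measuring the qubit collapses the superposition to a single classical bit, with probabilities set by the squared amplitudes.

```python
import math
import random

def measure(alpha, beta):
    """Toy Born-rule measurement: return 0 with probability |alpha|^2,
    otherwise 1. The amplitudes must satisfy |alpha|^2 + |beta|^2 = 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition: alpha = beta = 1/sqrt(2), so measurement
# yields 0 and 1 each about half the time.
alpha = beta = 1 / math.sqrt(2)

random.seed(0)
samples = [measure(alpha, beta) for _ in range(10_000)]
print(sum(samples) / len(samples))  # hovers around 0.5
```

Each run yields only a single bit; the "infinite number of states" lives in the amplitudes, which set the statistics of many measurements rather than any one outcome.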

Another odd characteristic of qubits and other quantum particles is that they can be entangled. Entanglement can be useful for a researcher performing a quantum calculation, but it is especially valuable for secure communications.

“Entanglement is monogamous,” explained ORNL Quantum Communications Team lead Nicholas Peters. “If entanglement is maximal between two qubits, then any third qubit cannot be correlated to the first two. This property allows us to perform quantum key distribution without a third party being able to guess the key, which enables secure communications.”
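The correlated outcomes Peters describes can be illustrated with a deliberately simplified toy. This sketch mimics only one face of entanglement—the perfect agreement of a Bell pair measured in a single basis—and the function name is hypothetical; real entanglement cannot be reduced to a shared classical coin flip.

```python
import random

def measure_bell_pair():
    """Toy model of measuring the Bell state (|00> + |11>)/sqrt(2):
    each outcome, 0 or 1, is equally likely, but the two qubits
    always agree, no matter how far apart they are."""
    shared = random.randint(0, 1)
    return shared, shared

random.seed(1)
pairs = [measure_bell_pair() for _ in range(1_000)]
print(all(a == b for a, b in pairs))  # True: the outcomes always match
```

In quantum key distribution, it is correlations like these—combined with the "monogamy" Peters mentions—that let two parties build a shared secret key an eavesdropper cannot copy.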

Finally, qubits and other quantum particles are very, very sensitive to their environment. For a computer scientist, this is a real challenge, because anything from a solar storm to an insufficiently chilled component can destroy your calculation. But for the development of sensors, this can be a boon.

“Any kind of disturbance can knock these guys out of states that they need to be in,” ORNL materials scientist Stephen Jesse said. “Because of that, they might be really great sensors. Certainly, for very weak electromagnetic radiation—maybe really weak microwave signals that come from deep space, maybe gravitational waves.”

The promise of quantum computing …

The promise of quantum computing is especially exciting to scientists who are bound by the limits of traditional computers, even supercomputers as powerful as ORNL’s Summit system, the world’s most powerful for the past year.

“Classical computing is defined by Boolean algebra and the ability to add numbers—and do it in zeroes and ones,” explained David Dean, ORNL’s associate laboratory director for physical sciences. “But there are certain types of problems that can’t be solved on a classical computer.”

… and the challenge

To get to quantum computers that can solve these problems, however, scientists and engineers will have to overcome enormous challenges—from the fragility of quantum processors that can be rendered useless by something as seemingly inconsequential as an inadequately chilled wire to the fundamental uncertainty of any quantum system.

“The computers we have today are perfect—they rarely mess up,” Dean said. “When you add one plus one, you will always get two. The gate fidelity in a classical computer is more than 99.99999 percent correct.

“On the other hand, a two-qubit gate—which takes two qubits—is probably at 99.999 percent today. But the minute you have more than a couple of qubits, you get errors. And it’s natural, it’s inherent, because quantum mechanics is not deterministic—it allows for probabilistic distributions of the particles in some ways.”
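Dean’s fidelity figures can be put in perspective with some back-of-the-envelope arithmetic. Assuming, as a simplification, that gates fail independently, a circuit of n gates runs error-free with probability roughly fidelity raised to the power n:

```python
def circuit_success(fidelity, n_gates):
    """Probability that n independent gates all succeed: fidelity ** n."""
    return fidelity ** n_gates

# Using the rough numbers quoted in the text: over a million operations,
# a classical gate barely degrades, while a quantum gate at today's
# fidelities almost never survives.
print(circuit_success(0.9999999, 1_000_000))  # about 0.905
print(circuit_success(0.99999, 1_000_000))    # about 4.5e-05
```

This compounding is why quantum error correction—spreading one logical qubit across many physical ones—is considered essential for large-scale quantum computing.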

Computer science will have to adapt

Quantum information science will be fundamentally different not only for the machines, but for the people as well, explained ORNL quantum computer scientist Travis Humble. In fact, computer scientists will have to add a new area of expertise if they expect to work in the field.

“Even if these computers overcome the technical challenge, who’s going to use them?” Humble said. “Right now, you have to understand what a qubit is, you have to understand how to program those types of interactions between qubits, and you have to understand the algorithms, which are completely new. If we build these systems and nobody uses them, then it was all for naught.  

"We need the physicists to figure out what are the good materials to use for quantum computers. We need the engineers to actually build the devices that can scale up and have low noise levels, and we need the computer scientists to use the devices to solve real problems.”