Answer Geek: What Is a Quantum Computer?
-- Q U E S T I O N: I’ve been hearing a lot about quantum technology and the possibility of developing a quantum computer thousands of times more powerful than any computer currently in existence. Could you describe, in layman’s terms, what a quantum computer is (or will be)?
— Jeff Byrne
A N S W E R: Oh boy. Here we go. Quantum computers. Answer Geeks hate questions about quantum computers. You want to know why? Because that means delving into the inexplicable world of quantum mechanics, where the normal laws of physics aren’t applicable, a single object can be said to exist in many states — every state — at once, and all is counterintuitive. Give us poor Answer Geeks gears and pistons. Or bits and bytes. Anything where objects act on each other in predictable ways and cause and effect is easy to understand. But not quantum computers.
Atomic Power
Instead of the simple yes-or-no paradigm of conventional computers — where 1 is 1, 0 is 0 and everything gets crunched through a bunch of silicon switches in a nice, linear fashion — quantum computers use the fundamental building blocks of atoms themselves, subatomic particles such as electrons, protons and photons, and exploit the bizarre quantum rules that govern their behavior. These particles spin, so if the spin is one direction — up, for example — that could be the equivalent of the 1 in a conventional computer, while a particle with a down spin could be a 0. Simple so far.
According to the laws of quantum physics, though, there is no way to know with certainty whether a particle has an up spin or a down spin or something in between, so it is said to possess all of those properties at the same time. This is called superposition, and it turns out to have very interesting ramifications for certain kinds of computing, because while the bit in a conventional computer must equal either 1 or 0, a quantum bit — or “qubit” — can be both 1 and 0 at the same time.
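To make the idea a little more concrete, here is a toy, strictly classical sketch (the function name `measure` and the amplitude numbers are ours, not from any real quantum library): a qubit can be pictured as a pair of amplitudes, one for 0 and one for 1, and reading it out collapses it to a single definite value.

```python
import random

# Toy picture of one qubit as a pair of amplitudes (a, b) for the
# states 0 and 1. Measuring collapses the superposition: you get
# 0 with probability a**2 and 1 with probability b**2.
def measure(a: float, b: float) -> int:
    return 0 if random.random() < a * a else 1

# An equal superposition: both outcomes are equally likely.
a = b = 2 ** -0.5
samples = [measure(a, b) for _ in range(10)]
print(samples)  # a random mix of 0s and 1s
```

This is only an analogy — a real qubit's amplitudes can interfere in ways no classical simulation of single coin flips captures — but it shows why a qubit holds "both answers" right up until you look.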
The great thing about this is that as you add qubits, the number of possibilities that are represented grows exponentially. Two qubits can exist simultaneously as the four possible two-bit numbers (00, 01, 10, and 11). With three qubits, you can represent all of the possible combinations of a three-bit number (000, 001, 010, 011, 100, 101, 110, and 111) all at the same time. String together 40 qubits and you have the binary representation for every number up to and beyond a trillion. All at the same time. And you can perform operations on every one of those different numbers. All at the same time. So what you have is massive parallel processing.
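The counting here is easy to check for yourself — each extra qubit doubles the number of simultaneous states (the helper name `states` below is just ours for illustration):

```python
# Each added qubit doubles the number of values held at once,
# so n qubits span 2**n distinct numbers simultaneously.
def states(n_qubits: int) -> int:
    return 2 ** n_qubits

print(states(2))   # 4 two-bit numbers: 00, 01, 10, 11
print(states(3))   # 8 three-bit numbers
print(states(40))  # 1099511627776 -- just over a trillion
```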
That’s great for jobs like factoring, which is easy enough for small numbers (remember this? — 20 = 2 x 2 x 5). But if the numbers get to be large enough, factoring requires huge amounts of computing power. So how good is a quantum computer at this sort of thing? In theory, it could factor a 400-digit number in about a year. If that doesn’t sound impressive to you, consider this — it would take the fastest supercomputer in existence more than a billion years to accomplish the same feat.
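You can see why the problem scales so badly from the most obvious classical approach, plain trial division — a minimal sketch of our own, not anything a real codebreaker would use:

```python
# Naive trial-division factoring: fine for 20 = 2 x 2 x 5,
# hopeless for 400-digit numbers, which is exactly the point.
def factor(n: int) -> list[int]:
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # pull out each prime as many times as it divides n
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever is left over is itself prime
        factors.append(n)
    return factors

print(factor(20))  # [2, 2, 5]
```

The trial divisor can climb as high as the square root of n, so every few extra digits multiplies the work — which is why classical factoring of very large numbers is considered intractable.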
Not There Yet
But there are some huge problems to overcome before anyone builds a quantum computer that can do that. One of the biggest issues is that quantum systems are extremely delicate, and any interaction with the surrounding environment will collapse the highly desirable state of superposition down to a single definite state. That interaction could be as simple as one atom bumping up against another. So if you set out to build a quantum computer, you face a quandary: How can you protect qubits from outside interference, but leave yourself a way to make them perform calculations and then read out the answer? Another problem is that the nature of the way particles spin within atoms means that finding, or creating, molecules that contain more than a handful of qubits is quite tricky.
Despite the difficulties, a number of important breakthroughs have been achieved in recent years. A decade ago, most scientists thought that quantum computing was a cool theoretical idea that would never actually work in practice. In 1994, a researcher named Peter Shor came up with an algorithm that would allow a quantum computer to factor large numbers quickly. This raised a great deal of interest because most encryption systems are based on the difficulty involved in doing just that. The fact that quantum computers could be used to crack even the most sophisticated security and encryption systems made a lot of people take the idea more seriously.
In 1996, another breakthrough occurred when researchers realized that nuclear magnetic resonance could be used to both manipulate qubits and then observe the results, all without collapsing their superposition. NMR is the same technology used in hospital MRI machines to take images of soft tissue. That led to a working quantum computer, which contained two qubits and was able to run Grover’s search algorithm, a classic search problem that is analogous to hiding a prize behind one of four doors and then finding it while opening as few doors as possible. On average, a conventional computer must open slightly more than two doors to find the prize, because it has to open each one in succession. The quantum computer finds the prize with just one operation because it can open all four doors at once.
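The classical average is easy to verify with a few lines (a sketch of ours, assuming the obvious strategy of opening doors in order, and noting that if the first three are empty the prize must be behind the fourth):

```python
# Classical search over four doors: open them in order until the
# prize turns up. If the first three are empty, no need to open
# the fourth -- the prize must be there.
def doors_opened(prize_door: int) -> int:
    return min(prize_door, 3)  # prize_door is numbered 1..4

average = sum(doors_opened(d) for d in range(1, 5)) / 4
print(average)  # 2.25 -- "slightly more than two doors" on average
```

A quantum computer running Grover's algorithm on the same four-item search needs just one query, because it can probe all four doors in superposition.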
Saving Moore’s Law
Progress continues. This past August, IBM announced a 5-qubit quantum computer that used a specially designed molecule consisting of five fluorine atoms held in a small glass tube. Radio frequency pulses from an NMR machine allowed the scientists to “program” the molecule, while the imaging capabilities of the machine let them “see” the results.
It will be a while before quantum computing becomes more than a subject of laboratory research. Scientists say they’ll have to be able to build a quantum computer with dozens of qubits before they’ll be able to tackle real-world computing problems more efficiently than a conventional computer, and so far it looks like 10 qubits is the limit with current NMR techniques.
If they ever figure out how to go beyond 10 qubits, there’s still not much chance you’ll settle in at your desk and fire up your very own personal quantum computer. The kind of massively parallel processing that quantum computing excels at isn’t all that useful when it comes to many of the day-to-day tasks most of us use a computer for, such as word processing. What it is really good for, however, is jobs like finding items in a database. When researchers involved with quantum computing look ahead to the day when it is commercially viable, many of them picture co-processors that handle specific computing tasks such as database lookup, in much the same way that today’s computers use math co-processors and graphic cards to handle certain tasks.
As far-fetched as quantum computing sounds, there is a certain air of inevitability to its appearance. The reason: Moore’s Law, which says the transistors on a computer chip will shrink by a factor of two every couple of years, doubling processing speeds. The problem is that by about 2020, we’ll reach the point where 1 bit is represented by 1 atom and there will be no way to make a conventional computer chip any smaller or any faster. If things go well, quantum computing will step in to pick up the slack and save Moore’s Law just in the nick of time.
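The 2020 figure checks out as rough arithmetic. Assuming ballpark numbers of our own choosing — chip features around 180 nanometers at the turn of the millennium and an atom around 0.2 nanometers — halving the feature size every two years runs out of atoms right on schedule:

```python
import math

# Hypothetical ballpark figures: ~180 nm chip features circa 2000,
# an atom roughly 0.2 nm across.
feature_nm, atom_nm = 180.0, 0.2

halvings = math.log2(feature_nm / atom_nm)  # ~9.8 halvings to reach atom scale
years = halvings * 2                        # one halving every two years
print(round(2000 + years))                  # 2020
```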
Todd Campbell is a writer and Internet consultant living in Seattle. The Answer Geek appears weekly, usually on Thursdays.