semi.org
Future Technology: Quantum Computing
by Katherine Derbyshire

In 1994, P.W. Shor of AT&T's Research Labs demonstrated that a quantum computer could factor products of large prime numbers in hours, rather than the years required by classical transistor-based computers. The discovery, which could threaten the viability of public key encryption schemes, sparked enormous interest and launched quantum computing into the public eye.
Quantum computing offers potential solutions to problems that are intractable for conventional computers. Actually realizing a quantum computer, however, is fiendishly difficult. While no immediate breakthroughs are apparent, quantum computers may be part of later steps on various roadmaps to achieve ever-higher computational throughput in smaller spaces.
According to Andrew Steane at the University of Oxford, a quantum computer is one that relies on quantum entanglement to store and manipulate information. The mere presence of quantum effects does not create a quantum computer, though. Molecular scale transistors, such as the one demonstrated recently at IBM, still behave as transistors, not quantum devices. While single atoms trapped in quantum dots are the basis of one proposed implementation, quantum dots by themselves do not perform quantum computation.
To understand what a quantum computer is, it helps to look at one of the simplest quantum systems, a hydrogen atom.
Hydrogen Qubits
Suppose that a single hydrogen atom is completely isolated from other atoms. The atom's one electron behaves as a wave, and can be described by a probability distribution surrounding the atomic nucleus. The electron could be anywhere within that distribution at any particular instant. The electron is said to be in a superposition of all of its possible states. In particular, the electron spin is in a superposition of the possible spin up and spin down states. In isolation, both spin states are equally likely. We can describe this situation with the shorthand notation |0>+|1>.
Actually measuring the spin localizes the electron within its probability distribution, forcing it to behave like a particle. The measurement breaks the superposition of states, giving us either |0> or |1> as a result.
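To make this concrete, here is a minimal numerical sketch (the measure helper is illustrative, not tied to any physical apparatus): a qubit is a normalized pair of complex amplitudes, and measuring it returns 0 or 1 with probabilities given by the squared magnitudes of those amplitudes.

    import numpy as np

    # One qubit as a normalized pair of complex amplitudes; |0>+|1> stands
    # for the equal superposition (|0> + |1>)/sqrt(2).
    ket0 = np.array([1, 0], dtype=complex)
    ket1 = np.array([0, 1], dtype=complex)
    superposition = (ket0 + ket1) / np.sqrt(2)

    def measure(state, rng=np.random.default_rng()):
        """Projective measurement in the {|0>, |1>} basis: returns 0 or 1 with
        probabilities |amplitude|**2, destroying the superposition."""
        return 0 if rng.random() < abs(state[0]) ** 2 else 1

    # Any single measurement gives one definite answer; only the statistics
    # over many identically prepared qubits reveal the 50/50 superposition.
    results = [measure(superposition) for _ in range(1000)]
    print(results.count(0), results.count(1))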
A system might have many such isolated hydrogen atoms, each of whose electrons might be in a similar superposition of states. A single measurement might collapse these superpositions (known as qubits) into a series of up and down spins (|0>,|1>,|1>,|0>, and so forth). Manipulating the state of one qubit leaves the others untouched. This system is not a quantum computer because the qubits do not interact with each other.
Next, consider a two-atom hydrogen molecule. This molecule includes two electrons, each in a superposition of the spin up and spin down states, with overlapping probability distributions. We can describe the combined system with the notation |00>+|11>, and say that the two qubits are entangled. Applying a magnetic field to one physical region of space changes the probability distribution in that region. Both electrons are affected because both of them include the affected region in their superposition of states.
The secret to quantum computing lies in the difference between manipulation and measurement of qubits. Manipulation changes the probability of each state in the superposition, while leaving the superposition itself intact. Measurements, on the other hand, break the superposition of states by localizing the qubits within their probability distributions. Mathematically, manipulations are "unitary transformations" of the system, while measurements are "projections." A quantum computer works by applying a succession of unitary transformations to one or more qubits, then applying a projection to read out the results at the end.
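The distinction can be illustrated in ordinary linear algebra (a sketch, not a description of any particular machine): a manipulation is a unitary matrix applied to the state vector, while a measurement corresponds to a projection followed by renormalization.

    import numpy as np

    # Manipulation: a unitary matrix (here the Hadamard gate H) rotates the
    # state while preserving the superposition and the total probability.
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    state = np.array([1, 0], dtype=complex)        # start in |0>
    state = H @ state                              # now (|0> + |1>)/sqrt(2)
    print(np.allclose(H @ H.conj().T, np.eye(2)))  # True: H is unitary

    # Measurement: a projection onto one basis state discards the rest of
    # the superposition.
    P0 = np.array([[1, 0], [0, 0]], dtype=complex)  # projector onto |0>
    collapsed = P0 @ state
    collapsed /= np.linalg.norm(collapsed)          # renormalize
    print(collapsed)                                # [1, 0]: the qubit is now |0>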
Fast Factorization
At this point, practical readers are probably wondering why anyone would go to all that trouble instead of just using transistors. What can a quantum computer do that a classical computer cannot? The answer lies in the entanglement of qubits.
For example, consider a many-body problem such as calculating the effects of ion bombardment on a surface. Classical algorithms represent the system with a set of n vectors, one for each variable of interest. The resources required by such algorithms increase as 2^n, doubling for each additional variable.
A quantum computer, in contrast, can be viewed as a many-body simulator. The superposition of states that makes up a qubit is itself a set of vectors. Interactions in the system being studied can be modeled directly as transformations of qubits. The number of possible states doubles for each qubit added to the system. Accordingly, the resources required for a many-body problem increase only linearly with n, the number of particles. According to David DiVincenzo of IBM's T.J. Watson Research Center, building such simulators may be feasible with a few tens of qubits.
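The scaling argument can be made concrete with a back-of-the-envelope calculation (the figures below assume a straightforward classical state-vector representation using 16 bytes per complex amplitude).

    # Classical cost of storing the full state of n entangled qubits:
    # 2**n complex amplitudes.
    for n in (10, 20, 30, 40, 50):
        amplitudes = 2 ** n
        gigabytes = amplitudes * 16 / 1e9   # 16 bytes per double-precision complex number
        print(f"{n:2d} qubits -> {amplitudes:,} amplitudes (~{gigabytes:,.1f} GB)")

By 50 qubits the state vector alone would occupy tens of petabytes, which is why even a few tens of qubits lie beyond direct classical simulation.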
Though the statement sounds circular, quantum computers are also extremely useful for studying quantum computers. Understanding the effects of transformations of entangled states gives physicists important information about the structure of all matter. Quantum computing ties together quantum mechanics and information theory, arguably the two most important scientific developments of the 20th century. Useful experiments can be done with systems of fewer than ten qubits, DiVincenzo said.
The third use of quantum computing, and the one that has provoked such tremendous interest, is the possibility of new algorithms for problems that cannot be solved by classical computing. Shor's algorithm for factoring large numbers is one of the most commonly discussed examples. Such algorithms require functioning quantum computers able to manipulate thousands of qubits.
Public key encryption schemes are generally considered to be secure because deducing the private key requires factoring a product of two large prime numbers. Factoring large numbers, usually by a brute force search, requires exponentially more resources as the key length increases. Large keys are, for practical purposes, "unbreakable," with solutions requiring many years of computer time and millions of dollars worth of computer hardware.1
However, Shor's algorithm relies on the fact that for any composite number N, the function f(k) = y^k mod N is periodic in k with an unknown period, where y is a chosen positive integer less than N that shares no factor with N. The powers of y can be used to identify and test possible factors of N. Finding the period classically, however, requires evaluating on the order of N powers of y, so on a classical computer this approach offers no performance improvement.
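The classical part of the recipe can be sketched in a few lines (a toy illustration; the function names are ours, and the brute-force search in find_period is exactly the step that becomes infeasible for large N).

    from math import gcd

    def find_period(y, N):
        """Smallest r > 0 with y**r mod N == 1, found by brute force.
        This search is the expensive step that Shor's algorithm speeds up."""
        r, value = 1, y % N
        while value != 1:
            value = (value * y) % N
            r += 1
        return r

    def factor_from_period(N, y):
        """Use the period of f(k) = y**k mod N to propose factors of N."""
        if gcd(y, N) != 1:
            return gcd(y, N), N // gcd(y, N)      # y itself already shares a factor
        r = find_period(y, N)
        half = pow(y, r // 2, N)
        if r % 2 == 1 or half == N - 1:
            return None                           # unlucky choice of y; try another
        return gcd(half - 1, N), gcd(half + 1, N)

    print(factor_from_period(15, 7))              # (3, 5)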
A quantum computer, as Shor demonstrated, evaluates all the powers of y simultaneously. Shor's algorithm takes advantage of the periodicity of f(k), transforming the cyclic behavior of that function into enhanced probabilities for certain quantum states. Those increased probabilities are then read out to give the result.
According to a 1997 analysis done by Andrew Steane in "Quantum Computing,"2 a quantum computer with a switching rate of 1 MHz would require seven hours to factor a 130-digit number. Factoring such a number is at the limit of what classical computers can tackle. Doubling the number of digits to 260 would put the problem completely out of reach of classical computers, while an ideal quantum computer would take only eight times as long (the algorithm's running time grows roughly as the cube of the number of digits, so doubling the digits multiplies it by about 2^3 = 8).
This is not to imply that quantum computers will replace classical computers. Preliminary research seems to indicate that quantum computers offer advantages only for a very limited collection of problems. The difference between the two is analogous to the difference between classical and quantum mechanics. In the macroscopic limit, the classical approximation is accurate enough to give good results. The classical approximation is also much faster in most situations.
Designing a Quantum Computer
One of the reasons why classical computers will probably always have faster cycle times than quantum computers is the difficulty involved in actually performing quantum transformations. Manipulating and measuring the spin states of individual atoms or electrons is hard. Though the theoretical architecture of quantum computing is advancing rapidly, researchers are only beginning to devise appropriate experimental tests. Several different implementations of individual qubits exist. Some researchers have even been able to assemble systems of several qubits.
IBM's DiVincenzo has proposed five requirements for any physical implementation of a quantum computer:
First – A scalable physical system with well characterized qubits. As discussed above, a qubit is simply a two-level quantum system. Examples include the spin states of an electron or atomic nucleus, or two energy states of an atom. Well-characterized qubits will have well-known physical parameters and well-defined interactions with both neighboring qubits and the external fields used for manipulation.
Second – The ability to initialize the state of the qubits. Clearing a quantum "register" at the beginning of a calculation is equivalent to creating a state of the form |00000…>, with as many zeroes as there are qubits. How this is accomplished depends on the system. A magnetic field can be used to align all spins in a given direction. Cooling can bring ions to their ground excitation state.
Initialization is one of several factors today that tend to confine quantum computers to laboratory settings. Proposed initialization hardware involves such features as liquid helium cooling baths, laser-cooled ion traps and multi-Tesla magnets. None of these is likely to be practical for desktop use.
Third – Decoherence times much longer than the gate operation time. So far, we've ignored decoherence. Unfortunately, it's one of the most serious obstacles to any physical implementation of qubits. Decoherence is coupling between the qubit and its environment. It can occur due to collisions with atoms, stray photons, or almost any other interaction. DiVincenzo described decoherence as the emergence of classical behavior: the qubit ceases to behave as an independently manipulable entity. Steane described it as a measurement of the system, forcing the qubit to assume one possibility from its superposition of states.
Isolating a qubit from its environment indefinitely is impossible. The goal is to make the decoherence time long enough so that calculations can be performed before the system collapses. For complicated calculations, this is likely to be an unworkably stringent requirement.
Fortunately, quantum error correction turns out to be possible. Extra qubits, not involved in the calculation, can be used to detect and fix decoherence-induced errors much as ECC memory fixes broken bits today. These procedures require a continuous source of initialized qubits.
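As an illustration of the idea (a toy simulation of the standard three-qubit bit-flip code, not a specific scheme proposed here), the two parity checks below identify which qubit flipped without measuring, and therefore without disturbing, the encoded superposition.

    import numpy as np

    # Three-qubit bit-flip code simulated as an 8-amplitude state vector.
    # Logical |0> is encoded as |000>, logical |1> as |111>.

    def encode(alpha, beta):
        state = np.zeros(8, dtype=complex)
        state[0b000] = alpha
        state[0b111] = beta
        return state

    def flip(state, qubit):
        """Apply a bit-flip (X) error to one qubit (0 = leftmost)."""
        out = np.zeros_like(state)
        for basis in range(8):
            out[basis ^ (1 << (2 - qubit))] = state[basis]
        return out

    def syndrome(state):
        """Parities of qubit pairs (0,1) and (1,2). For a code state with at
        most one flip these parities are definite, so reading them does not
        collapse the encoded superposition."""
        basis = next(b for b in range(8) if abs(state[b]) > 1e-12)
        bits = [(basis >> 2) & 1, (basis >> 1) & 1, basis & 1]
        return bits[0] ^ bits[1], bits[1] ^ bits[2]

    corrupted = flip(encode(0.6, 0.8), qubit=1)          # decoherence flips the middle qubit
    which = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome(corrupted)]
    repaired = flip(corrupted, which) if which is not None else corrupted
    print(repaired[0b000], repaired[0b111])              # amplitudes 0.6 and 0.8 restored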
Fourth – A universal set of quantum gates. In classical computing, a gate is a logical operation such as AND, OR or NOT. Each gate, made up of several transistors, delivers a given output for each possible input. Binary logic gates take two input bits, each of which can be 0 or 1, and return one output bit, which also can be 0 or 1. For example, the AND gate returns 1 only if both of the input bits are 1.
A two-input, one-output gate can implement any of 16 possible truth tables, but most of these can be constructed from chains of other gates. For binary logic, it turns out that the NAND (NOT AND, see table) gate alone is sufficient.
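A quick sketch shows why NAND alone suffices: NOT, AND and OR can all be chained out of it (the helper names below are ours).

    def nand(a, b):
        return 0 if (a and b) else 1

    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "NAND:", nand(a, b), "AND:", and_(a, b), "OR:", or_(a, b))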
Quantum gates, like their classical analogs, are the building blocks of computation. However, quantum gates must be implemented in a way that depends only on unitary manipulations of the initial qubits. In particular, quantum operations must be reversible.
The set of potential quantum gates is infinite, corresponding to the set of all unitary transformations. Not all of these will be realizable in a particular physical system, however. In many proposed systems, only some of the interactions between neighboring qubits can be turned on and off.
A. Barenco and coworkers showed that a full complement of quantum logic operations can be constructed from operations on one qubit (such as NOT), plus the controlled NOT (cNOT, see Fig. 1) gate.3 The cNOT gate applies NOT to a target qubit only if the control qubit is in state |1>. Their result greatly reduces the required gate vocabulary. Still, not all proposed quantum computer designs can implement the cNOT gate or its equivalent.
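Written out as a matrix on the two-qubit basis |00>, |01>, |10>, |11> (control qubit first), the cNOT gate flips the target only when the control is |1>; applied to a control qubit in superposition, it produces exactly the entangled |00>+|11> state discussed earlier. A small numerical sketch:

    import numpy as np

    # cNOT on two qubits, basis ordered |00>, |01>, |10>, |11> (control first).
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    labels = ["|00>", "|01>", "|10>", "|11>"]
    for i, label in enumerate(labels):
        basis = np.zeros(4, dtype=complex)
        basis[i] = 1
        print(label, "->", labels[int(np.argmax(abs(CNOT @ basis)))])
    # The target qubit flips only when the control qubit is |1>.

    # Control in superposition: (|00> + |10>)/sqrt(2) becomes (|00> + |11>)/sqrt(2).
    print(CNOT @ (np.array([1, 0, 1, 0], dtype=complex) / np.sqrt(2)))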
Fifth – Ability to measure specific qubits. Finally, all this effort is useless without a way to read the result of a computation. It must be possible to measure the state of a specific qubit, independent of its neighbors and without changing the state of the rest of the system. Measuring, for example, the spin state of a single electron is not easy. One solution, used by nuclear magnetic resonance (NMR)-based qubits, is to run several copies of a given calculation simultaneously. The result is then an average over the whole ensemble. This kind of measurement is well established in condensed matter physics, so it's not surprising that the largest qubit systems reported so far use this approach.
Building a Quantum Computer
NMR, used by chemists to study complex molecules, depends on nuclear spins. Normally, opposing spins are present in equal numbers. An applied magnetic field encourages spins to align parallel to it, creating a tiny imbalance between the parallel and anti-parallel states. Magnetic pulses can flip spins between states. One nuclear spin therefore constitutes one qubit. Nuclei within a molecule affect each other, so a molecule with several nuclear spins can form an array of interacting qubits: a quantum gate.
Instead of trying to control the behavior of a single cryogenically cooled molecule, NMR acts on a test tube full of the appropriate liquid. For small systems, this built-in redundancy avoids data loss due to decoherence. Operating large systems at room temperature is difficult, however. The even thermal balance between spin up and spin down states makes initialization difficult. Thermal noise becomes more of a problem as the number of spins increases. Cooling to cryogenic temperatures doesn't help, either. If the liquid qubits freeze, then solid-state behavior dominates the system.
According to DiVincenzo, researchers at IBM have produced NMR-based systems of up to seven qubits. Still, he believes that NMR designs are unlikely to scale to the large numbers of qubits needed for computations.
Qubits using solid-state spin structures are promising, DiVincenzo said. Researchers have not yet demonstrated solid-state qubits, but are making steady progress on necessary materials work. A group at the University of New South Wales (Australia), for example, has demonstrated placement of single dopant atoms using scanning tunneling microscopy.
Josephson junctions also offer interesting possibilities. A Josephson junction is made by sandwiching a non-superconducting layer between two pieces of superconducting material. A SQUID (superconducting quantum interference device) – a loop with two Josephson junctions – can detect single magnetic flux quanta through the loop. Thus, the SQUID can be set into a 0 or 1 state and switched by a magnetic flux.
Future Outlook
While significant progress has been made in the last few years, seven qubits is still a long way from practical applications of quantum computing. According to DiVincenzo, a recent poll found that researchers in the field don't expect to see functioning computers for 20 or 30 years. Along the way, potential designs will have to reach several important milestones.
First, there is so far no qubit equivalent to a single transistor. All of quantum computing depends on a stable, noise-free qubit that can be manipulated as needed.
Once qubits exist, further research requires two-qubit cNOT gates (or the equivalent). It must be possible to create such gates reliably and repeatably.
From gates, the next step is to scale up to large enough numbers of gates for real computations. As the NMR case shows, large systems suffer from progressively more serious problems with initialization and control.
These milestones will require significant research in a number of areas. New materials techniques are especially critical, DiVincenzo said, as solid-state qubits require single atom impurity control. Spin physics is not yet well understood, particularly relative to the charge physics used by more familiar devices.
To summarize, quantum computers are unlikely to replace binary logic in the near, or distant, future, except for a small set of specialized problems. Still, the research they inspire could have profound effects on physics and on all electronic devices.
About the Author
Katherine Derbyshire is currently writing a major market study on sub-0.25-micron lithography. She recently founded Thin Film Manufacturing, a consulting firm helping the industry manage the interaction between business forces and technology advances. Previously, she was managing editor of Semiconductor Online from 1998 to 2001, and chief technical editor of Solid State Technology from 1994 to 1998. She may be reached at kderbyshire@thinfilmmfg.com.