Superpositional quantum computations exploit the fact that a coherent
quantum state is a superposition of *n* distinct states *s1, ..., sn*, each
weighted by some complex scalar *ci*. Under certain
conditions, this quantum state decoheres, and the particle adopts one
of the *si* as its determinate state, with a probability that is
determined by the ratios of the squared moduli |*ci*|^2. The idea proposed in
[Feynman, 1982] and developed in, e.g., [Deutsch, 1985], is that if such
superpositional states were used to implement the states of a
computer, then various registers or memory locations in the computer
would not be conventional bits with a determinate value of 1 or 0, but
would instead be quantum bits - *qubits* - which are
superpositions of both the 0 and 1 values.
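The state of a single qubit can be sketched as a pair of complex amplitudes over the 0 and 1 values. The following is an illustrative sketch only; the function names and representation are our own, not from the text.

```python
import math

# A qubit as a pair of complex amplitudes (alpha, beta): the state
# alpha|0> + beta|1>.  Unlike a conventional bit, it is "both" 0 and 1
# until decoherence collapses it to one value.
def make_qubit(alpha, beta):
    # Normalize so that the collapse probabilities sum to 1.
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return (alpha / norm, beta / norm)

def measurement_probabilities(qubit):
    alpha, beta = qubit
    # On decoherence the qubit yields 0 or 1 with these probabilities,
    # given by the squared moduli of the amplitudes.
    return abs(alpha) ** 2, abs(beta) ** 2

# An equal superposition of the 0 and 1 values:
q = make_qubit(1, 1)
p0, p1 = measurement_probabilities(q)  # 0.5 each
```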

A key advantage to this would be that *n* qubits could be used to
perform 2^*n* computations in parallel, one computation for each
combination of values of the superposed states. However, there are
two principal difficulties in exploiting this proposal. First, there
is the problem of maintaining the coherence (superpositionality) of a
qubit while performing computations on it: the danger is that the
physical processes needed to implement the relevant bit operations
would themselves cause the quantum state to
collapse or decohere. But even supposing one can
perform such operations while maintaining the coherence of the qubit,
there is still the difficulty of exploiting the superpositionality of
the qubit in a way that can perform effective computational work.
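This parallelism can be simulated classically: a register of *n* qubits is a vector of 2^*n* complex amplitudes, and one Hadamard gate per qubit turns a definite starting state into an equal superposition of every bit string, so that a subsequent operation acts on all combinations at once. The sketch below is our own illustration, not drawn from the text.

```python
import numpy as np

# The single-qubit Hadamard gate, which maps a definite 0 or 1 into an
# equal superposition of both values.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0  # register initialized to the definite state |000>

# Apply H to each of the n qubits: the full operator is the n-fold
# tensor (Kronecker) product of single-qubit Hadamards.
op = H
for _ in range(n - 1):
    op = np.kron(op, H)
state = op @ state

# All 2**n = 8 basis states now carry equal probability 1/8, so any
# further transformation of `state` touches every input at once.
probs = np.abs(state) ** 2
```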

The only specific idea of how this can be done was proposed by Shor
[Shor, 1994]. Shor describes how to initialize a superpositional
quantum state with a random number *x* and a number *n* to be factored
into primes. He then describes how to transform that state into
another which is such that the probability distribution of the
measurements of the state is a simple function of a number *r* (the
period of *x^a* mod *n*) which
is a reliable guide to the prime factors of *n*. A few collapses
of this system, then, allow one to calculate *r* and thus factorize
*n*. If this algorithm could be implemented in a real quantum
computational system, one could then produce the prime factors of
large (e.g., 159-digit) numbers in seconds. Since current
cryptographic technology relies on the fact that such numbers would
take a conventional computer many years to factor, Shor's algorithm has
generated much interest in quantum computation. However, it has
proven difficult to generalize this exploitation of the qubit to other
applications. The general problem of how to use a superpositional
state to do computational work remains.
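The classical half of Shor's reduction can be sketched as follows: once the period *r* of *x^a* mod *n* is known, the factors of *n* fall out of two gcd computations. Here the period is found by brute force, which is precisely the step the quantum part of the algorithm is meant to make fast; the function names are our own.

```python
from math import gcd

def period(x, n):
    # Smallest r > 0 with x**r == 1 (mod n).  In Shor's algorithm this
    # is the step performed by the quantum computer; brute force here.
    a, val = 1, x % n
    while val != 1:
        a += 1
        val = (val * x) % n
    return a

def shor_factor(n, x):
    r = period(x, n)
    if r % 2 != 0:
        return None          # need an even period; retry with another x
    y = pow(x, r // 2, n)
    if y == n - 1:
        return None          # x**(r/2) == -1 (mod n); retry with another x
    # y**2 == 1 (mod n) with y != +/-1, so gcd(y -/+ 1, n) are nontrivial.
    return gcd(y - 1, n), gcd(y + 1, n)

# Example: n = 15 with random x = 7 has period r = 4, yielding 3 and 5.
factors = shor_factor(15, 7)
```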

Wed Nov 20 01:10:59 GMT 1996