**Ruth Rutter**

CS 441, 13 October 2009

Qubits are the basic unit of quantum computers. While a bit can only be 1 or 0, a qubit can be 0, 1, or a “superposition” of both at the same time. This is a delicate storage method, and there have been many attempts at building a quantum computer, including the use of "ions, electrons, superconducting circuits, and photons" [Spectrum IEEE]. However, a practical quantum computer remains a goal to be worked toward.
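To make superposition concrete, here is a minimal sketch (not from the source; the function name `measure` and the sample counts are illustrative): a qubit in state α|0⟩ + β|1⟩, when measured, collapses to 0 with probability |α|² and to 1 with probability |β|².

```python
import random

def measure(alpha, beta, trials=10000):
    """Simulate measuring a qubit alpha|0> + beta|1> repeatedly.

    Returns how many of the trials collapsed to |0>, which should
    be about |alpha|^2 of them. The amplitudes must be normalized:
    |alpha|^2 + |beta|^2 = 1.
    """
    p0 = abs(alpha) ** 2
    assert abs(p0 + abs(beta) ** 2 - 1.0) < 1e-9, "state must be normalized"
    return sum(1 for _ in range(trials) if random.random() < p0)

# Equal superposition (alpha = beta = 1/sqrt(2)): roughly half the
# measurements give 0, half give 1.
zeros = measure(2 ** -0.5, 2 ** -0.5)
print(zeros / 10000)  # close to 0.5
```

A classical simulation like this needs exponentially many amplitudes to track many qubits at once, which is one intuition for why real quantum hardware is interesting.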

The fastest algorithm available for classical computers to factor a large number *n* (⌈log *n*⌉ bits), the general number field sieve, runs in roughly exp(O((log *n*)^(1/3) (log log *n*)^(2/3))) time, which grows faster than any polynomial in the number of bits. This becomes prohibitively long for very large values of *n*, which led to the general conclusion that factoring large numbers on a classical computer was impossible for any practical purpose. While it was known that some operations were theoretically much faster on quantum computers, quantum computing had been only of academic interest until Shor developed an algorithm whose quantum part runs in O((log *n*)³) steps. It then performs a polynomial number of post-processing steps on a classical computer, so overall it runs in polynomial time. This provided the impetus for companies and universities to start funding research into developing a true quantum computer. [imsa.edu]
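The classical post-processing step can be sketched as follows (a standard textbook formulation of Shor's reduction; the function name `factor_from_order` is my own). Given the order *r* of a base *a* modulo *n*, which is the quantity the quantum part computes, nontrivial factors of *n* fall out of two gcd computations:

```python
from math import gcd

def factor_from_order(n, a, r):
    """Classical post-processing step of Shor's algorithm.

    Given n, a base a coprime to n, and the (quantum-computed) order r
    of a modulo n, try to recover a nontrivial factor of n. Returns
    None when r is odd or a^(r/2) = -1 (mod n), in which case the
    algorithm retries with a different random base a.
    """
    if r % 2 != 0:
        return None
    x = pow(a, r // 2, n)          # modular exponentiation
    if x == n - 1:                 # a^(r/2) = -1 (mod n): unlucky base
        return None
    p = gcd(x - 1, n)
    q = gcd(x + 1, n)
    for f in (p, q):
        if 1 < f < n:
            return f
    return None

# Example: for n = 15 and a = 7, the order of 7 mod 15 is 4
# (7^4 = 2401 = 1 mod 15), and post-processing recovers a factor.
print(factor_from_order(15, 7, 4))  # → 3
```

Everything here runs in polynomial time; the only step a classical machine cannot do efficiently is producing the order *r* in the first place.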

In 2001, IBM scientists were the first to successfully demonstrate Shor's algorithm on a quantum computer. They used "a billion-billion custom-designed molecules" in liquid to make a seven-qubit quantum computer composed of five fluorine nuclei and two carbon-13 nuclei, each with a distinct nuclear spin. Each spin served as a qubit, and by controlling the spins with an NMR machine, they used this computer to factor the number 15 into its prime factors, 3 and 5. While this experiment was a large step forward in quantum computing, the method is not easily scalable, and later efforts have pursued, and will likely continue to pursue, other directions. [IBM], [AIP]

In December 2008, two methods of storing photons were reported. Researchers from the Georgia Institute of Technology managed to store quantum information from a photon for 32 milliseconds using super-cooled rubidium atoms in an optical trap. Like previous efforts in this direction, this method requires gases chilled to near absolute zero and methods of trapping light unavailable outside of research labs. A few weeks later, researchers at the University of Geneva reported their development of a solid-state device that stores photons for up to 1 microsecond. While its storage time is much shorter than that of the Georgia Tech method, it is much more practical, as it relies on a solid-state device and does not require cooling as close to absolute zero. One suggested use for these types of storage methods is to extend the range of quantum-cryptography networks, which currently cannot reach beyond a few dozen kilometers due to the delicacy of quantum information and the inability to practically store it for a usable length of time. While the current state of this technology is still short of being truly useful, the Geneva team is working on improving it to store data from more than a small percentage of photons for more than one millisecond. [Spectrum IEEE]

By replacing mirrors and beam splitters with waveguides that weave around the chip and interact to split, reflect, and transmit light through the circuit, researchers at Bristol were able to build a quantum photonic circuit on a silicon chip only a few millimetres square [2]. They then injected four photons into the circuit, passing them through a sequence of quantum logic gates to successfully factor the number 15, though not directly. The chip only solved the "computationally hard" part of Shor's algorithm, the "order-finding routine," leaving a standard computer to actually compute the factors. While this small-scale computer is nowhere near a full-scale quantum factoring machine, it could be useful in many ways, such as securing communication or simulating quantum systems in physics experiments. [Spectrum IEEE], [Schneier]
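The "order-finding routine" computes the smallest *r* > 0 with *a*^*r* ≡ 1 (mod *n*). A brute-force classical version (my own illustrative sketch, not the Bristol team's method) shows what is being asked, and why quantum hardware is needed: this naive loop can take a number of steps exponential in the bit length of *n*.

```python
def order(a, n):
    """Brute-force the multiplicative order of a modulo n: the
    smallest r > 0 with a^r = 1 (mod n). Requires gcd(a, n) = 1,
    otherwise no such r exists and the loop would never terminate.
    This is exactly the step Shor's algorithm speeds up.
    """
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

# For the n = 15 demonstration: the order of 7 mod 15 is 4.
print(order(7, 15))  # → 4
```

For a 15-qubit-scale demo this loop is instant, but for the thousand-bit numbers used in cryptography, the order can be astronomically large, which is why only the quantum routine makes factoring them conceivable.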