Quantum computing harnesses quantum mechanical phenomena to store and process information in fundamentally new ways, enabling certain algorithms to run far more efficiently than is possible on classical computers. It has been an area of ongoing research for more than 30 years. Although physicists and mathematicians theorized three decades ago how a quantum computer could work, scientists and engineers struggled to build one. In the last five years, hardware and software capabilities have moved out of university labs and into tangible business products; however, the technology still needs to mature before it is fully enterprise-ready and can deliver meaningful, cost-effective business results.
One of the best ways to begin to understand quantum computing is to compare it to classical computing. Quantum computing is not a new model of computation in the sense of computability (it can compute exactly the same functions as a Turing machine or the lambda calculus), but the hardware of the two approaches operates in very different ways. (See the “Quantum 101” sidebar to learn more.) In a classical computer, the basic unit of information is a bit, which can hold only one of two values, 0 or 1. In a quantum computer, the basic unit of information is the quantum bit, or “qubit.” Through quantum mechanical phenomena such as superposition, a qubit can exist in a combination of 0 and 1 at once, which in effect lets a quantum computer explore many computational paths at the same time and, in theory, solve a difficult subset of problems much faster than a classical computer.
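To make the contrast concrete, the short sketch below (not from the article, and assuming Python with NumPy is available) simulates a single qubit as a two-element complex vector: a Hadamard gate places it in an equal superposition of 0 and 1, and a measurement then yields either value with 50 percent probability, whereas a classical bit would simply hold one value or the other.

    # Minimal illustration of a single qubit, simulated classically.
    # A qubit's state is a 2-element complex vector of amplitudes.
    import numpy as np

    # Computational basis states |0> and |1>
    ket0 = np.array([1, 0], dtype=complex)
    ket1 = np.array([0, 1], dtype=complex)

    # Hadamard gate: maps |0> to an equal superposition (|0> + |1>) / sqrt(2)
    H = np.array([[1,  1],
                  [1, -1]], dtype=complex) / np.sqrt(2)

    qubit = H @ ket0  # the qubit is now in superposition

    # Measurement probabilities are the squared magnitudes of the amplitudes
    probs = np.abs(qubit) ** 2
    print(probs)  # [0.5 0.5] -- equal chance of observing 0 or 1

The example only mimics quantum behavior on classical hardware; simulating many qubits this way quickly becomes intractable, which is part of why real quantum hardware is interesting in the first place.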