Quantum Computing History
- Overview
In 1981, at Argonne National Laboratory, Paul Benioff drew on Max Planck's idea that energy comes in discrete units, or quanta, to theorize a computer that operates by quantum mechanical principles, helping give rise to quantum computing. Since then, advances in quantum theory and technology have made the idea of building quantum computers for everyday use increasingly concrete.
A quantum machine could one day drive major advances in areas like artificial intelligence and make even the most powerful supercomputers look like toys. Quantum technology is an emerging field of physics and engineering that relies on the principles of quantum physics. Quantum computing, quantum sensors, quantum cryptography, quantum simulation, quantum metrology, and quantum imaging are all examples of quantum technologies, in which properties of quantum mechanics, especially quantum entanglement, quantum superposition, and quantum tunnelling, are important.
Quantum technology differs from deterministic classical mechanics, in which the state of a system is fully determined by the values of two variables: position and momentum. As John von Neumann's mathematical formulation made explicit, quantum mechanics instead assigns probabilities to measurement outcomes, and this difference is often cited to explain the technology's potential superiority.
Quantum computers are making all the headlines these days, but quantum communication technology may actually be closer to practical implementation. The building blocks for these emerging technologies are more or less the same. Both use qubits to encode information: the quantum equivalent of classical bits, which can be both 1 and 0 simultaneously thanks to the phenomenon of superposition.
And both rely on entanglement to inextricably link the quantum states of these qubits so that acting on one affects the other. But while building quantum computers capable of outperforming conventional ones on useful problems will require very large networks of qubits, a practical communication network needs only a handful.
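To make those two building blocks concrete, here is a minimal sketch using plain NumPy state vectors rather than any quantum SDK; the state names and the sampling step are illustrative only. It shows a single qubit in equal superposition, then a two-qubit Bell state whose joint measurement outcomes are perfectly correlated.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# A qubit is a unit vector of two complex amplitudes over |0> and |1>.
# Equal superposition: each outcome has probability |amplitude|^2 = 0.5.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
print("P(0), P(1):", np.abs(plus) ** 2)

# Entanglement: the Bell state (|00> + |11>)/sqrt(2) cannot be written
# as a product of two independent single-qubit states.
bell = np.zeros(4, dtype=complex)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)

# Sampling joint measurements yields only 00 or 11, never 01 or 10,
# so learning one qubit's result immediately fixes the other's.
samples = rng.choice(4, size=10, p=np.abs(bell) ** 2)
print([format(s, "02b") for s in samples])
```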
- A Brief History of the Future
In 1936, Alan Turing proposed the Turing machine, which became the basic reference point for computing and computer theory. Around the same time, Konrad Zuse built the Z1, considered the first programmable binary computer. What happened next is history: in our world today, computers are everywhere.
The main catalyst behind this transition was the use of silicon to produce high-quality transistors. This unfolded over more than a century: from Michael Faraday's first recorded observation of the semiconductor effect in 1833, to Morris Tanenbaum's first silicon transistor at AT&T Bell Labs in 1954, to the first integrated circuits around 1960.
We are about to embark on a similar journey in our quest to build the next generation of computers. In the early 1980s, Richard Feynman, Paul Benioff, and Yuri Manin laid the foundations for an entirely new paradigm, quantum computing, which they believed had the potential to solve problems that "classical computing" couldn't. Quantum computing could change the world: it could transform medicine, crack encryption, and revolutionize communications and artificial intelligence.
Quantum physics, which emerged in the early 20th century, is so powerful yet so different from anything previously known that even its inventors struggled to understand it in detail. And much as classical communications took roughly 100 years to travel from discovery to large-scale use, quantum computers are on a long road, but they are now maturing rapidly. Today, many players are racing to be the first to build a genuinely powerful quantum computer.
- A New Kind of Computing
We experience the benefits of classical computing every day. However, there are challenges that today’s systems will never be able to solve. For problems above a certain size and complexity, we don’t have enough computational power on Earth to tackle them.
To stand a chance at solving some of these problems, we need a new kind of computing. Universal quantum computers leverage the quantum mechanical phenomena of superposition and entanglement to create computational states that scale exponentially with the number of qubits, or quantum bits.
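That exponential scaling is easy to quantify. The following back-of-the-envelope sketch (plain Python with NumPy; the choice of double-precision complex amplitudes is an assumption) shows how quickly the classical memory needed just to store an n-qubit state vector blows up:

```python
import numpy as np

# An n-qubit register is described by 2**n complex amplitudes, so the
# classical memory needed to store the full state doubles with each qubit.
AMPLITUDE_BYTES = np.dtype(np.complex128).itemsize  # 16 bytes per amplitude

for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * AMPLITUDE_BYTES / 2 ** 30
    print(f"{n:>2} qubits -> {amplitudes:>19,} amplitudes (~{gib:,.2f} GiB)")
```

By around 50 qubits the state vector would already require petabytes of memory, which is why exhaustively simulating such machines on classical hardware stops being feasible.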
Quantum computing began with the search for its essential elements. In 1981, Paul Benioff at Argonne National Laboratory proposed the idea of a computer that operated on quantum mechanical principles. It is generally accepted that David Deutsch of Oxford University provided the critical idea behind quantum computing research. In 1984, he began to wonder about the possibility of designing a computer based exclusively on quantum rules, and he published his breakthrough paper a few months later.