Quantum Computing History

[Salem, Massachusetts - Harvard Taiwan Student Association]
   

- Overview

In 1981, at Argonne National Laboratory, Paul Benioff drew on Max Planck's idea that energy and matter exist in discrete units, or quanta, to theorize the first quantum mechanical model of a computer. Since then, with new technological advances in quantum theory, the idea of building quantum computers for everyday use has become more concrete.

A quantum machine could one day drive big advances in areas like artificial intelligence and make even the most powerful supercomputers look like toys. Quantum technology is an emerging field of physics and engineering that applies the principles of quantum mechanics. Quantum computing, quantum sensors, quantum cryptography, quantum simulation, quantum metrology and quantum imaging are all examples of quantum technologies, in which properties of quantum mechanics - especially entanglement, superposition and tunnelling - play an essential role.

As John von Neumann observed in formalizing the theory, quantum mechanics differs from deterministic classical mechanics, in which the state of a system is fixed by the values of two variables, its position and momentum. Quantum states are instead described by probabilities, and this probabilistic character is often cited to explain the technology's potential superiority.

Quantum computers are making all the headlines these days, but quantum communication technology may actually be closer to practical implementation. The building blocks for these emerging technologies are more or less the same. Both use qubits to encode information - the quantum equivalent of computer bits, which can be both 1 and 0 simultaneously thanks to the phenomenon of superposition.

And they both rely on entanglement to inextricably link the quantum states of these qubits so that acting on one affects the other. But while building quantum computers capable of outperforming conventional ones on useful problems will require very large networks of qubits, you only need a handful to build useful communication networks. 

 

- A Brief History of the Future

In 1936, Alan Turing proposed the Turing machine, which became the basic reference point for computing and computer theory. Around the same time, Konrad Zuse built the Z1, considered the first programmable binary computer. What happened next is history: in our world today, computers are everywhere.

The main catalyst behind this transition was the discovery that silicon could be used to produce high-quality transistors. This happened over a period of more than a century, from Michael Faraday's first recording of the semiconductor effect in 1833, to Morris Tanenbaum's fabrication of the first silicon transistor at AT&T Bell Labs in 1954, to the first integrated circuits around 1960.

We are about to embark on a similar journey in our quest to build the next generation of computers. In the early 1980s, Richard Feynman, Paul Benioff, and Yuri Manin laid the foundations for an entirely new paradigm - quantum computing - which they believed had the potential to solve problems that "classical computing" couldn't. Quantum computing could change the world: it could transform medicine, crack encryption and revolutionize communications and artificial intelligence.

Quantum physics, which emerged in the early 20th century, is so powerful yet so different from anything previously known that even its inventors struggled to understand it in detail. Much as classical communications took over 100 years to go from discovery to large-scale use, quantum computers are now maturing rapidly. Today, many players are racing to build the first powerful quantum computer.

 

- A New Kind of Computing 

We experience the benefits of classical computing every day. However, there are challenges that today’s systems will never be able to solve. For problems above a certain size and complexity, we don’t have enough computational power on Earth to tackle them. 

To stand a chance at solving some of these problems, we need a new kind of computing. Universal quantum computers leverage the quantum mechanical phenomena of superposition and entanglement to create states that scale exponentially with the number of qubits, or quantum bits.
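To make that scaling concrete, here is a short Python sketch (an illustration added here, not from the original article; it assumes each amplitude is stored as a 16-byte double-precision complex number) showing how the state space, and the classical memory needed to represent it, grows with the number of qubits:

    # A system of n qubits is described by 2**n complex amplitudes.
    # Assumption: each amplitude is stored as complex128 (16 bytes).
    BYTES_PER_AMPLITUDE = 16

    for n_qubits in (10, 20, 30, 40, 50):
        amplitudes = 2 ** n_qubits
        gigabytes = amplitudes * BYTES_PER_AMPLITUDE / 1e9
        print(f"{n_qubits:2d} qubits: {amplitudes:>20,} amplitudes "
              f"(~{gigabytes:,.2f} GB to store classically)")

Around 50 qubits, storing the full state vector already exceeds the memory of any classical machine - this is the sense in which quantum states scale exponentially.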

Quantum computing began with the search for its essential elements. In 1981, Paul Benioff at Argonne National Laboratory proposed the idea of a computer that operated on quantum mechanical principles. It is generally accepted that David Deutsch of Oxford University provided the critical idea behind quantum computing research: in 1984, he began to wonder about the possibility of designing a computer based exclusively on quantum rules, and he published a breakthrough paper a few months later.

 

- How Do Quantum Computers Work?

Quantum computers perform calculations based on the probability of an object's state before it is measured - instead of just definite 1s or 0s - which means they have the potential to process exponentially more data than classical computers. Classical computers carry out logical operations using the definite state of a physical system. These states are usually binary, meaning every operation is based on one of two positions. A single state - such as on or off, up or down, 1 or 0 - is called a bit.

In quantum computing, operations instead use the quantum state of an object to produce what's known as a qubit. These states are the undefined properties of an object before they've been detected, such as the spin of an electron or the polarisation of a photon. Rather than having a clear position, unmeasured quantum states occur in a mixed 'superposition', not unlike a coin spinning through the air before it lands in your hand. These superpositions can be entangled with those of other objects, meaning their final outcomes will be mathematically related even if we don't know yet what they are.  
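A minimal simulation makes the 'spinning coin' picture concrete. The sketch below (an added illustration that assumes only NumPy) prepares a single qubit in an equal superposition using a Hadamard gate, then samples a measurement outcome:

    import numpy as np

    # A qubit's state is a 2-component complex vector; gates are 2x2 unitaries.
    ket0 = np.array([1, 0], dtype=complex)        # the definite state |0>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

    state = H @ ket0            # equal superposition of |0> and |1>
    probs = np.abs(state) ** 2  # Born rule: outcome probabilities
    print(probs)                # [0.5 0.5] - the coin is still 'spinning'

    # Measurement collapses the superposition to a definite 0 or 1.
    print(np.random.choice([0, 1], p=probs))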

The complex mathematics behind these unsettled states of entangled 'spinning coins' can be plugged into special algorithms to make short work of problems that would take a classical computer a long time to work out... if they could ever calculate them at all. Such algorithms would be useful in solving complex mathematical problems, producing hard-to-break security codes, or predicting multiple particle interactions in chemical reactions.
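Extending the same kind of simulation to two qubits shows what entanglement looks like numerically. This sketch (again an added illustration in plain NumPy, with basis states ordered |00>, |01>, |10>, |11>) builds a Bell state by applying a Hadamard gate and then a CNOT:

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],   # flips the second qubit
                     [0, 1, 0, 0],   # whenever the first is 1
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.kron([1, 0], [1, 0])         # start in |00>
    state = CNOT @ (np.kron(H, I) @ state)  # Bell state (|00> + |11>)/sqrt(2)

    probs = np.abs(state) ** 2
    print(dict(zip(["00", "01", "10", "11"], probs.round(3))))
    # {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}

The missing '01' and '10' outcomes are the mathematical relatedness described above: the two 'coins' always land the same way up, even though neither outcome is decided in advance.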

 

[Mittenwald, Bavaria, Germany]

- Potential Strengths

Quantum computers certainly have potential. In theory, they can solve problems that classical computers cannot handle at all, at least in any realistic time frame. Take factorization. Finding prime factors for a given integer can be very time consuming, and the bigger the integer gets, the longer it takes. Indeed, the sheer effort required is part of what keeps encrypted data secure, since decoding the encrypted information requires one to know a “key” based on the prime factors of a very large integer. 

In 2009, a dozen researchers and several hundred classical computers took two years to factorize a 768-bit (232-digit) number used as a key for data encryption. The next number on the list of keys consists of 1024 bits (309 digits), and it still has not been factorized, despite a decade of improvements in computing power. A quantum computer, in contrast, could factorize that number in a fraction of a second – at least in principle.
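To see why classical factoring scales so badly, consider trial division, the most naive method: the work grows with the square root of the number, which is exponential in its digit count. The Python sketch below (an added illustration; the two semiprimes are arbitrary examples, and real record factorizations use the far more sophisticated general number field sieve) times the blow-up:

    import time

    def trial_division(n):
        """Naive factorization: test every candidate divisor up to sqrt(n)."""
        factors, d = [], 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    # Each extra digit in the factors multiplies the search space by ~10.
    for n in (10_007 * 10_009, 1_000_003 * 1_000_033):
        start = time.perf_counter()
        print(trial_division(n), f"{time.perf_counter() - start:.4f}s")

Shor's quantum algorithm, by contrast, runs in time polynomial in the number of digits, which is exactly what makes it a threat to factoring-based encryption.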

Other scientific problems also defy classical approaches. A chemist, for example, might know the reactants and products of a certain chemical reaction, but not the states in between, when molecules are joining or splitting up and their electrons are in the process of entangling with each other. 

Identifying these transition states might reveal useful information about how much energy is needed to trigger the reaction, or how much a catalyst might be able to lower that threshold – something that is particularly important for reactions with industrial applications. The trouble is that there can be a lot of electronic combinations. 

To fully model a reaction involving 10 electrons, each of which has (according to quantum mechanics) two possible spin states, a computer would need to keep track of 2^10 = 1,024 possible states. A mere 50 electrons would generate more than a quadrillion possible states. Get up to 300 electrons, and you have more possible states than there are atoms in the visible universe.
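The arithmetic above is easy to verify directly, since Python handles arbitrarily large integers (the 10^80 figure for atoms in the visible universe is the usual order-of-magnitude estimate):

    print(2 ** 10)              # 1024
    print(2 ** 50)              # 1125899906842624 - over a quadrillion
    print(2 ** 300 > 10 ** 80)  # True: roughly 2 x 10^90 states vs ~10^80 atoms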

Classical computers struggle with tasks like these because the bits of information they process can only take definite values of zero or one (1 or 0), and therefore can only represent individual states. In the worst case, therefore, states have to be worked through one by one. By contrast, quantum bits, or qubits, do not take a definite value until they are measured; before then, they exist in a strange state between zero and one, and their values are influenced by whatever their neighbours are doing. In this way, even a small number of qubits can collectively represent a huge “superposition” of possible states for a system of particles, making even the most onerous calculations possible.

 

- How Quantum Computing Could Transform the Future

In October 2022, the Royal Swedish Academy of Sciences awarded the Nobel Prize in Physics to three scientists, Alain Aspect, John Clauser and Anton Zeilinger, in recognition of their research in quantum information science. Recently, a consortium of researchers from Caltech, Google, Fermilab, MIT, and Harvard used Google's Sycamore quantum processor to simulate the dynamics of an Einstein-Rosen bridge - more commonly known as a wormhole - to the delight of Star Trek fans everywhere.

Several companies are currently working on developing quantum computers or aspects of quantum computing, including Amazon (AMZN), AMD (AMD), Baidu (BIDU), IBM, Google, Honeywell (HON), Intel (INTC), Microsoft (MSFT), Quantum Computing (QUBT) and Toshiba (TOSBF), as well as private companies such as D-Wave Systems, Atom Computing, QC Ware and PASQAL. 

In 2021, IonQ (IONQ) became the first quantum technology startup and the first pure-play quantum computing company to go public. In March 2022, Rigetti Computing (RGTI) went public on Nasdaq following its merger with the special purpose acquisition company (SPAC) Supernova.

These companies are using a variety of approaches to build quantum computers, including superconducting qubits, trapped ion qubits, and photonic qubits. While quantum computers are still in the early stages of development, many believe they have the potential to revolutionize fields such as medicine, finance and materials science by providing faster and more powerful ways to solve complex problems. 

For example, as the world moves further towards renewable energy sources, the need for energy storage in the form of batteries is increasing. To improve batteries, we need to be able to test them, and simulation - which quantum computers promise to run much faster - could replace much of that physical testing, accelerating innovation. Quantum computers' processing power could also supercharge machine learning, bringing the kind of artificial intelligence (AI) we see in sci-fi movies closer to reality.

One problem quantum computing poses involves cybersecurity, which today is largely based on math-based cryptography. Today's systems work because the mathematical problems that provide protection are so complex that conventional computers cannot solve them in useful time. The "quantum threat" is that quantum computers will render these security systems obsolete. Some believe this could become a reality sometime between 2025 and 2030. Arqit Quantum (ARQQ) hopes to solve this problem by offering a quantum encryption platform as a service.

 

 
