Quantum Machine Learning and Quantum Algorithms
- Overview
In quantum computing, a quantum algorithm is an algorithm that runs on a realistic model of quantum computation, the most commonly used model being the quantum circuit model of computation. In this model, a quantum algorithm is described by a quantum circuit that acts on some input qubits and terminates with a measurement; the circuit is built from simple quantum gates, each of which acts on a finite number of qubits.
Quantum algorithms are step-by-step procedures that run on a quantum computer to solve problems. They exploit quantum features such as superposition and entanglement, which cannot be efficiently simulated on classical computers; this is why quantum algorithms can sometimes solve problems faster than any known classical algorithm.
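To make superposition concrete, here is a minimal sketch (an illustration only, not real quantum hardware): a qubit is a pair of amplitudes, the Hadamard gate puts |0⟩ into an equal superposition, and the Born rule turns amplitudes into measurement probabilities.

```python
import math

# A qubit state is a pair of amplitudes (a0, a1) with |a0|^2 + |a1|^2 = 1.
# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
def hadamard(state):
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

state = hadamard((1.0, 0.0))          # H|0> = (|0> + |1>)/sqrt(2)
probs = [abs(a) ** 2 for a in state]  # Born rule: measurement probabilities
print(probs)  # each outcome occurs with probability 0.5
```

Measuring this state yields 0 or 1 with equal probability; larger registers of n qubits generalize this to 2^n amplitudes, which is what makes classical simulation expensive.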
Here are some examples of quantum algorithms:
- Shor's algorithm: A famous algorithm that can factor integers in polynomial time.
- Variational Quantum Eigensolver (VQE): A variational algorithm that uses a classical computer to optimize parameters of a quantum computation.
- Deutsch-Jozsa algorithm: An early example of a quantum algorithm that solves its promise problem exponentially faster than any deterministic classical algorithm.
- Quadratic unconstrained binary optimization (QUBO): Not an algorithm itself, but a problem formulation used to encode optimization problems such as the traveling salesman problem for quantum annealers and variational quantum algorithms.
- Bernstein-Vazirani algorithm: An algorithm that uses quantum parallelism to recover a hidden bit string with a single oracle query, where a classical algorithm needs one query per bit.
- Simon's algorithm: An algorithm that solves its promise problem exponentially faster than any classical algorithm; it inspired Shor's algorithm and has applications in cryptanalysis.
- Grover's algorithm: A quantum search algorithm that finds the location of a marked item in an unsorted database in O(√N) time.
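The Grover search described in the last item can be sketched with a toy statevector simulation (an illustration under assumed parameters, not an implementation for real hardware; the database size N = 16 and marked index 11 are chosen for the example). Each iteration applies the oracle (a sign flip on the marked item) followed by the diffusion operator (inversion about the mean), and roughly (π/4)·√N iterations suffice.

```python
import math

# Toy Grover search over N = 16 items with one marked index.
N, marked = 16, 11
amps = [1 / math.sqrt(N)] * N                  # start in a uniform superposition
iterations = round(math.pi / 4 * math.sqrt(N))  # ~O(sqrt(N)) iterations

for _ in range(iterations):
    amps[marked] = -amps[marked]               # oracle: flip the marked amplitude
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]        # diffusion: invert about the mean

probs = [a * a for a in amps]
print(max(range(N), key=lambda i: probs[i]))   # -> 11, found with high probability
```

After 3 iterations the marked item carries over 95% of the probability mass, illustrating the quadratic speedup over checking the N items one by one.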
Please refer to the following for more information:
- Wikipedia: Quantum Algorithm
- Quantum Machine Learning
Quantum machine learning is the integration of quantum algorithms in machine learning programs. The most common usage of the term refers to machine learning algorithms used to analyze classical data executed on quantum computers, known as quantum-enhanced machine learning.
While classical machine learning algorithms are used to process large amounts of data, quantum machine learning uses qubits and quantum operations, or specialized quantum systems, to speed up the computation and data handling that these algorithms perform. This includes hybrid approaches that combine classical and quantum processing, where computationally difficult subroutines are outsourced to a quantum device. Such routines may be more complex in nature and execute faster on a quantum computer. Furthermore, quantum algorithms can be used to analyze quantum states rather than classical data.
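A hybrid quantum-classical loop of the kind described above can be sketched in the spirit of the VQE: a "quantum" subroutine evaluates an energy for a parameterized state, and a classical optimizer updates the parameter. This is a heavily simplified, assumed illustration: the one-parameter ansatz Ry(θ)|0⟩ and the observable Z are chosen so that the quantum evaluation can be simulated exactly (⟨Z⟩ = cos θ).

```python
import math

# "Quantum" subroutine, simulated exactly for this toy ansatz:
# |psi(theta)> = (cos(theta/2), sin(theta/2)), so <psi|Z|psi> = cos(theta).
def energy(theta):
    return math.cos(theta)

theta, lr = 0.5, 0.2
for _ in range(200):
    # Parameter-shift rule: the gradient of an expectation value can be
    # obtained from two shifted evaluations of the same quantum circuit.
    grad = (energy(theta + math.pi / 2) - energy(theta - math.pi / 2)) / 2
    theta -= lr * grad  # classical optimizer step (gradient descent)

print(round(energy(theta), 3))  # converges to the ground-state energy -1
```

The division of labor is the point: only the energy evaluations would run on the quantum device, while the optimization loop stays classical.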
In addition to quantum computing, the term "quantum machine learning" is also associated with classical machine learning methods applied to data produced by quantum experiments (i.e., machine learning of quantum systems), such as learning the phase transitions of a quantum system or designing new quantum experiments. Quantum machine learning also extends to a branch of research that explores methodological and structural similarities between certain physical systems and learning systems, in particular neural networks.
For example, some mathematical and numerical techniques from quantum physics apply to classical deep learning, and vice versa. In addition, researchers study a more abstract notion of learning theory with respect to quantum information, sometimes referred to as "quantum learning theory."
- Quantum Algorithms
In quantum computing, a quantum algorithm is an algorithm that runs on a realistic model of quantum computation, the most commonly used model being the quantum circuit model of computation. A classical (or non-quantum) algorithm is a finite sequence of instructions, a step-by-step procedure for solving a problem, where each step can be executed on a classical computer.
Likewise, a quantum algorithm is a step-by-step procedure in which each step can be executed on a quantum computer. Although all classical algorithms can also be run on a quantum computer, the term quantum algorithm is usually reserved for algorithms that are inherently quantum, that is, they use fundamental features of quantum computing such as quantum superposition or quantum entanglement.
Problems that are undecidable on a classical computer remain undecidable on a quantum computer. What makes quantum algorithms interesting is that they may solve some problems faster than classical algorithms, because they exploit quantum superposition and entanglement, which may not be efficiently simulable on a classical computer (see quantum supremacy).
The best-known quantum algorithms are Shor's algorithm for factoring and Grover's algorithm for searching unstructured databases or unordered lists. Shor's algorithm runs superpolynomially faster than the best known classical factoring algorithm, the general number field sieve. Grover's algorithm runs quadratically faster than the best possible classical algorithm for the same task, linear search.
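The size of Grover's quadratic speedup is easy to quantify (a back-of-the-envelope illustration with an assumed database size): classical linear search needs about N/2 queries on average, while Grover's algorithm needs about (π/4)·√N oracle queries.

```python
import math

# Rough oracle-query comparison for searching N items.
N = 1_000_000
classical = N // 2                             # average-case linear search
grover = math.ceil(math.pi / 4 * math.sqrt(N)) # Grover iteration count
print(classical, grover)  # 500000 vs 786
```

For a million items this is roughly 500,000 classical queries versus under 800 quantum ones, which is why "quadratically faster" matters even though it is not an exponential speedup.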
- Shor's Factoring Algorithm
Shor's algorithm is a quantum algorithm for factoring a number N in O((log N)^3) time and O(log N) space, named after Peter Shor.
The algorithm is significant because it implies that public key cryptography might be easily broken, given a sufficiently large quantum computer. RSA, for example, uses a public key N which is the product of two large prime numbers. One way to crack RSA encryption is by factoring N, but with classical algorithms, factoring becomes increasingly time-consuming as N grows large; more specifically, no classical algorithm is known that can factor in time O((log N)^k) for any k. By contrast, Shor's algorithm can crack RSA in polynomial time. It has also been extended to attack many other public key cryptosystems.
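The reduction at the heart of Shor's algorithm can be sketched classically: factoring N reduces to finding the order r of a random base a modulo N, after which gcd computations recover the factors. In the real algorithm the order is found by the quantum Fourier transform; in this sketch it is found by brute force, which is exactly the step that makes the classical version slow (the base a = 7 is an assumed choice for the example).

```python
from math import gcd

def find_order(a, N):
    # Smallest r > 0 with a^r = 1 (mod N). This is the step a quantum
    # computer performs efficiently; here it is brute force.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    assert gcd(a, N) == 1
    r = find_order(a, N)          # "quantum" step, emulated classically
    if r % 2 == 1:
        return None               # odd order: retry with another base a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root of 1: retry
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical(15, 7))  # -> (3, 5): the order of 7 mod 15 is 4
```

This mirrors the 2001 IBM demonstration mentioned below in miniature: for N = 15 the order-finding step is trivial, but for cryptographic-size N it is intractable classically and efficient only on a quantum computer.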
Like many quantum algorithms, Shor's algorithm is probabilistic: it gives the correct answer with high probability, and the probability of failure can be decreased by repeating the algorithm.
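The effect of repetition is worth a one-line calculation (with an assumed per-run success probability for illustration): if a single run succeeds with probability p, the chance that k independent runs all fail is (1 - p)^k, which shrinks exponentially in k.

```python
# With p = 0.5 per run, how small is the failure probability after 30 runs?
p, k = 0.5, 30
failure = (1 - p) ** k
print(failure < 1e-9)  # -> True: 30 repetitions push failure below one in a billion
```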
Shor's algorithm was demonstrated in 2001 by a group at IBM, which factored 15 into 3 and 5, using a quantum computer with 7 qubits.
[More to come ...]