
High Performance and Quantum Computing

[Supermassive test: this simulation of the region around M87 shows the motion of plasma as it swirls around the black hole. The bright thin ring visible in blue is the edge of the shadow. (Courtesy: L Medeiros/C Chan/D Psaltis/F Özel/University of Arizona/Institute for Advanced Study) - Physics World]

 

 

- The Future of High Performance Computing

In the Age of Internet Computing, billions of people use the Internet every day. As a result, supercomputer sites and large data centers must provide high-performance computing services to huge numbers of Internet users concurrently. We have to upgrade data centers using fast servers, storage systems, and high-bandwidth networks. The purpose is to advance network-based computing and web services with the emerging new technologies.

The general computing trend is to leverage shared web resources and massive amounts of data over the Internet. The evolution has been toward parallel, distributed, and cloud computing built on clusters, MPPs (massively parallel processors), P2P (peer-to-peer) networks, grids, clouds, web services, the Internet of Things, and even quantum computing.

Data has become the driving force behind business, academic, and social progress, forcing significant advancements in computer processing. By 2025, an estimated 463 exabytes of data will be created each day globally. As institutions embrace a “data everywhere” mentality, high-performance computing (HPC) presents new opportunities to take on emerging challenges in these fields.

HPC arose as a discipline in computer science in which supercomputers are used to solve complex scientific problems. As HPC technologies grow in their computational power, other academic, government, and business institutions have adopted them to meet their own needs for fast computations. Today, HPC vastly reduces the time, hardware, and costs required to solve mathematical problems critical to core functions. Now an established field for advanced computing, HPC is driving new discoveries in astrophysics, genomics, and medicine, among other academic disciplines; it is driving business value in unlikely industries such as financial services and agriculture as well.

 

- Supercomputing Technology

Supercomputers play a significant and growing role in a variety of areas important to the nation. They are used to address challenging science and technology problems. "Supercomputer" is a general term for computing systems capable of sustaining high-performance computing applications that require a large number of processors, shared or distributed memory, and multiple disks. 

A supercomputer is a type of computer that has the architecture, resources, and components to achieve massive computing power. Today's supercomputers consist of tens of thousands of the fastest processors, able to perform billions or even trillions of calculations per second.

Supercomputer performance is measured in floating-point operations per second (FLOPS) rather than in millions of instructions per second (MIPS). Today, all of the world's 500 fastest supercomputers run Linux-based operating systems.
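As a rough illustration of what a FLOPS figure means, the sketch below (Python with NumPy; the function name and matrix size are purely illustrative) times a dense matrix multiply and converts the elapsed time into an effective GFLOP/s rate for the machine it runs on:

```python
import time
import numpy as np

def estimate_gflops(n=1024):
    """Estimate effective throughput (GFLOP/s) from one n x n matrix
    multiply, which requires roughly 2*n^3 floating-point operations."""
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    start = time.perf_counter()
    np.dot(a, b)
    elapsed = time.perf_counter() - start
    flops = 2 * n**3          # multiply-add pairs in a dense matmul
    return flops / elapsed / 1e9

print(f"~{estimate_gflops():.1f} GFLOP/s on this machine")
```

A laptop measured this way reaches a handful of GFLOP/s to a few hundred; a leading supercomputer is many orders of magnitude beyond that, which is why FLOPS, not MIPS, is the unit of comparison.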

Supercomputers are designed primarily for enterprises and organizations that require massive computing power. A supercomputer incorporates architectural and operational principles from parallel and grid processing, in which a process is executed simultaneously on thousands of processors or distributed among them.
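The idea of splitting one problem across many processors can be sketched at small scale with Python's standard library (the function names and the choice of summing squares are purely illustrative; real HPC codes use MPI across thousands of nodes rather than a few local processes):

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the squares of the integers in [lo, hi) -- one worker's share."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    """Split [0, n) into equal chunks, compute each chunk in a separate
    process, then combine the partial results."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum_of_squares(1_000_000))
```

The pattern is the same one a supercomputer uses, scaled down: decompose the problem, compute the pieces concurrently, and reduce the partial results into an answer.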

Supercomputing technology has indelibly changed how we approach complex problems in our world, from weather forecasting and climate modeling to protecting the nation's security from cyberattacks.

 

- Linux and Supercomputing

Linux dominates supercomputing. The Linux operating system runs all 500 of the world’s fastest supercomputers, which help to advance artificial intelligence, machine learning and even COVID-19 research. 

Although most modern supercomputers use the Linux operating system, each manufacturer has made its own specific changes to the Linux-derivative they use, and no industry standard exists, partly because the differences in hardware architectures require changes to optimize the operating system to each hardware design.

Why do supercomputers use Linux? Beyond the expert consensus, several features make Linux the natural choice for supercomputers:

  • Modular design
  • Generic nature of the Linux kernel
  • Scalability
  • Open-source licensing
  • Community support
  • Cost

 

[Alberta, Canada]

- Quantum Primacy (or Quantum Supremacy)

Quantum computers promise, for certain problems, to process information vastly faster than classical computers. The quantum computing market is projected to reach $64.98 billion by 2030, and companies like Microsoft, Google, and Intel are racing to build quantum computing tools.

Today, our phones are millions of times more powerful than the computers that landed Apollo 11 on the moon. 

Technologists are now exploring quantum computers that, in theory, could be vastly faster than any classical computer at certain tasks and able to solve computational problems deemed intractable today. The appeal of quantum computers is the promise of quickly answering questions so difficult that today's computers would take decades to solve them.

A quantum computer is a remarkable device. While its applications are still limited at present, we now know that it can outperform the fastest computers we currently have access to on certain tasks. Quantum primacy (also known as quantum supremacy) is the point at which a quantum machine outstrips any classical computer on a specific problem.

Computers have helped advance civilization and increased our ability to process data many times over. Even so, there are some problems that not even they can solve. The more answers we find, the more questions we have. Quantum computing was built to multiply computing power by tapping into what we know about quantum states. While a traditional computer is limited by bits, quantum computers aren't, allowing them to perform calculations many times faster. Or so it's assumed. 

There's still much debate as to whether we've gotten to the point of quantum primacy or not. Are quantum computers faster than regular computers, or aren't they?

 

- The Future of Quantum Computing

Quantum computing is essentially harnessing and exploiting the laws of quantum mechanics to process information. A traditional computer uses long strings of "bits," each of which encodes either a zero or a one. A quantum computer, on the other hand, uses quantum bits, or qubits. What's the difference? A qubit is a quantum system that encodes the zero and the one into two distinguishable quantum states. Because qubits behave quantum mechanically, we can capitalize on the phenomena of "superposition" and "entanglement."

Regular computers use bits, which store information in only one of two states: zero or one. Quantum computers exploit the fact that subatomic particles can exist in more than one state simultaneously, so a qubit can represent a zero, a one, or both at the same time. Qubits can thus handle a much vaster amount of information much faster than a normal computer. However, qubits need to be synchronised using a quantum effect known as entanglement, which Albert Einstein called "spooky action at a distance."
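Superposition and entanglement can be made concrete with ordinary linear algebra. The sketch below (Python with NumPy; a small classical simulation for illustration, not a real quantum device) builds a superposition with a Hadamard gate, then an entangled Bell pair with a CNOT gate:

```python
import numpy as np

# A qubit is a unit vector in C^2; |0> and |1> are the two basis states.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Superposition: the Hadamard gate maps |0> to an equal mix of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
plus = H @ ket0
print(np.abs(plus) ** 2)   # Born rule: measuring gives 0 or 1, each 50%

# Entanglement: CNOT on (H|0>) tensor |0> yields the Bell state
# (|00> + |11>)/sqrt(2) -- the two qubits' outcomes are perfectly correlated.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(plus, ket0)
print(np.abs(bell) ** 2)   # only |00> and |11> have nonzero probability
```

The second print shows why entanglement "synchronises" qubits: the outcomes 01 and 10 have zero probability, so measuring one qubit fixes the other.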

Quantum computers are not meant to replace typical computers. In practice, they will be separate instruments used to solve complex, data-heavy problems, particularly those that make use of machine learning, where the system can make predictions and improve over time. 

There are four types of quantum computers currently being developed, which use:
  • Light particles
  • Trapped ions
  • Superconducting qubits
  • Nitrogen vacancy centres in diamonds

Quantum computers will enable a multitude of useful applications, such as modeling many variations of a chemical reaction to discover new medications, developing new imaging technologies for healthcare to better detect problems in the body, or speeding up the design of batteries, new materials, and flexible electronics.

 

 

[More to come ...]

 

 
