Quantum Mechanics And Modern Day Computing

Quantum Mechanics and its influence on Modern Computing

Introduction to Quantum Mechanics

Quantum Mechanics, a fascinating field, may not have direct business implications, but it captivates many. This discipline delves into the interactions of subatomic particles such as Protons, Neutrons, and Electrons, as well as the more fundamental particles beneath them, like the Quarks that make up Protons and Neutrons, and the famous Higgs Boson.

Without the knowledge gained from Quantum Mechanics, our current information technology, especially the transistor, wouldn’t exist. That same knowledge has since paved the way for the development of Quantum computers, a field still in its early stages.

Transition from Newtonian Physics to Quantum Mechanics

Our initial comprehension of physics was provided by Sir Isaac Newton. Until the early 20th century, we believed that matter, irrespective of its scale, would behave according to Newton’s laws of motion and gravity. However, we later discovered a disconnect between Newtonian physics and quantum mechanics.

Newtonian physics can predict with absolute certainty how matter responds to forces. But at the subatomic level, this all breaks down. For instance, you cannot simultaneously pin down both the position and the momentum of an electron in an atom: this is the Heisenberg uncertainty principle. The act of measuring one quantity disturbs the other. At the subatomic level, everything boils down to probabilities, not absolutes.
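For reference, the standard textbook statement of the principle (not spelled out above, but widely quoted) puts a hard lower bound on the product of the uncertainties in a particle's position and momentum:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

No matter how good the measuring equipment gets, shrinking the uncertainty in position inflates the uncertainty in momentum, and vice versa.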

The Uncertainty Principle and Its Impact

A similar kind of uncertainty appears if you consider using a dev team’s “Velocity” as an incentive. Tying a measurement or reward scheme to Velocity, when the team itself sizes the tickets, would inadvertently influence how the tickets are sized and consequently the team velocity you were trying to measure.

Physicists have been striving for the last 50 years or so to develop a unifying theory, one that would reconcile Newtonian physics, gravity, and quantum mechanics. To date, several candidate theories achieve this mathematically, but none has been confirmed experimentally. Newton’s laws are easily tested through experiments, as are Einstein’s theories of relativity and the probabilistic models that describe quantum mechanics; the fact that computers work is proof enough. Richard Feynman, a Nobel Prize-winning physicist, famously remarked, “I think I can safely say that nobody understands quantum mechanics.”

Quantum Computing: The Future of Computing

The final point to discuss in this context is Quantum computing and how it differs from transistor-based computing. Today’s computers are built from transistors, billions of tiny on-off switches wired together into logic gates whose only options are 0 and 1, as the sketch below illustrates.
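As a rough illustration (a minimal sketch, not how chips are actually programmed), the behaviour of those on-off switches can be modelled as ordinary Boolean logic on bits:

```python
# Minimal sketch: classical bits are 0/1 values combined by logic gates.
# Real chips wire billions of transistors into gates like these.

def AND(a: int, b: int) -> int:
    return a & b

def OR(a: int, b: int) -> int:
    return a | b

def XOR(a: int, b: int) -> int:
    return a ^ b

# A half adder built from gates: adds two bits, producing a sum bit and a carry bit.
def half_adder(a: int, b: int) -> tuple[int, int]:
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Every value flowing through such a circuit is definitely a 0 or definitely a 1 at all times, which is exactly the assumption Qubits break.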

Quantum computers, on the other hand, use Qubits instead of binary bits. Qubits take advantage of quantum superposition: the quantum state of an electron, as represented by its “spin”, can be a blend of 0 and 1 rather than strictly one or the other. The power of quantum computing lies in the essentially infinite range of superposition states a Qubit can occupy between a pure 0 and a pure 1.
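To make the contrast concrete, here is a minimal sketch (using plain NumPy rather than a real quantum SDK) of a single Qubit as a pair of complex amplitudes: a Hadamard gate puts the 0 state into an equal superposition, and measurement collapses it back to a definite 0 or 1 with probabilities given by the squared amplitudes.

```python
import numpy as np

# A qubit state is a pair of complex amplitudes (alpha, beta) for |0> and |1>,
# with |alpha|^2 + |beta|^2 = 1. Measurement probabilities are the squared magnitudes.
ket0 = np.array([1.0, 0.0], dtype=complex)

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
probs = np.abs(state) ** 2          # -> [0.5, 0.5]

# "Measuring" collapses the superposition to a definite 0 or 1.
outcome = np.random.choice([0, 1], p=probs)
print(probs, outcome)
```

A classical bit in the earlier sketch is always one of two values; the Qubit above is described by two continuous amplitudes right up until it is measured.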

Today’s computer chip with the highest number of transistors has 2.6 trillion of them. In contrast, the most powerful quantum computer has only 66 Qubits. Some bold claims suggest that the latest Chinese quantum computing unit is 100 trillion times faster than the world’s fastest supercomputer on a specific benchmark task, despite the small number of Qubits. Unfortunately, that processing power cannot yet be applied to general-purpose workloads the way standard transistor-based computers can. We still have a way to go on this front before you need to start learning how to code for Quantum computers.