This week’s topic isn’t directly business-related, but hopefully some of you will find it interesting. It is one of my areas of passion: quantum mechanics.
For those of you who do not know what quantum mechanics is, it is the study of how matter and energy behave at the subatomic scale: protons, neutrons, electrons, and the quarks and other elementary particles that protons and neutrons are made of. Without knowledge of quantum mechanics, information technology equipment as we know it, in particular the transistor, wouldn’t exist. A further extension of this is the development of quantum computers, which is still in its infancy.
Our initial understanding of physics was first described by Sir Isaac Newton. At the time, and until the early 20th century, it was expected that matter, regardless of its scale, would behave according to Newton’s laws of motion and gravity. What became evident later is that there is a disconnect between Newtonian physics and quantum mechanics. Newtonian physics can predict with absolute certainty how matter responds to forces. At the subatomic level, this all breaks down. You cannot, for example, simultaneously pinpoint both the position and the momentum of an electron in an atom – this is the Heisenberg uncertainty principle. Relatedly, the very act of measuring a particle’s position disturbs it and so influences the outcome. At the subatomic level, everything boils down to probabilities, not absolutes. In some ways this measurement problem is inherent in the idea of using the “velocity” of a dev team as an incentive: the act of implementing a measurement scheme linked to velocity, where the team sizes its own tickets, would inadvertently influence how tickets are sized and consequently the team’s velocity.
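For the curious, the uncertainty principle has a precise form; this is the standard textbook statement, not anything specific to our discussion:

Δx · Δp ≥ ħ/2

Here Δx is the uncertainty in a particle’s position, Δp the uncertainty in its momentum, and ħ the reduced Planck constant. The more precisely you pin down one quantity, the less precisely you can know the other.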
On a far grander scale than our measurement problem described above, physicists have been hard at work for the last 50 years or so trying to develop a unifying theory that reconciles gravity, as described by Einstein’s general relativity, with quantum mechanics. To date, there are a number of candidate theories that unify them mathematically, but none has been confirmed experimentally. Newton’s laws are easily tested through experiments, as are Einstein’s theories of relativity and the probabilistic models that describe quantum mechanics – the very fact that computers actually work is proof enough. Richard Feynman, a Nobel Prize-winning physicist, famously remarked, “I think I can safely say that nobody understands quantum mechanics.”
The last thing in this context I will bring up is quantum computing and how it differs from the transistor-based computing we know. Computers today are built from transistors: billions of on-off switches combined into binary logic gates, with 0 and 1 as the only options. Quantum computers, on the other hand, don’t use binary bits but rather qubits, which take advantage of superposition: the quantum state of a qubit, often represented by an electron’s “spin”, can be any weighted combination of 0 and 1 at the same time, collapsing to a definite 0 or 1 only when measured. The power of quantum computing lies in this: n qubits can hold a superposition of 2^n states at once, a number that grows exponentially. By way of comparison, today’s computer chip with the highest transistor count has 2.6 trillion transistors, whereas the most powerful quantum computers have on the order of only 66 qubits. There are some seriously bold quantum computing claims, for instance that the latest Chinese machine is 100 trillion times faster than the world’s fastest supercomputer on one specific task, despite the small number of qubits. Unfortunately, at the moment that processing power cannot be generalized in the way standard transistor-based computers can. There is still a way to go on this front before you have to start learning how to code for quantum computers.
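To make the bit-versus-qubit contrast concrete, here is a minimal sketch in Python (using NumPy) that simulates a single qubit as a pair of complex amplitudes. This is a toy illustration of the underlying math only, not how real quantum hardware is actually programmed:

import numpy as np

# A classical bit is exactly 0 or 1. A qubit is a 2-vector of complex
# amplitudes (a, b) with |a|^2 + |b|^2 = 1, where |a|^2 and |b|^2 are
# the probabilities of measuring 0 and 1 respectively.
ket0 = np.array([1.0, 0.0], dtype=complex)  # the definite state |0>

# The Hadamard gate turns a definite state into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

qubit = H @ ket0  # now (1/sqrt(2), 1/sqrt(2)): "between" 0 and 1

probs = np.abs(qubit) ** 2
print("P(measure 0) =", probs[0])  # 0.5
print("P(measure 1) =", probs[1])  # 0.5

# Measurement collapses the superposition to a definite 0 or 1.
outcome = np.random.choice([0, 1], p=probs)
print("measured:", outcome)

# The exponential part: describing n qubits classically takes 2**n
# amplitudes. Even the ~66 qubits mentioned above would need
# 2**66 (roughly 7.4e19) complex numbers to simulate exactly.
print("amplitudes needed for 66 qubits:", 2 ** 66)

The last line hints at why simulating even a modest quantum computer on classical hardware is hopeless: the memory required doubles with every qubit added.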