The Physics Journal

Quantum Computing: The Basics

Grover's Algorithm

Dr. Manhattan
Feb 15, 2025

The revelations of modern physics play a pivotal role in one of the most compelling innovations of our era: quantum computing. Basic physics concepts are already essential to the operation of classical computers, from the newest iPhone to industrial supercomputers. Transistors control the flow of electric current to process information in the form of bits, the binary digits 1 and 0.

Quantum computers co-opt the quantum wave function to create qubits, which can be in a superposition of 1 and 0 at the same time. This gives them a form of computational parallelism that lets them outperform classical computers in specific problem domains.
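To make superposition concrete, here is a minimal sketch of a single qubit, my own NumPy illustration rather than anything from the article: a Hadamard gate turns the definite state |0⟩ into an equal superposition, and squaring the amplitudes gives the measurement probabilities.

```python
import numpy as np

# Computational basis state |0> as a column vector.
ket0 = np.array([1, 0], dtype=complex)

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0            # state is now (|0> + |1>) / sqrt(2)
probs = np.abs(psi) ** 2  # Born rule: measurement probabilities

print(np.round(psi, 3))    # [0.707+0.j 0.707+0.j]
print(np.round(probs, 3))  # [0.5 0.5] -- equal chance of reading 0 or 1
```

On real hardware the state vector can never be read out directly; a simulator like this just makes the underlying linear algebra visible.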

Much has been said in the media about quantum computers' ability to crack common encryption techniques and facilitate Big Data analysis. But the full implications of quantum computing have yet to be determined. As with any technology, its value will be proportional to the amount of creativity and vision humanity can apply to it. Can we, in our time, use quantum computing to bore our way into the future, or will it become just another vector of commodity capital?

To unravel this question, we must first study the most well-established quantum algorithms. The math behind them is not extraordinarily complicated, but because we are dealing with quantum states, some familiarity with linear algebra is required. What matters most is understanding how the principle of superposition is leveraged in all of these quantum computations.
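As a preview of where this is headed, the sketch below (again my own NumPy illustration, with the marked state |11⟩ chosen arbitrarily) runs a single Grover iteration on two qubits: prepare a uniform superposition, let an oracle flip the sign of the marked amplitude, then apply a diffusion step that inverts every amplitude about the mean. With four states and one marked item, a single iteration lands on the target with certainty.

```python
import numpy as np

N = 4       # 2 qubits -> 4 basis states
marked = 3  # hypothetical search target: the state |11>

# Uniform superposition over all basis states (Hadamard on each qubit).
s = np.full(N, 1 / np.sqrt(N))

# Oracle: flips the sign of the marked state's amplitude.
oracle = np.eye(N)
oracle[marked, marked] = -1

# Diffusion operator 2|s><s| - I: inversion about the mean amplitude.
diffusion = 2 * np.outer(s, s) - np.eye(N)

# One Grover iteration suffices when N = 4 with a single marked item.
psi = diffusion @ (oracle @ s)

print(np.round(np.abs(psi) ** 2, 3))  # [0. 0. 0. 1.] -- |11> with certainty
```

Notice that nothing here is exotic: the oracle and diffusion steps are ordinary matrices acting on a vector of amplitudes, which is exactly why a grounding in linear algebra pays off.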
