The Power of Quantum Computing

(Image Credits: Engadget, PREFETCH, Quantum Insider)

October 21, 2024

Rachel Truong 

12th Grade

Fountain Valley High School



What is Quantum Computing?


In a world where a classical computer could take billions of years to solve a seemingly impossible problem, a quantum computer has the potential to solve it in seconds. Quantum computing applies properties of quantum mechanics to dramatically accelerate certain computations. Because quantum computers do not operate by the rules of classical physics, they are not limited to binary logic. Instead of bits, they use qubits: the fundamental units of quantum information, which encode data much as the binary number system does in classical computing. Qubits are typically built by manipulating quantum systems such as photons, electrons, or superconducting circuits. While a classical bit can represent only 0 or 1 at any moment, a qubit can exist in a combination of 0 and 1 at the same time. This phenomenon is called quantum superposition, and it is the source of quantum parallelism: a register of n qubits can encode 2^n amplitudes at once. In this way, quantum computing could enable high-tech innovations that would be impossible with classical computing.
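To make superposition concrete, here is a toy state-vector simulation in Python with NumPy. It is not real quantum hardware, just the standard mathematical model: a qubit is a pair of amplitudes, the Hadamard gate creates an equal superposition, and measurement probabilities are the squared magnitudes of those amplitudes.

```python
import numpy as np

# A qubit as a 2-element state vector: amplitudes for |0> and |1>.
ket0 = np.array([1.0, 0.0])

# The Hadamard gate turns a definite state into an equal superposition.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

psi = H @ ket0  # |psi> = (|0> + |1>)/sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]: equal chance of observing 0 or 1

# Joint states are tensor products, so n qubits need 2**n amplitudes --
# the exponential state space the article refers to.
two_qubits = np.kron(psi, psi)
print(len(two_qubits))  # 4
```

Note that simulating n qubits this way costs memory exponential in n, which is precisely why Feynman argued that quantum systems are better simulated by quantum computers.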



Foundations of Quantum Computing


In 1900, Max Planck introduced the quantization of energy, laying the groundwork for the quantum theory of matter and light. In 1905, Albert Einstein explained the photoelectric effect by proposing that light transfers energy in discrete packets called quanta (later named photons), supporting the idea that light consists of individual quantum particles, a fundamental concept of quantum mechanics. In 1926, Erwin Schrödinger's wave equation described how quantum systems evolve and how they can exist in multiple states at once (superposition); in 1935 he also coined the term "entanglement" for the phenomenon in which the states of two particles become correlated, so that measuring one immediately constrains the other. In 1982, Richard Feynman first proposed the concept of quantum computing, arguing that quantum mechanics could be simulated far more efficiently on quantum computers than on classical computers bound by the laws of classical physics. Then, in 1998, Isaac Chuang, Neil Gershenfeld, and Mark Kubinec built one of the earliest prototype quantum computers, demonstrating that a quantum algorithm could be executed on a quantum-based system, a milestone in the development of quantum computing. Together, principles such as the quantization of energy, quantum superposition, and entanglement, established by these and many other scientists, form the foundation of quantum computing.
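Entanglement can be illustrated with the same toy state-vector model as before. The sketch below, a standard textbook construction rather than anything hardware-specific, prepares a Bell state by applying a Hadamard gate and then a CNOT gate to two qubits starting in |00⟩; the result is a state where the two qubits' measurement outcomes are perfectly correlated.

```python
import numpy as np

# Two qubits in |00>: a 4-amplitude state vector (order: 00, 01, 10, 11).
state = np.array([1.0, 0.0, 0.0, 0.0])

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
I = np.eye(2)

# CNOT flips the second qubit when the first qubit is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Hadamard on the first qubit, then CNOT, yields the Bell state
# (|00> + |11>)/sqrt(2).
bell = CNOT @ np.kron(H, I) @ state
probs = np.abs(bell) ** 2
print(probs.round(3))  # [0.5 0.  0.  0.5]
```

Because the outcomes 01 and 10 have zero probability, measuring one qubit tells you the other's value with certainty: that correlation, with no classical counterpart, is entanglement.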



Potential and Limitations of Quantum Computers


Significant advances have built on these foundational discoveries, most notably Peter Shor's algorithm, which can factor large numbers into primes exponentially faster than the best known classical methods. This marked a crucial point for quantum computing because factoring large numbers is intractable for classical computers, and the security of widely used encryption schemes such as RSA depends on exactly that difficulty. More broadly, quantum computers leverage superposition and entanglement to perform tasks ranging from simulating particles at the atomic scale to breaking certain encryption methods. With the capacity to model and simulate natural systems faithfully, quantum computation promises to outpace classical computing on these problems.
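Shor's insight was to reduce factoring to period finding: once you know the smallest r with a^r ≡ 1 (mod N), the factors of N usually fall out of a greatest-common-divisor computation. The toy Python sketch below demonstrates only that classical number-theoretic reduction; the period search here is brute force, and it is precisely this step that a quantum computer accelerates exponentially using the quantum Fourier transform. The numbers 15 and 7 are chosen purely for illustration.

```python
from math import gcd

def find_period(a, N):
    """Find the smallest r > 0 with a**r % N == 1, by brute force.

    This is the step a quantum computer speeds up exponentially;
    classically it takes time roughly proportional to r.
    """
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    """Recover two factors of N from the period of a (toy version).

    Works when a is coprime to N, the period r is even, and
    a**(r/2) is not -1 mod N; real implementations retry with
    a different base a otherwise.
    """
    r = find_period(a, N)
    if r % 2:
        raise ValueError("odd period; pick another base a")
    half = pow(a, r // 2, N)  # modular exponentiation
    return gcd(half - 1, N), gcd(half + 1, N)

print(find_period(7, 15))   # 4, since 7**4 = 2401 = 160*15 + 1
print(shor_factor(15, 7))   # (3, 5)
```

Factoring 15 is trivial, of course; the point is that for a 2048-bit RSA modulus the classical period search becomes astronomically slow, while the quantum version remains polynomial.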


Quantum computing could transform fields such as materials science and pharmaceuticals, where quantum simulation can model molecules and materials far faster than classical computing. Quantum machine learning further extends this by processing data from electronic, magnetic, and quantum sensors more efficiently than classical methods.


However, crucial challenges and limitations remain. Quantum computers require high fidelity and scalability before they can handle complex workloads reliably. Controlling individual qubits is difficult: it demands sizable cooling systems and an environment isolated from outside disturbances such as stray particles and magnetic fields. This fragility and susceptibility to noise undermine the reliability and stability of current quantum computers. Despite major progress and momentum in addressing these limitations, full-scale fault tolerance remains the critical milestone for the technology to become effective and accessible across industries.



What This Means for the Future


Major developers such as IBM and Google have made notable breakthroughs in quantum computing. IBM's Qiskit Functions Catalog simplifies quantum programming, easing the development of quantum algorithms. Additionally, Google's Quantum AI team has made significant progress on error correction and on scaling up quantum processors, advancing its goal of quantum supremacy. According to Sundar Pichai, CEO of Google, "our quantum machine successfully performed a test computation in just 200 seconds that would have taken the best known algorithms in the most powerful supercomputers thousands of years to accomplish," illustrating Google's strides toward a fault-tolerant quantum computer. These developments could bring new methods and improvements to cryptography, data analysis, forecasting, pattern recognition, and drug discovery. As quantum computing technology continues to evolve, overcoming the remaining challenges will be essential to realizing its full potential and its integration into commercial applications.

Reference Sources

arxiv.org. “Feynman’s ‘Simulating Physics with Computers.’” Arxiv.org, 2022, https://arxiv.org/html/2405.03366v1. Accessed 17 Oct. 2024.

Bobier, Jean-François, et al. “The Long-Term Forecast for Quantum Computing Still Looks Bright.” BCG Global, 18 July 2024, www.bcg.com/publications/2024/long-term-forecast-for-quantum-computing-still-looks-bright.

Bub, Jeffrey. “Quantum Entanglement and Information.” Stanford Encyclopedia of Philosophy, Metaphysics Research Lab, Stanford University, 2019, https://plato.stanford.edu/entries/qt-entangle/.

ecanorea. “Quantum Computing: Potential and Challenges Ahead - Plain Concepts.” Plain Concepts, 19 June 2024, https://www.plainconcepts.com/quantum-computing-potential-challenges/#Getting_into_Quantum_Computing. Accessed 17 Oct. 2024.

Energy.gov. “Creating the Heart of a Quantum Computer: Developing Qubits.” Energy.gov, 10 Feb. 2020, www.energy.gov/science/articles/creating-heart-quantum-computer-developing-qubits.

Forbes. “Quantum Computers Could Change Everything - Here’s What You Should Know in under 4 Minutes | Forbes.” YouTube, 22 Dec. 2021, www.youtube.com/watch?v=LEslWkeY1tk. Accessed 4 Jan. 2022.

Giles, Martin. “Explainer: What Is a Quantum Computer?” MIT Technology Review, 29 Jan. 2019, www.technologyreview.com/2019/01/29/66141/what-is-quantum-computing/.

Holton, William Coffeen. “Quantum Computer | Computer Science.” Encyclopedia Britannica, 10 Oct. 2024, www.britannica.com/technology/quantum-computer.

Hosgood, Timothy. “10.11 Shor’s Algorithm | Introduction to Quantum Information Science.” Qubit.guide, 2024, https://qubit.guide/10.11-shors-algorithm. Accessed 17 Oct. 2024.

Lardinois, Frederic. “IBM Makes Developing for Quantum Computers Easier with the Qiskit Functions Catalog | TechCrunch.” TechCrunch, 16 Sept. 2024, https://techcrunch.com/2024/09/16/ibm-makes-developing-for-quantum-computers-easier-with-the-qiskit-functions-catalog/. Accessed 17 Oct. 2024.

Mount, Emily. “2021 Year in Review: Google Quantum AI.” Google, 30 Dec. 2021, https://blog.google/technology/research/2021-year-review-google-quantum-ai/.

Pichai, Sundar. “What Our Quantum Computing Milestone Means.” Google, 23 Oct. 2019, https://blog.google/technology/ai/what-our-quantum-computing-milestone-means/.

Press, Gil. “27 Milestones in the History of Quantum Computing.” Forbes, 18 May 2021, www.forbes.com/sites/gilpress/2021/05/18/27-milestones-in-the-history-of-quantum-computing/.

Schneider, Josh, and Ian Smalley. “What Is a Qubit?” IBM, 1 Mar. 2024, www.ibm.com/topics/qubit.

---. “What Is Quantum Computing?” IBM, 5 Aug. 2024, www.ibm.com/topics/quantum-computing.

ScienceDirect. “Quantum Parallelism - an Overview | ScienceDirect Topics.” ScienceDirect, www.sciencedirect.com/topics/engineering/quantum-parallelism.

T. Q. I. Admin. “The History of Quantum Computing You Need to Know [2022].” The Quantum Insider, 26 May 2020, https://thequantuminsider.com/2020/05/26/history-of-quantum-computing/.

Tavares, Frank. “What Is Quantum Computing? - NASA.” NASA, 6 July 2022, www.nasa.gov/technology/computing/what-is-quantum-computing/.

TED. “Quantum Computers Aren’t What You Think — They’re Cooler | Hartmut Neven | TED.” YouTube, 19 July 2024, https://www.youtube.com/watch?v=UtDllX_MTbw. Accessed 17 Oct. 2024.