Four days ago, Time Magazine featured quantum computing on the cover of its mid-February 2023 issue (February 13-20, 2023). A Time cover story is significant: it signals which topics matter and shapes perceptions among the magazine's extensive and influential readership. The feature is part of the "Time 2030" initiative, a ten-year project tracking developments across six key areas: innovation, equality, sustainability, economy, wellbeing, and leadership. Quantum computing falls under the "innovation" pillar.
Picture from: Source
Four years ago, at a Google Talk, Shoucheng Zhang predicted that the future of AI and blockchain would be a symbiotic relationship: AI requires data, while blockchain and crypto-economics can create a marketplace for it. He believed that crypto-economics, by valuing data based on mutual entropy, could counter socio-economic biases and lead to a fairer data marketplace. Zhang also argued that quantum computing would play a role in this relationship by consuming less energy and enhancing AI (potentially as part of Industry 6.0). He emphasized the importance of a decentralized data marketplace over a centralized one: the latter concentrates data ownership and control in a single entity, whereas a decentralized, blockchain-based marketplace lets individuals own and control their data while preserving privacy through cryptography and secure multi-party computation. Zhang predicted that the trend towards decentralized technology would continue to grow, and that crypto-economics could become a more precise foundation for economic science. Despite challenges such as lower user-friendliness and adoption compared with centralized platforms, he believed that as decentralized technology matures and becomes more accessible, it will be more widely adopted and will compete with centralized platforms.
|QAB⟩, the Changing of an Era and Geopolitical Competition
The global AI market is estimated to reach USD 1.5 trillion by 2030, while the global blockchain technology market is projected to be worth USD 69 billion by 2030 at a CAGR of 68%. The quantum computing market, meanwhile, is estimated to hit USD 18.16 billion by 2030 at a CAGR of 34.3%. What matters most, however, is Professor Zhang's integrated vision of these three technologies, which we informally call the |QAB⟩ technology (read "kab-ket").
Professor Zhang was not only a research physicist but also the head of Digital Horizon Capital (formerly Danhua Capital), which used capital from Chinese state-owned companies to invest in 113 US firms, many of them specializing in disruptive sectors such as biotechnology and AI. China's heavy investment in its technology sector and its competition with the US over advanced technology have created a dilemma for US-based Chinese citizens like Zhang, caught between loyalty to China and loyalty to his adopted homeland.
Zhang, who died by suicide in 2018, was reportedly struggling with guilt and depression amid US investigations into Chinese technology theft. The US government's growing caution about China's acquisition of advanced technologies has led to restrictions on US scientists' involvement in foreign recruitment programs such as the Thousand Talents Plan. China's investment in Danhua was part of its strategy to acquire advanced technologies and improve its innovation capabilities, at times through illegal methods, and Zhang played a role in that transfer of knowledge through his participation in China's recruitment programs.
Blockchain has seen widespread use in cryptocurrency and non-fungible tokens (NFTs) in recent years, while AI seems poised for a breakthrough this year. Quantum computing, however, has yet to reach its full potential because of noisy qubits, leaving us in what is known as the noisy intermediate-scale quantum (NISQ) era. According to the World Economic Forum, public and private investment in quantum technologies had reached USD 35.5 billion by 2022. Following Time Magazine's lead, the rest of this article introduces quantum mechanics, quantum computing, and their applications in more detail.
The Beginnings: Quantum Mechanics
Max Planck's paper, "On the Theory of the Law of Energy Distribution in the Normal Spectrum," introduced the idea that energy is not continuous but is emitted and absorbed in discrete units known as "quanta." This concept explained phenomena such as black-body radiation and the photoelectric effect, which classical physics could not. Planck's work is considered a cornerstone of quantum physics. Initially skeptical of his own theory, Planck was quoted as saying, "I was ready to sacrifice any of my previous convictions about physics." He believed, however, that it was the only way to preserve the two laws of thermodynamics, and reluctantly accepted it. Planck's theory challenged classical physics and marked a major turning point in the development of quantum mechanics. Despite his initial discomfort, he continued to make significant contributions to the field and was awarded the Nobel Prize in Physics in 1918 for his work. Albert Einstein built on Planck's ideas, developing the concept of light quanta (photons) and explaining the photoelectric effect. Einstein proposed that light consists of individual photons, each carrying an amount of energy proportional to its frequency. When a photon with enough energy strikes a material, it can dislodge an electron, and the emitted electron's kinetic energy equals the photon's energy minus the electron's binding energy to the material. This idea provided experimental support for quantized energy and further established the foundations of quantum physics.
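Stated as a formula (this explicit form is a standard statement of Einstein's relation, added here for clarity rather than quoted from the article):

E_kin(max) = hν − W

where h is Planck's constant, ν is the light's frequency, hν is the photon's energy, and W is the work function, the minimum energy binding the electron to the material.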
Werner Heisenberg introduced matrix mechanics as a mathematical framework for quantum mechanics. He showed that a particle's position and momentum cannot both be known with arbitrary precision at the same time, a result known as the uncertainty principle. Matrix mechanics made it possible to calculate precisely the probabilities of the possible outcomes of quantum mechanical measurements.
Erwin Schrödinger and Max Born later expanded and generalized Heisenberg's work, demonstrating that wave functions could describe the wave-particle duality of matter. Pascual Jordan and Max Born further developed the theory, using wave functions to determine outcome probabilities.
Paul Dirac later devised a more comprehensive and mathematically sophisticated formulation of quantum mechanics, known as quantum field theory, which blended quantum mechanics with special relativity and explained particle interactions with electromagnetic fields. These advancements solidified quantum mechanics as a fundamental theory of nature.
iℏ ∂Ψ/∂t = HΨ
This equation describes the evolution of the wave function, Ψ, over time, t. The wave function Ψ(x,t) gives the probability amplitude for detecting a particle at position x at time t. The equation involves the total energy of the system, represented by the Hamiltonian operator H, the imaginary unit i (satisfying i² = −1), and the reduced Planck constant ℏ. The wave function can be obtained by solving the Schrödinger equation analytically or numerically, and it can then be used to calculate quantum properties such as the probability of finding the particle at different positions, the particle's energy levels, and the system's evolution in time.
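As a concrete illustration, the sketch below (our own minimal example, not taken from the article) solves the time-independent form Hψ = Eψ for a particle in a one-dimensional infinite square well, discretizing the Hamiltonian with finite differences and comparing the numerical energy levels with the known analytic ones. Units with ℏ = m = 1 and a well width of 1 are simplifying assumptions.

```python
# Minimal sketch: time-independent Schrödinger equation for an infinite square well,
# solved by diagonalizing a finite-difference Hamiltonian (assumes hbar = m = 1).
import numpy as np

L = 1.0          # width of the well
N = 500          # number of interior grid points
dx = L / (N + 1)

# Kinetic-energy operator -(1/2) d^2/dx^2 discretized with central differences.
main = np.full(N, 1.0 / dx**2)
off = np.full(N - 1, -0.5 / dx**2)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

energies, states = np.linalg.eigh(H)   # eigenvalues = energy levels, eigenvectors = wave functions

# Analytic levels for the infinite well: E_n = (n * pi)^2 / 2, n = 1, 2, ...
for n in range(1, 4):
    exact = (n * np.pi) ** 2 / 2.0
    print(f"n={n}: numeric E = {energies[n - 1]:.4f}, exact E = {exact:.4f}")
```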
The Standard Model
The Schrödinger equation, when applied to the structure of atoms, provides a theoretical basis for comprehending electron distribution around the nucleus and energy levels of electrons in atoms. By solving the equation for a specific atom, physicists can forecast the likelihood of an electron occupying a particular space, leading to an understanding of the atom's electron configuration. This information is vital for explaining the behavior of atoms in chemical reactions and predicting the properties of molecules and materials.
Later, examination of the interior of protons and neutrons led to the discovery of quarks, the fundamental building blocks of matter. This was a landmark in particle physics and resulted in the development of the standard model, which portrays quarks as the building blocks of protons and neutrons, the basic components of atomic nuclei.
Inside atom and fundamental particles: Understanding Quantum Technologies 2022 - Quantum physics 101, p. 84
The standard model explains interactions between quarks, leptons, and force-carrying particles such as photons and gluons, and provides a theoretical foundation for understanding the behavior of matter and energy at the microscopic level. It is based on quantum mechanics and special relativity and has been confirmed by numerous experiments, making it one of the most successful and well-established theories in physics. The standard model offers a profound understanding of the fundamental nature of matter and energy, and knowledge of quantum mechanics and the standard model has also led to applications such as the "supersolid" state of matter.
Gluons are gauge bosons that mediate the strong nuclear force, binding quarks together inside protons and neutrons; the residual strong force in turn holds protons and neutrons together in atomic nuclei. Gluons therefore play a crucial role in maintaining nuclear stability and are considered fundamental building blocks of matter. Chronons, on the other hand, are hypothetical particles proposed as a theoretical explanation for the quantization of time. The notion is that time is made up of indivisible units called chronons, much as the Planck length has been proposed as an indivisible unit of space. This idea lacks experimental evidence and is considered speculative, but it continues to attract interest and research among some physicists.
Superposition & Entanglement
Superposition is a fundamental principle of quantum mechanics, often misinterpreted as quantum objects existing in multiple states or locations at once. This is not accurate under the standard interpretations of quantum mechanics. Superposition is a mathematical consequence of wave-particle duality and of the linearity of quantum theory: any combination of solutions to the Schrödinger equation is also a solution. The principle of superposition accounts for the interference patterns seen in experiments such as the double-slit experiment. A quantum object has a single, well-defined state, described by a probability distribution over the values of its observables, which can lead to different results when those properties are measured. The Copenhagen interpretation holds that one should not try to give a physical account of superposition prior to measurement. Some picture superposition classically as a rapid switching between states; experts consider this interpretation incorrect, though it can be a convenient way to visualize superposition in the physical world.
Superposition State: Understanding Quantum Technologies 2022 - Quantum physics 101, p. 102
Entanglement, by contrast, is a state of two quantum objects in which their states are correlated regardless of the distance between them. The objects behave like a single entity: operating on one affects the other, and measuring one yields results correlated with measurements of the other. This correlation can be verified with repeated tests, such as a Bell test, on a system prepared the same way each time. Mathematically, the composite system lives in the tensor product of the two subsystems' state spaces, 𝐻ab = 𝐻a ⨂ 𝐻b, and an entangled state is one that cannot be written as a simple product of a state of one subsystem with a state of the other.
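To make this concrete, here is a minimal sketch (our own illustration, not from the article) that builds the entangled Bell state (|00⟩ + |11⟩)/√2 of two two-level systems with plain NumPy and samples measurements to show that the two qubits always agree:

```python
# Minimal sketch: create a Bell state and check that measurement outcomes are correlated.
import numpy as np

ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # control = qubit 0, target = qubit 1

# Start in |00>, apply H to qubit 0, then CNOT: (|00> + |11>) / sqrt(2).
state = CNOT @ np.kron(H @ ket0, ket0)
print("amplitudes:", state)                    # [0.707, 0, 0, 0.707]

# Sample measurements in the computational basis: outcomes are always 00 or 11.
probs = np.abs(state) ** 2
outcomes = np.random.default_rng(0).choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)
```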
Quantum Computer and Qubits
At present, building a physical quantum Turing machine is not possible given the current limitations of quantum technology and our understanding of quantum computing. However, a full quantum computer system, which is essentially a quantum gate array, is achievable by combining unconventional components, such as the qubit circuits and the cryogenic "chandelier" that houses them, with a conventional computing component, a classical (Turing-machine-style) computer that operates the system. Such a machine is expected to beat classical computers especially on problems in the bounded-error quantum polynomial time (BQP) complexity class. The idea of a quantum computer was first introduced by the physicist Richard Feynman in a seminal 1982 paper. He proposed using quantum mechanics to build a computer that could solve certain problems much faster than classical computers, thanks to its ability to simulate the behavior of quantum systems. This sparked further development by researchers, leading to the first experimental demonstrations and the creation of practical quantum algorithms. Today, quantum computing is a rapidly growing field with a large and active research community, and quantum computers are becoming increasingly accessible for commercial and research use.
Magnitude of Qubits vs Classical Computer's Bits and Processing Time in Classical Computer: Source
Quantum computers are based on a new type of bit, the qubit, which differs from classical bits by allowing superposition of 0 and 1 at the same time. This gives quantum computers an advantage in speed for certain computations and enables quantum algorithms to perform tasks beyond classical computers, due to the entanglement of qubits where the state of one is correlated with the state of another.
Quantum computing leverages quantum superpositions of binary states (qubits) for exponential speedup in some computations. This quantum parallelism, where all possible states of multiple bits can be input simultaneously, allows quantum algorithms to outperform classical algorithms for specific problems such as Grover's algorithm for database search and Shor's algorithm for factoring large numbers. A comprehensive list of quantum algorithms can be found on the "Quantum Algorithm Zoo" website.
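As an illustration of such an algorithm, the following minimal sketch (our own example, not from the article) runs one Grover iteration on a two-qubit register using NumPy; with only four items to search, a single iteration already finds the marked item |11⟩ with certainty:

```python
# Minimal sketch: one Grover iteration over 4 items, searching for the marked state |11>.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
H2 = np.kron(H, H)                        # Hadamard on both qubits

state = H2 @ np.array([1.0, 0, 0, 0])     # uniform superposition over |00>..|11>

oracle = np.diag([1, 1, 1, -1])           # flips the phase of the marked state |11>

# Diffusion operator: reflection about the uniform superposition state.
s = np.full(4, 0.5)
diffusion = 2 * np.outer(s, s) - np.eye(4)

state = diffusion @ (oracle @ state)
print(np.abs(state) ** 2)                 # [0, 0, 0, 1] -> |11> found with certainty
```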
The Bloch Sphere
The Bloch sphere is a graphical representation of the state of a two-level quantum system such as a qubit. The sphere's north pole corresponds to the state |0⟩ and the south pole to the state |1⟩, the two orthogonal basis states of the qubit. Every other pure state of the qubit corresponds to a point on the sphere's surface.
The Bloch sphere enables visualizing the properties of quantum states and operations performed on them, making it easier to understand the impact of quantum operations on the qubit's state. The sphere also plays a crucial role in quantum error correction, cryptography, and computing by identifying sources of errors in quantum systems and designing strategies to correct them.
The Qubit Bloch Sphere: Understanding Quantum Technologies 2022 - Quantum physics 101, p. 102
A qubit's state vector is described by two complex numbers α and β, via the formula |Ψ⟩ = α|0⟩ + β|1⟩, where |0⟩ and |1⟩ are the basis states. The coefficients α and β determine the probabilities of measuring the state |0⟩ or |1⟩, and the sum of their squared magnitudes must equal 1: |α|² + |β|² = 1.
The mathematical representation of a qubit state is based on complex numbers and the Bloch sphere. The qubit state is a two-dimensional complex vector of length 1, which can be written in terms of two angles as |Ψ⟩ = cos(θ/2)|0⟩ + e^(iφ) sin(θ/2)|1⟩, where θ and φ set the amplitudes and the relative phase. The basis states |0⟩ and |1⟩ correspond to the north and south poles, respectively, and are mathematically orthogonal. When θ equals π, a half-turn on the sphere from the north pole to the south pole, cos(θ/2) = 0 and the state is exactly |1⟩, illustrating the orthogonality of the two pole states.
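The following minimal sketch (our own illustration, not from the article) maps Bloch-sphere angles (θ, φ) to qubit amplitudes with this parametrization and checks normalization and the pole states:

```python
# Minimal sketch: Bloch-sphere angles to qubit amplitudes,
# |psi> = cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>.
import numpy as np

def bloch_state(theta, phi):
    alpha = np.cos(theta / 2)
    beta = np.exp(1j * phi) * np.sin(theta / 2)
    return np.array([alpha, beta])

for name, theta in [("|0> (north pole)", 0.0),
                    ("equator", np.pi / 2),
                    ("|1> (south pole)", np.pi)]:
    psi = bloch_state(theta, 0.0)
    norm = np.abs(psi[0]) ** 2 + np.abs(psi[1]) ** 2
    print(name, np.round(psi, 3), "norm =", np.round(norm, 6))
```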
Quantum Gates
Traditional microprocessors are composed of fixed logic gates that are etched into silicon and "moving" bits that are electrical pulses that propagate through the circuit via these gates. All of this operates at a fixed frequency, often in GHz, set by a quartz clock.
Classical Gates vs Quantum Gates: Understanding Quantum Technologies 2022 - Quantum physics 101, p. 181
In a quantum computer, the first step of processing is resetting the quantum register to an initial state, known as "preparing the system." The register is first physically set to the |0⟩ state. Initialization then proceeds with various operators, such as the Hadamard gate to create a |0⟩+|1⟩ superposition, or the X gate to flip a qubit from |0⟩ to |1⟩. Sometimes more preparation is required, for example in quantum machine learning algorithms, to load a more complex register state. Once initialization is complete, gate operations are applied to the qubits sequentially according to the algorithm being executed.
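A minimal sketch of this preparation step (our own example, not from the article), with gates written as NumPy matrices: starting from |00⟩, a Hadamard puts qubit 0 into the (|0⟩+|1⟩)/√2 superposition while an X gate flips qubit 1:

```python
# Minimal sketch: "preparing the system" on a 2-qubit register with H and X gates.
import numpy as np

X = np.array([[0, 1], [1, 0]])                 # NOT gate: |0> -> |1>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: |0> -> (|0> + |1>)/sqrt(2)

register = np.kron(np.array([1.0, 0]), np.array([1.0, 0]))  # reset to |00>

# Apply H to qubit 0 and X to qubit 1 in a single step: (H tensor X).
register = np.kron(H, X) @ register
print(np.round(register, 3))   # amplitudes of |00>, |01>, |10>, |11>
```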
Qubit’s life cycle: Understanding Quantum Technologies 2022 - Quantum physics 101, p. 184
Types of Quantum Computers
In 1998, a major milestone in quantum computing was reached when 2-qubit NMR quantum computers were used to solve Deutsch's problem at Oxford University and at IBM's Almaden Research Center (a toy simulation of Deutsch's algorithm is sketched after the list below). This marked the start of the race to build the first practical quantum computer. A further significant step came in 2016, when IBM made a 5-qubit quantum computer accessible to the public through the cloud, demonstrating that a gate-based quantum processor could be operated with the stability and reliability needed for real-world use, something earlier laboratory prototypes lacked, and spurring further development and interest in the field. The status of quantum annealers as true quantum computers is the subject of some debate. Gate-based quantum computers, on the other hand, are widely recognized as the standard model of quantum computing. They can be built from a variety of physical systems, each with its own strengths and weaknesses. Some of the most common types of gate-based quantum computers include:
Superconducting quantum computers: These use superconducting circuits to store and manipulate qubits. They are often considered the most advanced type of quantum computer and are commercially available from several companies. These computers use low temperatures and the properties of superconductors to maintain the quantum state of the qubits, allowing for highly accurate computations.
Electrons-on-helium and defects in solids: These store qubits in individual electrons, trapped either above a film of superfluid helium or at defects in a solid-state material, such as nitrogen-vacancy (NV) centers in diamond. Because the electrons are well isolated from their environment, they can maintain their quantum state for longer than in many other systems. This technology is still in the early stages of development and is not yet available commercially.
Quantum dots: These use tiny semiconductor structures, known as quantum dots, to store and manipulate qubits. These systems are highly scalable and can be integrated with existing semiconductor manufacturing processes, which could make them a promising platform for large-scale quantum computing.
Trapped neutral atoms: These use lasers to trap and manipulate individual atoms, which serve as qubits. They offer long coherence times and highly accurate quantum operations, but are often considered difficult to scale to large numbers of qubits because of the challenges of trapping and manipulating individual atoms.
Trapped ions: These use electric fields to trap and manipulate ions, which serve as qubits. These systems also offer high coherence times and accurate computations, but are also difficult to scale to large numbers of qubits.
Topological qubits: These would use the properties of topological quasiparticles, such as anyons, to store and manipulate qubits. Such systems promise long coherence times and built-in resistance to errors, but they are still in the early stages of development and are not yet available commercially.
Photonic quantum computers: These use light to store and manipulate qubits. Photonic quantum computers have the advantage of being highly scalable and resistant to noise, but they are still in the early stages of development. Photonic quantum computers have the potential to perform large-scale quantum computations, but the challenge of developing the necessary hardware and software remains.
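Returning to the 1998 milestone mentioned above, here is a minimal sketch (our own NumPy illustration, not from the article) of Deutsch's problem: given a one-bit function f, a single quantum query to its oracle decides whether f is constant or balanced, which is the task those early NMR experiments demonstrated.

```python
# Minimal sketch: Deutsch's algorithm on 2 qubits, simulated with NumPy.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
H2 = np.kron(H, H)

def oracle(f):
    """Unitary U_f |x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    state = np.kron([1, 0], [0, 1])           # prepare |0>|1>
    state = H2 @ state                        # Hadamards on both qubits
    state = oracle(f) @ state                 # one oracle query
    state = np.kron(H, np.eye(2)) @ state     # Hadamard on the first qubit
    p_first_is_0 = np.abs(state[0]) ** 2 + np.abs(state[1]) ** 2
    return "constant" if p_first_is_0 > 0.5 else "balanced"

print(deutsch(lambda x: 0))   # constant function -> "constant"
print(deutsch(lambda x: x))   # balanced function -> "balanced"
```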
Type of Quantum Computer compared by Gate Speed vs Gate Fidelity: Understanding Quantum Technologies 2022 - Quantum physics 101, p. 211
Layout of quantum computer: Understanding Quantum Technologies 2022 - Quantum physics 101, p. 214
Example of 8 qubits processor layout: Understanding Quantum Technologies 2022 - Quantum physics 101, p. 214
Contemporary Quantum Computer providers: Understanding Quantum Technologies 2022 - Quantum physics 101, p. 273
Comparative performances among quantum computer: Understanding Quantum Technologies 2022 - Quantum physics 101, p. 274
IBM’s Quantum Computer Development Roadmap: Understanding Quantum Technologies 2022 - Quantum physics 101, p. 315
Market Value and Applications
Predicting the size of the quantum computing market is a highly uncertain exercise. Several market research firms have published their own projections, with significant variance among the figures. For instance, Markets and Markets predicted a market size of USD 553 million in 2023, while Hyperion Research forecast USD 830 million in 2024. Meanwhile, Market Research Future estimated the market at USD 2.64 billion in 2022, and CIR predicted USD 1.9 billion in 2023.
The projections for the quantum computing market continue to vary widely. For example, P&S Intelligence predicted a market size of USD 64 billion by 2030, while ResearchAndMarkets projected that the global quantum technology market would reach USD 31.57 billion by 2026, with USD 14.25 billion attributed to quantum computing. Meanwhile, The Quantum Insider estimated a total quantum computing market size of between USD 300 million and USD 1.3 billion in 2021, with a compound annual growth rate (CAGR) of between 70% and 80% from 2021 to 2025.
It is worth noting that the rapid pace of technological development in the field of quantum computing makes these projections highly susceptible to change. Nevertheless, the general consensus among market research firms is that the quantum computing market is expected to experience significant growth in the coming years.
One prime example is the pharmaceutical industry, where quantum computers hold immense promise for revolutionizing drug discovery and research. Their application in this field is multi-faceted, spanning drug design, protein folding, drug screening, and machine learning.
For example, quantum computers can perform high-precision simulations of molecular interactions, offering insights into the interactions between drugs and biological molecules. This information can inform the design of new drugs with better efficacy and fewer side effects. In addition, quantum computers can be used to simulate protein folding, providing a deeper understanding of the underlying mechanisms and identifying new drug targets.
Furthermore, quantum computers can perform high-throughput virtual screening of chemical databases to identify new drugs and predict their interactions with biological targets. They can also be applied in machine learning tasks such as classification and regression to aid in identifying new drug targets and predicting the efficacy of new drugs. Finally, quantum computers can optimize complex molecular simulations, improving the efficiency of drug discovery.
Correction: (1) We have incorporated additional relevant information into the article and streamlined some cumbersome sections, but the overall structure of the writing remains unchanged.
(2) Added a URL about the origins of the Schrödinger equation, from Cantor's Paradise (February 5, 2023). (3) Clarified the definition of a quantum computer system as a quantum gate array combined with a conventional computing component that operates it (February 6, 2023). (4) Added significant milestones in the development of physical quantum computers: the 1998 solution of Deutsch's problem on 2-qubit NMR machines at Oxford University and IBM's Almaden Research Center, and IBM's later cloud-accessible 5-qubit quantum computer (February 6, 2023).
(5) Added Einstein's 1905 paper on the photoelectric effect, from Cantor's Paradise (February 22, 2023).
(6) The recent financial difficulties of quantum computing companies such as D-Wave should be noted; however, we decided not to cover them in this introductory article (February 22, 2023).
(7) Government policy on quantum computing, such as the US National Quantum Initiative, Canada's National Quantum Strategy, and the EU Quantum Flagship program, also deserves reference; however, we decided not to include it in this introductory article (February 22, 2023).