Unlike classical computers, which rely on binary bits that can take only one of two states (zero or one), quantum computers exploit the ability of quantum bits, or qubits, to exist in a superposition of both states at once. This quirk of quantum physics lets quantum computers perform certain calculations exponentially faster than today's most advanced supercomputers, with obvious appeal for LLMs and generative AI workloads that demand fast, energy-efficient computing.
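The superposition idea can be sketched in a few lines of plain Python: a qubit is modeled as two complex amplitudes, and a Hadamard gate turns a definite |0⟩ state into an equal mix of both outcomes. This is an illustrative toy model, not code for any real quantum hardware; the function names are invented for this sketch.

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) for |0> and |1>,
# normalized so that |a|^2 + |b|^2 = 1. Measuring yields 0 with
# probability |a|^2 and 1 with probability |b|^2.
def measurement_probs(a, b):
    return abs(a) ** 2, abs(b) ** 2

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
def hadamard(a, b):
    s = 1 / math.sqrt(2)
    return s * (a + b), s * (a - b)

a, b = hadamard(1 + 0j, 0 + 0j)   # start in the definite state |0>
p0, p1 = measurement_probs(a, b)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5: both outcomes equally likely
```

A classical bit would print 1.0 for one outcome and 0.0 for the other; the 50/50 split is the superposition that quantum algorithms exploit.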
However, one of the main challenges that has long plagued quantum computing is the decay of qubits, which causes coherence losses and hence errors in the computed results. Several recent breakthroughs and discoveries are opening new horizons for this booming experimental field.
A team of Finnish researchers has shown, theoretically and experimentally, that this decay is linked to thermal dissipation in the electrical circuit holding the qubit. Solving this major issue would extend qubits' coherence times, allowing more operations and hence more complex calculations.
Until a solution is found to stop or slow the qubits' decay, research teams are using another technique to achieve fault-tolerant quantum computing: Quantum Error Correction (QEC).
Listing all the 2024 QEC breakthroughs (some stemming from Microsoft and Google) is clearly beyond the scope of this report, but the key takeaway is that this notable advance should help trigger the commercial success of quantum computing within two to three years, according to a recent Omdia survey of 28 quantum computer vendors.
QEC should enable qubit fidelity levels of 99.9%, allowing stable quantum computers capable of processing millions of reliable quantum operations (a 2028 target). This is obviously key to delivering large-scale practical quantum computing able to solve real-world problems.
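The intuition behind QEC can be shown with the simplest example, a distance-3 repetition code: encode one logical bit across three physical qubits and take a majority vote, so a logical error requires at least two physical failures. This toy arithmetic is an assumption for illustration only; it is not the (far more sophisticated) codes Microsoft or Google actually use.

```python
# For independent physical error rate p, a majority vote over three
# copies fails only if two or three of them flip:
#   P(logical error) = 3 * p^2 * (1 - p) + p^3
def logical_error_rate(p):
    return 3 * p**2 * (1 - p) + p**3

p = 0.001  # 99.9% physical fidelity, the level cited above
print(logical_error_rate(p))  # roughly 3e-6, ~300x better than p
```

The quadratic suppression is the point: once physical fidelity crosses a threshold, adding redundancy drives the logical error rate down fast, which is what makes "millions of reliable operations" a plausible target.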
To further accelerate quantum's commercial rollout, so-called hybrid algorithms, mixing traditional computing and quantum processing, are also being considered. Since quantum computing is superior only for very specific problems (which require new software and algorithms), this blend significantly broadens the use cases and hence the appeal to potential users.
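The hybrid pattern typically looks like a loop: a classical optimizer proposes circuit parameters, a quantum processor evaluates a cost function, and the result steers the next proposal (as in variational algorithms such as VQE or QAOA). The sketch below is a deliberately simplified stand-in where the "quantum" evaluation is faked classically as cos(θ); every function name here is invented for illustration.

```python
import math

# Stand-in for the quantum subroutine: the expectation value <Z> after
# rotating |0> by angle theta is cos(theta). In a real hybrid algorithm
# this evaluation would run on quantum hardware.
def quantum_expectation(theta):
    return math.cos(theta)

# Classical outer loop: finite-difference gradient descent on theta.
def minimize(theta=0.3, lr=0.2, steps=200, eps=1e-4):
    for _ in range(steps):
        grad = (quantum_expectation(theta + eps)
                - quantum_expectation(theta - eps)) / (2 * eps)
        theta -= lr * grad  # classical update using the quantum result
    return theta, quantum_expectation(theta)

theta, energy = minimize()
print(theta, energy)  # converges near theta = pi, energy = -1
```

The division of labor is the selling point: the quantum device only handles the narrow step it is good at, while ordinary classical code handles everything else, which is why hybrid schemes widen the set of practical use cases.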
Improvements in semiconductor techniques are also helping the quantum computing quest. Many research teams are looking for ways to fabricate qubits out of silicon, leveraging proven, low-cost manufacturing techniques to produce them at scale. Here again, recent advances have allowed Australian scientists to manufacture high-fidelity qubits using traditional foundry technology, paving the way for industrial-scale production of commercially viable quantum processors.
Finally, another research area, photonics, is also seeing significant improvements linked to quantum computing. In this case, qubits are photons that can simultaneously occupy two states (two different wavelengths, or colors, of light). Here, the main challenge is to integrate optical components directly into a single photonic chip that can encode the behavior of light to perform calculations.
Even though these solutions are only at the prototyping stage, photonic-based quantum processing has already shown its superiority in computing time and efficiency over existing electronic-based counterparts. The holy grail of the photonic computer is slowly becoming a reality, as a team of engineers recently developed a new method for photonic in-memory computing.
Unfortunately, every technological breakthrough has its drawbacks. In the case of quantum computing, the drawback is increased vulnerability of cybersecurity protections and IT systems. Recently, Chinese researchers were apparently able to breach encryption algorithms using a D-Wave quantum computer. After years of fear, theorizing and media coverage, this is the first time a quantum computer has posed a real and substantial threat to multiple full-scale SPN (Substitution-Permutation Network) structured algorithms in use today, according to the Shanghai University scientists' peer-reviewed paper.
Since the Chinese academic paper omits many important details (such as the size of the cryptographic key), crypto experts around the world are still debating whether this "quantum hack" should be taken seriously.
Many companies and startups (IBM, Rigetti, IonQ, D-Wave, Xanadu…) are racing to build fault-tolerant quantum computers, with the objective of offering their computing power through cloud services. Currently, picking and valuing clear winners appears impossible, given that both the technology and the revenue opportunity are at a very early stage, if not years away.
Given their financing needs, most pure plays will then likely either go bankrupt or, for those with strategic value, be taken over by deep-pocketed heavyweights. IonQ appears to have the most traction in the quantum space right now, with the company claiming almost $100 million in bookings and high-profile customers such as AstraZeneca. But that is clearly a high-risk, high-reward story.