
What is Quantum Advantage? A Quantum Computing Scientist Explains an Approaching Milestone

By: TQI Admin
20 November 2023 at 10:22

Insider Brief

  • The Conversation article explores quantum advantage and why it’s important to quantum researchers and the quantum industry.
  • Quantum advantage refers to solving types of problems that are impractical for classical computers to solve, such as cracking state-of-the-art encryption algorithms.
  • The article was written by Daniel Lidar, Professor of Electrical Engineering, Chemistry, and Physics & Astronomy, University of Southern California.
  • Image: A prototype quantum sensor developed by MIT researchers can detect any frequency of electromagnetic waves. Guoqing Wang, CC BY-NC-ND

THE CONVERSATION — Quantum advantage is the milestone the field of quantum computing is fervently working toward, where a quantum computer can solve problems that are beyond the reach of the most powerful non-quantum, or classical, computers.

Quantum refers to the scale of atoms and molecules where the laws of physics as we experience them break down and a different, counterintuitive set of laws apply. Quantum computers take advantage of these strange behaviors to solve problems.

There are some types of problems that are impractical for classical computers to solve, such as cracking state-of-the-art encryption algorithms. Research in recent decades has shown that quantum computers have the potential to solve some of these problems. If a quantum computer can be built that actually does solve one of these problems, it will have demonstrated quantum advantage.

I am a physicist who studies quantum information processing and the control of quantum systems. I believe that this frontier of scientific and technological innovation not only promises groundbreaking advances in computation but also represents a broader surge in quantum technology, including significant advancements in quantum cryptography and quantum sensing.

The source of quantum computing’s power

Central to quantum computing is the quantum bit, or qubit. Unlike classical bits, which can only be in states of 0 or 1, a qubit can be in any state that is some combination of 0 and 1. This state of neither just 1 nor just 0 is known as a quantum superposition. With every additional qubit, the number of states that can be represented by the qubits doubles.
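As a rough illustration of that doubling, an n-qubit state can be simulated classically as a vector of 2^n complex amplitudes. The sketch below uses plain NumPy, not any quantum SDK, and is illustrative only.

```python
import numpy as np

# Hedged illustration (plain NumPy, not a quantum SDK): an n-qubit state
# is described by 2**n complex amplitudes, so the description doubles in
# size with every added qubit.

def statevector_size(n_qubits: int) -> int:
    """Number of amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

# One qubit in an equal superposition of 0 and 1:
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Adding a second qubit doubles the amplitude count (tensor product):
two_qubits = np.kron(plus, plus)  # 4 amplitudes, each equal to 1/2

for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {statevector_size(n)} amplitudes")
```

Fifty qubits already require about 10^15 amplitudes, which is one intuition for why classical simulation quickly becomes impractical.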

This property is often mistaken for the source of the power of quantum computing. Instead, it comes down to an intricate interplay of superposition, interference and entanglement.

Interference involves manipulating qubits so that their states combine constructively during computations to amplify correct solutions and destructively to suppress the wrong answers. Constructive interference is what happens when the peaks of two waves – like sound waves or ocean waves – combine to create a higher peak. Destructive interference is what happens when a wave peak and a wave trough combine and cancel each other out. Quantum algorithms, which are few and difficult to devise, set up a sequence of interference patterns that yield the correct answer to a problem.
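The cancellation described above can be seen in the simplest possible circuit: applying a Hadamard gate twice to a qubit starting in state 0. The paths leading to 1 interfere destructively and vanish, while the paths leading to 0 interfere constructively. A minimal NumPy sketch:

```python
import numpy as np

# Hedged illustration: two Hadamard gates in a row return a qubit in
# state 0 back to state 0, because the amplitude for state 1 cancels
# (destructive interference) while the amplitude for state 0 reinforces
# (constructive interference).

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

zero = np.array([1.0, 0.0])

after_one = H @ zero       # equal superposition: [0.707..., 0.707...]
after_two = H @ after_one  # back to [1, 0]: the state-1 amplitude cancels

print(after_one)
print(after_two)
```

Quantum algorithms orchestrate exactly this kind of cancellation, but across exponentially many computational paths at once.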

Entanglement establishes a uniquely quantum correlation between qubits: The state of one cannot be described independently of the others, no matter how far apart the qubits are. This is what Albert Einstein famously dismissed as “spooky action at a distance.” Entanglement’s collective behavior, orchestrated through a quantum computer, enables computational speed-ups that are beyond the reach of classical computers.

Applications of quantum computing

Quantum computing has a range of potential uses where it can outperform classical computers. In cryptography, quantum computers pose both an opportunity and a challenge. Most famously, they have the potential to decipher current encryption algorithms, such as the widely used RSA scheme.

One consequence of this is that today’s encryption protocols need to be reengineered to be resistant to future quantum attacks. This recognition has led to the burgeoning field of post-quantum cryptography. After a long process, the National Institute of Standards and Technology recently selected four quantum-resistant algorithms and has begun the process of readying them so that organizations around the world can use them in their encryption technology.

In addition, quantum computing can dramatically speed up quantum simulation: the ability to predict the outcome of experiments operating in the quantum realm. Famed physicist Richard Feynman envisioned this possibility more than 40 years ago. Quantum simulation offers the potential for considerable advancements in chemistry and materials science, aiding in areas such as the intricate modeling of molecular structures for drug discovery and enabling the discovery or creation of materials with novel properties.

Another use of quantum information technology is quantum sensing: detecting and measuring physical properties like electromagnetic energy, gravity, pressure and temperature with greater sensitivity and precision than non-quantum instruments. Quantum sensing has myriad applications in fields such as environmental monitoring, geological exploration, medical imaging and surveillance.

Initiatives such as the development of a quantum internet that interconnects quantum computers are crucial steps toward bridging the quantum and classical computing worlds. This network could be secured using quantum cryptographic protocols such as quantum key distribution, which enables ultra-secure communication channels that are protected against computational attacks – including those using quantum computers.

Despite a growing application suite for quantum computing, developing new algorithms that make full use of the quantum advantage – in particular in machine learning – remains a critical area of ongoing research.

Staying coherent and overcoming errors

The quantum computing field faces significant hurdles in hardware and software development. Quantum computers are highly sensitive to any unintentional interactions with their environments. This leads to the phenomenon of decoherence, where qubits rapidly degrade to the 0 or 1 states of classical bits.

Building large-scale quantum computing systems capable of delivering on the promise of quantum speed-ups requires overcoming decoherence. The key is developing effective methods of suppressing and correcting quantum errors, an area my own research is focused on.

In navigating these challenges, numerous quantum hardware and software startups have emerged alongside well-established technology industry players like Google and IBM. This industry interest, combined with significant investment from governments worldwide, underscores a collective recognition of quantum technology’s transformative potential. These initiatives foster a rich ecosystem where academia and industry collaborate, accelerating progress in the field.

Quantum advantage coming into view

Quantum computing may one day be as disruptive as the arrival of generative AI. Currently, the development of quantum computing technology is at a crucial juncture. On the one hand, the field has already shown early signs of having achieved a narrowly specialized quantum advantage. Researchers at Google and later a team of researchers in China demonstrated quantum advantage for generating a list of random numbers with certain properties. My research team demonstrated a quantum speed-up for a random number guessing game.

On the other hand, there is a tangible risk of entering a “quantum winter,” a period of reduced investment if practical results fail to materialize in the near term.

While the technology industry is working to deliver quantum advantage in products and services in the near term, academic research remains focused on investigating the fundamental principles underpinning this new science and technology. This ongoing basic research, fueled by enthusiastic cadres of new and bright students of the type I encounter almost every day, ensures that the field will continue to progress.

From: The Conversation

Quantum Matters: Predictions For The Next 12 Months

By: TQI Admin
8 November 2023 at 12:10

This is an edited version of opening remarks to The City Quantum Summit 2023.

By Karina Robinson

What’s happened in the last year in quantum? Three things.

Money’s been tight; governments have become ever more involved; the science and preparation for a post-quantum world have advanced at pace.

Just in the last weeks, the GSMA Post Quantum Telco Network Task Force published a call to action in its “Guidelines for Quantum Risk Management for Telecom companies – practical advice on security for a quantum world.”

Aircraft manufacturer Boeing pulled off the world’s first flight test of multiple types of quantum sensors.

Quantum software firm Multiverse Computing launched CompactifAI, an AI model compressor that shrinks gargantuan LLMs while maintaining the accuracy of results.

What three predictions could one have for the next 12 months?

Firstly, funding constraints should ease. We’re already seeing green shoots in the financial markets. Rumour has it that two big fundraisings from hardware companies will be announced soon, with Softbank, Microsoft, and a Sovereign Wealth Fund among the investors.

Secondly, the quantum industry needs to tie itself to the AI explosion. This phrase from a JP Morgan research note says it all: “Unified data assembled neatly in a modern database hosted in the cloud is optimal for fine-tuning LLMs, but most companies’ data is anything but unified on anything but modern database infrastructure.”

AI is using up ever-increasing, vast amounts of computing power, resulting in a huge carbon footprint, while the data sets are too messy, too large and too expensive. Although not straightforward, quantum can help clean up the data and come up with accurate results using a smaller database.

Any nation that falls behind in quantum, falls behind in AI – we as a sector need to make that point to governments and investors.

Thirdly, geopolitics. Export restrictions on Deep Tech are complex. Finding the balance between supporting cross-border innovation and ensuring the security of the West…well, not even a million qubits will be able to deal with the contradictions!

But what is clear is that the role of emerging and disruptive technology in defence is crucial. We have a golden opportunity – sadly based on a bedrock of invasion and war – to drive dual use technology. Decades ago, the creation of US agency DARPA ushered in a golden age of defence and civilian innovation. Russia’s attack on Ukraine reinvigorated NATO, which now serves as a catalyst for Deep Tech like quantum via its Innovation Fund and accelerator programme DIANA, even as European nations and many others bolster the amounts dedicated to defence.

As we are talking in groups of three, let’s mention The City Quantum Summit’s three principles. Free for all, gender-balanced panels, and no lingo no jargon.

The Summit’s aim is to be accessible, inspirational and actionable. One can but wish the same for the quantum industry. Workshops with end users that are in clear, non-technical language are one of the best ways to drive quantum adoption – as is mention of the considerable efficiency and environmental gains.

However, let’s leave room for a little bit of the mind blowing, jaw-dropping, awe-inspiring craziness of quantum mechanics. The City Quantum Summit commissioned a futuristic film from artist Marina Landia on quantum and AI. Its score is quantum music created by composer Ilā using AWS Braket. Concurrent Realities will premiere now at the Mansion House, and then be taken off air to enter various film festival contests.

The madness reminds me of that story about explorer Christopher Columbus.

When he set out, he didn’t know where he was going.

When he got there, he didn’t know where he was.

When he returned, he didn’t know where he’d been.

There is more than a whiff of that in quantum.

Welcome to the Summit!


Links to the different panels at the Summit are below. TQI was the Media Partner, along with Quantum London.

Panel 1: Four different use cases for quantum: from pharma to cybersecurity, comms to finance.

Panel 2: Financing Quantum: dream or nightmare.

Panel 3: AI and Quantum: today and tomorrow.

Panel 4: Cutting carbon footprints: Quantum for good.

Keynote by the Institute of Physics and Quantum Exponential with IOP qBig Award winner

Karina Robinson is Founder of The City Quantum Summit and Senior Advisor to Multiverse Computing.

The Quantum Insider announces a strategic partnership with Quantum Key Stakeholder Assembly

By: TQI Admin
2 November 2023 at 15:27

We are excited to announce a strategic partnership with the Quantum Key Stakeholder Assembly, taking place on Nov 9 in Barcelona, in support of the proposal to the UN to proclaim 2025 as the International Year of Quantum Science & Technologies.

The Quantum Assembly will be the avenue where global leaders in the quantum industry present their recommendations and voice their support for the proposed IYQ2025 resolution to the United Nations.

The assembly will conclude with a ceremonial signing of an Open Letter to the UN signed by over 300 quantum leaders worldwide.

In partnership with the executive committee of IYQ2025, the “Quantum Key Stakeholder Assembly” will convene global decision-makers and leaders in the quantum industry to:

  1. Provide a platform to present the voice and recommendations of the quantum industry in preparation for the International Year of Quantum
  2. Discuss four quantum-related perspectives and topics of relevance to UN bodies such as UNICEF and ITU, on issues such as the SDGs, climate, education, gender equity and more

The Quantum Assembly is part of PUZZLE X (Nov 7-9) in Barcelona, which brings together more than 3,000 stakeholders and decision-makers from 82 countries. Among the speakers are:

– Sir Roger Penrose, Nobel Laureate, Oxford University

– Christian Larsson, CIO, UNICEF

– Dr. Bill Phillips, Nobel Laureate, NIST

– Sir Kostya Novoselov, Nobel Laureate, NUS

– Sir Robin Saxby, Founding CEO & Chairman, ARM

– Nabil Alnuaim, CEO, Aramco Digital

– Dr. Chema Alonso, Chief Digital Officer, Telefonica

The assembly will organize 4 closed-door sessions on central topics of relevance to the United Nations. Invited stakeholders will formulate recommendations and discuss harmonized global action that could help in addressing these issues during a proposed UN-proclaimed year of quantum. The topics are:

  • Equitable Advances in Quantum: Country Equity | Public private partnerships | Best-practice-sharing among regions and current and future stakeholders
  • Narrowing the Quantum Divide: Gender equity | Educational gap | Future talent & workforce development
  • Quantum-safe world: Quantum-secure communication & global readiness
  • Towards the 2030 Agenda: Can Quantum technologies assist in reaching UN Sustainable Development Goals?

The Assembly will conclude with a ceremonial signing of an Open Letter by key international quantum stakeholders to encourage the United Nations to pass the resolution. To date, the letter has been signed by many leading figures in science, including Nobel laureates Bill Phillips and Sir Kostya Novoselov, and 300 other major quantum stakeholders and companies. The letter remains open for signatures until Nov 1, 2023.

Through this partnership, we invite our members to present their recommendations and views at the assembly. Request an invitation to be a part of these key discussions by clicking here.

Guest Post: Feed-Forward Error Correction And Mid-Circuit Measurements

By: TQI Admin
18 October 2023 at 11:32

By Pedro Lopes, Alexei Bylinskii
QuEra Computing

Beyond increasing the number of qubits or achieving longer quantum coherence times, mid-circuit readout is a critical factor in enhancing the capabilities of quantum computers.

Mid-circuit readout (MCR) refers to the ability to selectively measure the quantum state of a specific qubit or a group of qubits within a quantum computer without disrupting the ongoing computation. In some algorithms, these measurement results can also be reincorporated into the ongoing calculation, affecting it in a process known as “feed forward.” One of the most prominent applications of MCR is in quantum error correction for fault-tolerant quantum computers.


A circuit for the Bernstein-Vazirani algorithm. In contrast with the above, readout happens only once and for all qubits, after which the calculation terminates.

The ease or difficulty of implementing MCR capabilities in a quantum computer depends on the chosen physical platform and architecture. At the time of this article, architectures based on superconducting qubits, trapped ions, and neutral atoms have all demonstrated MCR in a limited context. The neutral-atom platforms, in particular, are poised to apply MCR for error correction at scale, and in the following discussion, we will explore the strengths and challenges of MCR for neutral atoms.

Presently, neutral-atom quantum computers determine calculation results through a destructive qubit readout process: atoms are illuminated with laser light. Those in state 1 are ejected, and those in state 0 emit fluorescence which is detected on a camera. The camera image is processed to determine a given qubit state based on the brightness of pixels corresponding to it. In an optimized system, this process detects the correct qubit state with 99.9% probability. If one wants to use this process for MCR as is, all that is necessary is to target individual atoms with the laser light that generates fluorescence.

There are two difficulties this setup introduces. The first is the loss of qubits in state 1. This does not scale well for applications where many rounds of MCR are required, such as in most error-correction protocols. The second problem is that the light scattered in the imaging process can impinge on the atoms that were not targeted by the imaging laser beams for readout, introducing decoherence and preventing the quantum computation from continuing. For error-correction applications, in particular, this kind of readout destroys the ‘ancilla’ qubits (meant to be detected) and destroys the superposition of ‘data’ qubits (meant to continue the quantum computation forwards after the MCR).

Both problems can be solved by leveraging the capability of shuttling atoms around with the same type of lasers (“optical tweezers”) that hold them in the first place. That way, qubits to be measured can be shuttled to a dedicated readout zone, well separated from the remaining qubits, where they can safely scatter the imaging light without perturbing the computation. To complete the solution after the MCR, one can again move pre-loaded and pre-initialized atoms from a separate reservoir to replace the ejected measured atoms. Particularly at scale, when many atoms are tightly packed, there is arguably no solution more elegant than this.

Atom transport of this kind has recently been shown to preserve qubit coherence and to be highly parallelizable in 2D. It is fast enough to allow more than 10,000 shuttling operations within the coherence time of a neutral-atom qubit, and it has enough range to span more than 10,000 qubits packed in 2D. Together with the zoned layout, this enables a QPU architecture that has vastly simplified control requirements, optimizes performance for each computational primitive, and starts to resemble classical CPUs. Drawing inspiration from classical computer architecture, which has separate memory and computation units, such a quantum architecture can consist of three functional zones: storage, entangling, and readout. The storage zone is used for qubit storage, featuring long coherence times; the entangling zone is used for high-fidelity parallel entangling gates operating on the qubits; and the readout zone provides the capability to measure the quantum state of a subset of qubits without affecting the quantum state of the others, which is critical for error correction.

In the context of MCR, the number of ancilla that can be moved into the readout zone and detected simultaneously can be optimized for the algorithm at hand, and the isolation of the readout zone from the processing zone can be optimized for the desired performance on ancilla detection fidelity and on data qubit coherence.

The real power of MCR comes with the alluded feed-forward processes. These come with challenging requirements on the part of classical data processing. Raw data needs to be read in, processed in the context of the algorithm or error correcting code at hand (also known as ‘decoding’), then control signals destined for the data qubits need to be updated, and the whole loop must be executed fast and many times over. For fault tolerance, the correction of errors implemented by this process must win the race against decoherence, putting stringent requirements on the latency of this pipeline, which can be addressed with classical firmware that is tightly coupled to the data acquisition and control signal generation devices.
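The loop described above, reading ancillas, decoding, then updating control signals under a decoherence deadline, can be sketched in Python. All function names here (read_ancillas, decode, apply_corrections) are hypothetical placeholders, not any real QPU API; the point is the shape of the pipeline and its latency constraint.

```python
import time

# Hedged sketch of one feed-forward round. The callables are hypothetical
# placeholders standing in for hardware readout, a decoder, and control
# signal updates; none correspond to a real QPU interface.

def feed_forward_round(read_ancillas, decode, apply_corrections,
                       deadline_us: float) -> bool:
    """One MCR round: read, decode, correct. Returns True if the classical
    pipeline finished within the decoherence deadline."""
    start = time.perf_counter()
    syndrome = read_ancillas()       # raw mid-circuit measurement data
    corrections = decode(syndrome)   # error-correction decoding step
    apply_corrections(corrections)   # update control signals on data qubits
    elapsed_us = (time.perf_counter() - start) * 1e6
    return elapsed_us < deadline_us  # must win the race against decoherence

# Toy stand-ins to exercise the loop's shape:
ok = feed_forward_round(
    read_ancillas=lambda: [0, 1, 0],
    decode=lambda s: ["flip" if bit else "pass" for bit in s],
    apply_corrections=lambda c: None,
    deadline_us=1e6,
)
print(ok)
```

In a real system this loop runs in tightly coupled firmware, as the text notes, rather than in interpreted code; the sketch only shows why latency, not just correctness, is the binding constraint.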

While many of these aspects still need to be engineered into a cohesive system and usable platform made available to developers and users, they are relatively close for neutral-atom architectures. With the right approach, neutral-atom quantum computers can become truly scalable quantum machines, enabling a wide range of new algorithms and applications.

Guest Post — Unlocking the Quantum Frontier: Coherence in Neutral-Atom Systems

By: TQI Admin
3 October 2023 at 15:46

Guest post by Alex Lukin, Tommaso Macri, QuEra Computing Inc.

Quantum computing holds the promise of revolutionizing information processing. At its core are qubits, the fundamental units of quantum information. However, there’s a catch: qubits are astonishingly delicate and require careful shielding from their surroundings. Understanding and extending their coherence time—the duration they maintain their quantum state—is paramount.

The Quantum Lifespan

Coherence time is a fundamental metric in the world of quantum computing. It’s akin to the lifespan of a qubit, measuring how long, on average, this object can preserve its intricate quantum state.

The main difference between qubits and classical bits lies in their ability to exist in a superposition of cardinal states and, more importantly, the presence of phase between those states. This phase is what grants qubits their superior computing power over classical bits. However, it also renders qubits fragile compared to classical bits, as any disturbance from the environment can lead to the collapse of superpositions and scrambling of phase over time. Coherence time quantifies how long a qubit can maintain its superposition and phase coherence properties.

Since we’re tracking two distinct components (superposition and phase coherence), there are two separate values describing the qubit: T1 and T2. T1 indicates how long it takes for the amplitude of the superposition to decay, or in other words, how long it would take for a qubit initially in state 1 to decay to 0. T2, on the other hand, reveals how long it takes for the phase between states in a superposition to shift. It turns out to be an extremely sensitive quantity: even minuscule disturbances in the qubit’s environment result in phase shifts, causing the qubit to lose its quantum information over time. Consequently, it is often referred to simply as “qubit coherence.” T2 is always limited by T1. In some platforms, such as superconducting qubits, this limit has been reached, whereas in others (neutral atoms and ions) T1 is much greater than T2, which gives a much easier path for improvement by addressing the sources of technical noise in the system. Some emerging quantum platforms, such as topological qubits, aim to shift the paradigm by encoding phase information into distributed degrees of freedom of the system, so that only correlated noise across the whole system, which is unlikely, can cause errors. Unfortunately, a robust demonstration of such a qubit remains elusive to date.
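A toy model of the two timescales: treating T1 (population) and T2 (phase coherence) as simple exponential decay constants, with illustrative numbers only, shows why T2 is the faster, more fragile quantity on platforms where T2 is much shorter than T1.

```python
import numpy as np

# Hedged sketch: T1 governs amplitude (population) decay and T2 governs
# phase coherence decay, each modeled here as a bare exponential. The
# numbers are illustrative assumptions, not measured values for any platform.

T1, T2 = 100.0, 20.0  # microseconds; T2 << T1, as for neutral atoms and ions

t = np.linspace(0, 50, 6)     # time samples in microseconds
population = np.exp(-t / T1)  # chance a qubit prepared in state 1 remains there
coherence = np.exp(-t / T2)   # surviving phase coherence

for ti, p, c in zip(t, population, coherence):
    print(f"t={ti:5.1f} us  population={p:.3f}  coherence={c:.3f}")
```

By 50 microseconds the population has barely decayed while the phase coherence is nearly gone, which is why T2 dominates practical qubit lifetimes on such platforms.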

It’s important to note that coherence time in absolute physical units (e.g., seconds or microseconds) is less relevant as a metric for comparing platforms. The reason is that we want to perform operations on the qubits, specifically two-qubit entangling operations, which vary drastically in duration across different platforms. Therefore, the relevant metric is the number of two-qubit operations achievable within the coherence time, which can be determined by the ratio of T2 to the gate operation duration. For instance, while superconducting qubits might have only 30µs of coherence time compared to 1000µs for trapped ion systems, once we consider the duration of the gate, superconductors can perform nearly twice as many operations within the coherence time compared to ions.
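A back-of-the-envelope version of this comparison: the figure of merit is roughly T2 divided by the two-qubit gate duration. The T2 values below come from the article; the gate durations are illustrative assumptions chosen to reproduce the roughly two-to-one ratio described, not measured values.

```python
# Hedged back-of-the-envelope calculation of the metric described in the
# text: ops within coherence ~ T2 / gate duration. T2 values are from the
# article; gate durations are assumed for illustration only.

platforms = {
    # name: (T2 in microseconds, two-qubit gate duration in microseconds)
    "superconducting": (30.0, 0.03),  # gate duration is an assumed value
    "trapped ion":     (1000.0, 2.0), # gate duration is an assumed value
}

for name, (t2_us, gate_us) in platforms.items():
    ops = t2_us / gate_us
    print(f"{name}: ~{ops:.0f} two-qubit ops within T2")
```

With these assumed gate times, the superconducting platform fits about twice as many operations into its much shorter coherence window, matching the comparison in the text.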

The number of gate operations is one area where neutral-atom computers might have a distinct advantage.

Coherence times in neutral atoms

Let us delve deeper into the realm of neutral-atom systems. The most common method of encoding qubit states in these systems employs hyperfine states. Hyperfine states are discrete energy levels within an atom’s electronic structure. They exhibit remarkable stability and resilience against environmental perturbations, making them prime candidates for qubit encoding. For neutral rubidium atoms, qubit states encoded in the ground state correspond to electronic states with a principal quantum number of n=5, with no net orbital angular momentum and two different hyperfine states (labeled with the quantum number F, for example, F=1 and F=2). Two-qubit gates are achieved by coupling one hyperfine state to a highly excited electronic state (n>>10) called the Rydberg state. Large principal quantum numbers ensure strong interactions between the atoms and, consequently, fast entangling gates.

In neutral rubidium atoms, qubit states are encoded in the ground electronic state, which is characterized by a principal quantum number of n=5. The principal quantum number is a crucial quantum mechanical attribute that specifies the energy level of an electron in an atom. It essentially defines the “shell” in which an electron resides, and higher values of n correspond to higher energy levels and greater distances from the nucleus. In the case of rubidium, the ground state has a principal quantum number of n=5, indicating that the electrons in this state are in the fifth energy shell.

Comparing various modalities

In the table below, we compare the absolute coherence times, two-qubit gate durations, and the number of operations for the major quantum computing platforms: Ions, superconducting systems, and neutral atoms. Superconductors facilitate swift operations, albeit with a much smaller T2 coherence time. Neutral atoms operate at a lower rate but surpass other platforms in total operations.

In fairness, it is important to note that while neutral atom platforms offer unique advantages such as high qubit connectivity and long coherence times, they also come with some inherent challenges, particularly concerning cycle times. One significant drawback is the time required to arrange individual atoms into a specific geometric configuration using optical tweezers or magnetic traps. This process can be time-consuming and may limit the overall speed of quantum operations. Also, using camera-based detection systems to analyze each atom’s state introduces another layer of latency. The camera needs to capture and process images to read out the quantum states, and this image analysis can be both computationally intensive and time-consuming. As a result, the overall cycle time—the time it takes to prepare, execute, and read out a quantum operation—can be considerably longer compared to other quantum computing platforms like superconducting qubits or trapped ions.


In conclusion, the pursuit of quantum computing hinges on the delicate balance between qubit coherence and operational speed. Coherence times stand as critical parameters, reflecting a qubit’s ability to maintain superposition and phase coherence. While superconductors offer rapid operations, their limited T2 coherence time poses a challenge. In contrast, neutral-atom systems, leveraging stable hyperfine states, demonstrate impressive T2 coherence times and excel in the total number of operations performed within this window. This distinction, among other relevant features, positions neutral atoms as a promising frontier in quantum computing, offering a compelling avenue for future advancements in this transformative field.






PUZZLE X hosts “Quantum Key Stakeholder Assembly” In Support of the Proposal to the UN to Proclaim 2025 The International Year of Quantum Science & Technology

By: TQI Admin
27 September 2023 at 09:31

On Nov 9 in Barcelona, PUZZLE X convenes global quantum industry stakeholders to present the industry's recommendations in preparation for the passing of the International Year of Quantum resolution. The Assembly will conclude with a ceremonial signing of an open letter to the UN signed by over 300 quantum leaders worldwide.


PUZZLE X, the international forum on Exponential Technologies held annually in Barcelona, is hosting a first-of-its-kind high-level quantum assembly in support of the proposed resolution to the United Nations to proclaim 2025 as the International Year of Quantum Science and Technologies (IYQ2025). In partnership with the executive committee of IYQ2025, the one-day “Quantum Key Stakeholder Assembly” will convene global decision-makers and leaders in quantum to discuss key topics of relevance to UN bodies such as UNICEF and ITU and, more importantly, provide a platform to present the voice and recommendations of the ecosystem and industry in preparation for the passing of the International Year of Quantum.

The Assembly will conclude with a ceremonial signing of an Open Letter by key international quantum stakeholders to encourage the United Nations to pass the resolution. To date, the letter has been signed by many leading figures in science, including Nobel laureates Bill Phillips and Sir Kostya Novoselov, and 300 other major quantum stakeholders and companies. The letter remains open for signatures until Nov 1, 2023.

The assembly will organize 4 closed-door sessions on central topics of relevance to the United Nations namely: 

  • Equitable Advances in Quantum: Country Equity | Public private partnerships | Best-practice-sharing among regions and current and future stakeholders
  • Narrowing the Quantum Divide: Gender equity | Educational gap | Future talent & workforce development
  • Quantum-safe world: Quantum-secure communication & global readiness
  • Towards the 2030 Agenda: Can Quantum technologies assist in reaching UN Sustainable Development Goals?

Invited stakeholders will formulate recommendations and discuss harmonized global action that could help in addressing these issues during a proposed UN-proclaimed year of quantum.

The “Quantum Key Stakeholder Assembly” is a part of PUZZLE X 2023, the world-leading event on Exponential Technologies, which convenes more than 3,000 stakeholders and decision-makers from 82 countries to discuss advances in Exponential Technologies such as quantum and their impact on the future of industries, cities, and societies. Among PUZZLE X speakers are:

  • Sir Roger Penrose, Nobel Laureate, Oxford University 
  • Dr. Bill Phillips, Nobel Laureate, NIST 
  • Sir Kostya Novoselov, Nobel Laureate, NUS
  • Sir Robin Saxby, Founding CEO & Chairman, ARM 
  • Dr. Hiroshi Ishii, Vice Director of MIT Media Lab 
  • Nabil Alnuaim, CEO, Aramco Digital
  • Dr. Chema Alonso, Chief Digital Officer, Telefonica

To request an invitation to attend the Quantum Key Stakeholder Assembly and to influence its discussions and recommendations, click HERE.


The Quantum Insider and PUZZLE X have strategically partnered to inform and engage the global quantum industry at the “Quantum Key Stakeholder Assembly” and to support IYQ2025.

From Labs to Leadership: The Quantum Insider Debuts Comprehensive Marketing Guide for Quantum Tech Companies

By: TQI Admin
22 September 2023 at 10:07

The teams with the best technology may not be the ones who eventually usher in the quantum era and succeed in bringing the promise of exponential computing to reality. The quantum industry will be driven by the teams and companies with the best technology and business strategy — ones who can convey the advantages of their approaches in clear, effective and efficient ways. As quantum moves from the academic world to the real world, marketing is shifting from “nice to have” to “necessary.” 

The Quantum Insider has created a comprehensive guide to digital and content marketing for quantum technology companies, elucidating how quantum enterprises can move from R&D to thought leadership within their respective niches.

The Quantum Insider, the leading media, data and business intelligence service in the quantum industry, has released the “Marketing Guide For Quantum Technology Companies.” The guide underscores the pivotal role of digital and content marketing in empowering quantum technology companies to support their innovation and research activities by communicating their groundbreaking findings.

Digital and content marketing have evolved into indispensable tools for companies in the quantum industry to reach their audience effectively and highlight where quantum technology can improve existing activities while establishing thought leadership and driving business growth.

“The quantum industry is on the brink of transformation, and marketing plays a crucial role in shaping its narrative. Our ‘Marketing Guide For Quantum Technology Companies’ is a must-have resource for businesses aiming to stand out and make an impact in this dynamic sector,” said Alex Challans, CEO of The Quantum Insider.

The “Marketing Guide For Quantum Technology Companies” is a practical step-by-step manual designed to assist quantum technology companies in establishing an effective marketing strategy while positioning themselves as experts within their respective industries. It illustrates how quantum technology companies can go beyond technical jargon and verbosity to communicate quantum technology and its promise to those who will benefit from it.

This guide offers a holistic approach to digital marketing, addressing the challenges and opportunities unique to the quantum sector. It provides eight essential tips that many quantum technology companies often overlook when designing and executing their digital and content marketing strategies. These insights are invaluable for quantum businesses aiming to maximize their marketing efforts and go-to-market strategies in a rapidly evolving and promising industry.

One of the key highlights of the guide is the detailed exploration of 24 content marketing types segmented into seven categories. By delving into these diverse content options, quantum technology companies can tailor their bespoke digital marketing strategies to engage their target audience effectively and directly. Whether it’s blog posts, white papers, webinars, infographics, partner content, newsletters, or SEO, this guide offers a roadmap to creating compelling content that resonates with the quantum community and specific audiences unique to you.

“The quantum technology companies that can effectively communicate their groundbreaking quantum solutions are the ones that will be able to move from labs to markets. Our detailed marketing guide aims to help quantum players understand the tools to achieve exactly that,” said Jakob Pii, Content Strategist at The Quantum Insider.

Quantum technology companies, researchers and enthusiasts can now access this comprehensive marketing guide to navigate the evolving landscape of digital and content marketing within the quantum industry. Communicate the technical possibilities and elevate your marketing strategies with the invaluable insights and practical advice presented in this guide.

For more information and to download the guide, please go here.

Guest Post: Quantum Error Correction – The Key to Realizing Quantum Computing’s Potential

By: TQI Admin
8 September 2023 at 10:52

Guest Post by Yuval Boger, Chief Marketing Officer, QuEra Computing


In the rapidly evolving world of quantum computing, quantum error correction stands as a critical linchpin to the successful operation of these advanced machines.

Quantum error correction is important because it addresses the inherent instability of qubits (quantum bits). Qubits are incredibly sensitive to their environment. This sensitivity makes quantum systems an excellent candidate for building highly precise sensors but poses a significant challenge for building quantum computers; even the slightest environmental disturbance can cause a qubit to lose its quantum state, leading to computational errors.

Where do errors come from?

Some common sources of errors in quantum computing are:

  • Phase flip error (dephasing): A qubit in a superposition of |0⟩ and |1⟩ states can be disturbed by its environment, causing it to lose the phase information of its quantum state. This is known as a phase flip or dephasing error. The ratio of the dephasing time (how long the qubit can maintain its phase information) to the time required to perform a single quantum operation is a crucial parameter in quantum computing. This ratio, often called “Quantum Coherence Ratio” or “Qubit Quality Factor”, essentially determines how many operations can be performed before the accumulation of errors renders the results unreliable.
  • Bit flip errors (depolarization): Various environmental factors, such as thermal vibrations, electromagnetic waves, and even cosmic rays, can cause qubits to flip from their |0⟩ state to |1⟩ state, or vice versa. This is known as a bit flip error, and it’s a type of depolarizing error. These errors can lead to computational inaccuracies.
  • Gate operation errors: Quantum gates are used to manipulate qubits during a quantum computation. However, these operations are not always perfect and can introduce errors. This is measured for both single-qubit and two-qubit gates. The current state of the art in two-qubit gate fidelity is around 99.9%, meaning that approximately one in every 1,000 two-qubit operations will result in an error.
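The bit-flip and phase-flip errors above can be sketched with plain state vectors. The following minimal NumPy illustration (not tied to any particular hardware) applies the Pauli X and Z operators to a superposition state:

```python
import numpy as np

# Pauli operators: X models a bit flip, Z models a phase flip.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# An equal superposition (|0> + |1>) / sqrt(2).
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

bit_flipped = X @ psi    # swaps the |0> and |1> amplitudes
phase_flipped = Z @ psi  # flips the relative phase: (|0> - |1>) / sqrt(2)

# This particular state is an eigenstate of X, so a bit flip leaves it
# unchanged, while a phase flip maps it to an orthogonal state.
print(abs(np.vdot(psi, bit_flipped)) ** 2)    # ~1.0
print(abs(np.vdot(psi, phase_flipped)) ** 2)  # ~0.0
```

The example also shows why both error types must be handled: a state immune to one kind of flip can be maximally damaged by the other.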

Complex calculations typically require more time, qubits and gates, and thus decoherence, noise and gate errors limit the complexity of quantum algorithms. While today’s qubits are approaching an error rate of 10⁻³, large-scale quantum algorithms require much lower error rates, on the order of 10⁻¹⁰ to 10⁻¹⁵. Since the gap is up to 12 orders of magnitude, we need a paradigm shift; continued improvements in qubit error rates won’t be enough to unleash the potential of quantum. This is where error correction comes in.

Error mitigation and error correction

To reduce the impact of errors, even before implementing error correction, several vendors are working on error mitigation techniques. Error mitigation techniques often involve identifying and minimizing the effect of errors on the results of quantum computations by using statistical methods. For instance, techniques like error averaging perform quantum computations multiple times and then average the results. Post-processing aims to improve the results of a quantum computation after it has been performed, reducing the impact of errors through statistical bootstrapping and resampling. Noise extrapolation involves running the same quantum computation multiple times with varying levels of artificially added noise, and then comparing the results to extrapolate a ‘noise-free’ result.
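As a rough illustration of the noise-extrapolation idea, the sketch below fits a line to hypothetical expectation values measured at artificially scaled noise levels and extrapolates back to zero noise. The numbers are invented for illustration; real implementations use hardware-specific noise scaling and often richer extrapolation models than a straight line:

```python
import numpy as np

# Hypothetical measured expectation values at artificially scaled noise
# levels (scale 1.0 = the hardware's native noise). In practice these
# come from re-running the same circuit with, e.g., stretched gates.
noise_scales = np.array([1.0, 1.5, 2.0, 2.5])
measured = np.array([0.82, 0.74, 0.65, 0.57])  # illustrative numbers

# Fit a line and extrapolate to the zero-noise limit (scale -> 0).
slope, intercept = np.polyfit(noise_scales, measured, deg=1)
zero_noise_estimate = intercept
print(zero_noise_estimate)  # a bit above any raw measurement, ~0.99 here
```

Note that mitigation of this kind only post-processes the statistics; unlike error correction, it never detects or repairs an individual faulty qubit.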

Unlike error mitigation, error correction aims to detect and correct errors directly. Quantum error correction (QEC) codes are designed to protect quantum information. This is where the key concept of logical qubits comes in. A logical qubit is a quantum bit of information that is protected from errors by encoding it across multiple physical qubits. The term “logical” is used to distinguish it from the “physical” qubits that make up the computer’s hardware. The logical qubit is the unit of quantum information that we are ultimately interested in preserving and manipulating. In essence, we are trying to go from many physical qubits with 10⁻³ or 10⁻⁴ error rates to fewer logical qubits with 10⁻¹⁰ or 10⁻¹⁵ error rates.

The concept of a logical qubit has a parallel in classical error correction methods. In classical computing, a logical bit is a bit of information that is protected from errors by encoding it across multiple physical bits. For example, in a simple repetition code, a logical bit of ‘0’ might be encoded as ‘000’, and a logical bit of ‘1’ as ‘111’. If an error causes a flip of one physical bit, say from ‘000’ to ‘100’, we can still correctly infer the logical bit by looking at the majority of the physical bits.
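The classical repetition code described above can be simulated in a few lines of Python; the flip probability and trial count are illustrative choices:

```python
import random

def encode(bit):
    """Encode one logical bit as three physical bits."""
    return [bit] * 3

def noisy_channel(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote: correct as long as at most one bit flipped."""
    return int(sum(bits) >= 2)

random.seed(0)
p = 0.05  # physical bit-flip probability
trials = 100_000
errors = sum(decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials))
# Decoding fails only when two or more bits flip, so the logical error
# rate is about 3*p**2 = 0.0075, well below the physical rate of 0.05.
print(errors / trials)
```

The quadratic suppression (p becomes ~3p²) is the same qualitative payoff QEC aims for, though quantum codes must achieve it without cloning the state.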

However, there’s a crucial difference between classical and quantum that makes QEC much more challenging: due to the no-cloning theorem in quantum mechanics, qubits cannot be simply replicated, unlike classical bits that can be easily replicated. Thus, QEC must instead use more complex encodings that allow errors to be detected and corrected.

There are several prototypical codes that have been widely used for quantum error correction.

  • Shor’s code, proposed by Peter Shor (yes, the same Shor as in Shor’s algorithm) in 1995, encodes a single logical qubit into nine physical qubits and can correct for arbitrary errors in a single qubit.
  • Steane’s code encodes one logical qubit into seven physical qubits and has a convenient set of logical gate operations as well as efficient methods for good state preparation.
  • Surface codes work by arranging physical qubits on a two-dimensional grid to encode a single logical qubit. A surface code can correct errors affecting up to floor(sqrt(n)/2) qubits, where n is the number of physical qubits that are part of that surface code. Surface codes are appealing compared to other known codes, as they have one of the highest known circuit-level error-correction thresholds.
  • Topological and qLDPC codes are general categories of codes in which qubits are arranged in more complex structures; the increased complexity makes them more challenging to use. Recent publications from IBM (paper here) and a group of collaborators from the University of Chicago, Harvard, Caltech, the University of Arizona and QuEra (paper here) have shown that new qLDPC codes could require only one-tenth the number of qubits compared to surface codes, making them a promising area of research.
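As a toy illustration of the syndrome-measurement idea underlying all of these codes, the following NumPy sketch simulates the three-qubit bit-flip code, a far smaller cousin of the codes listed above that protects only against bit flips. It encodes one logical qubit into three physical qubits, applies an unknown single bit flip, infers its location from the two parity checks Z0Z1 and Z1Z2, and corrects it:

```python
import numpy as np

# Encode a logical qubit a|0> + b|1> into the code state a|000> + b|111>.
# State-vector index i holds the bits of qubits (q0 q1 q2), q0 most significant.
a, b = 0.6, 0.8
encoded = np.zeros(8, dtype=complex)
encoded[0b000], encoded[0b111] = a, b

def apply_x(state, qubit):
    """Flip one physical qubit (a bit-flip error, or its correction)."""
    out = np.empty_like(state)
    for i, amp in enumerate(state):
        out[i ^ (1 << (2 - qubit))] = amp
    return out

def syndrome(state):
    """Read the parity checks Z0Z1 and Z1Z2. Both nonzero basis states of a
    (possibly corrupted) code word share the same parities, so inspecting
    one of them suffices in this toy simulation."""
    i = int(np.flatnonzero(np.abs(state) > 1e-9)[0])
    q0, q1, q2 = (i >> 2) & 1, (i >> 1) & 1, i & 1
    return (q0 ^ q1, q1 ^ q2)

# An unknown single bit flip hits qubit 1; the syndrome pinpoints it
# without ever measuring (and thus destroying) the amplitudes a and b.
corrupted = apply_x(encoded, 1)
lookup = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}
hit = lookup[syndrome(corrupted)]
recovered = corrupted if hit is None else apply_x(corrupted, hit)
print(np.allclose(recovered, encoded))  # True
```

Real codes such as Shor’s nine-qubit code extend this idea to phase flips as well, and surface codes scale the same parity-check principle to a 2D grid.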

Surface codes are often paired with a technique called ‘lattice surgery’, which is a way to perform logical operations between qubits on a 2D planar architecture. The term “lattice surgery” comes from the way logical qubits are visualized in these topological codes. Each logical qubit is often represented as a square checkerboard “lattice” containing its logical information. “Surgery” refers to the process of manipulating these lattices to perform logical operations.

In lattice surgery, we manipulate logical operations by merging and dividing these lattice structures. Imagine this as combining two checkerboards to form a larger one or splitting one to create two smaller ones. When we merge, we’re performing a quantum measurement, and when we split, we’re creating quantum entanglement. To illustrate, if we want to perform a logical CNOT operation between two logical qubits (each housed in its own lattice), we create a third, temporary lattice. This temporary lattice is then merged and divided with the two logical qubits in turn. A significant benefit of lattice surgery is its compatibility with a 2D planar layout, making it suitable for many platforms that only allow short-range connections on a flat chip.

How many physical qubits does one need?

The number of qubits needed depends on the quality of the qubits and the fidelity of the gates that operate on them. On one extreme, if the physical qubits and quantum gates were perfect, a logical qubit would need only one physical qubit, because no error correction would be needed. Conversely, as error rates increase, each logical qubit needs a larger and larger number of physical qubits to function properly. This is analogous to the classical intuition: if the classical bit-flip error rate is high, we might need a 5- or 7-bit repetition code, whereas if the error rate is low, a 3-bit repetition code might suffice. At today’s error rates, a high-quality logical qubit might require 100-1,000 physical qubits, depending on the desired resilience to errors.
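The classical intuition above can be made quantitative. This short sketch computes the logical error rate of an n-bit repetition code as a function of the physical error rate p, showing why redundancy pays off rapidly at low error rates but stops helping above threshold. It is a classical analogy, not a surface-code calculation:

```python
from math import comb

def logical_error_rate(p, n):
    """Probability that an n-bit repetition code fails: more than
    floor(n/2) of the n bits flip, so the majority vote decodes wrongly."""
    t = n // 2  # number of correctable flips
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(t + 1, n + 1))

# At a low physical error rate, each extra pair of bits suppresses the
# logical rate by roughly another factor of p; above p = 0.5, adding
# redundancy actually makes things worse.
for p in (1e-3, 0.6):
    print(p, [logical_error_rate(p, n) for n in (3, 5, 7)])
```

The same threshold behavior appears in QEC: below a code-dependent physical error threshold, growing the code distance drives the logical error rate down exponentially, which is how a 10⁻³ physical rate can in principle be turned into a 10⁻¹⁰ logical one.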

The number of qubits also depends on the specific application. For instance, simulating complex molecules might require more qubits than solving certain physics problems that have natural hardware mappings.

A 2022 paper by Microsoft researchers titled “Assessing requirements to scale to practical quantum advantage” maps the tradeoff between error rate and size for various applications. It also compares the expected runtime of such an algorithm as a function of the time to complete each quantum operation (a time that varies greatly between different quantum implementations). We can see, for instance, that if the error rate and cycle time are low, a quantum chemistry algorithm might require just over a million qubits and about a month to complete, whereas with a higher error rate, that same algorithm might require five million qubits and several months.

Current state of the art

The amount of intensive corporate and academic research into QEC is hardly surprising, considering how important QEC is for the future of quantum computing. Despite very impressive results, it is clear that we are still a long way from practical and usable QEC.

Here are some recent state-of-the-art results:

A Nature publication from the Google Quantum AI team reports a demonstration of a single logical qubit that uses a surface code to suppress errors, specifically showing that error rates became lower with greater code distance (code distance is the minimum number of errors that would cause an undetectable logical error). The team also highlighted the challenges it faced, such as the need for high-fidelity qubits and gates, precise calibration, and fast, low-latency classical control and readout.

Another Nature publication, this time from the University of Innsbruck, demonstrated a fault-tolerant universal set of gates on two logical qubits in a trapped-ion quantum computer. They observed that the fault-tolerant implementation showed suppressed errors compared to a non-fault-tolerant implementation.

An IBM team published on arXiv a demonstration of an error-suppressed encoding scheme using an array of superconducting qubits. The error-suppressed creation of a specific entangled quantum state demonstrated a fidelity exceeding that of the same unencoded state on any pair of physical qubits on the same device.

Another arXiv publication, this time from a Quantinuum team, compared two different implementations of fault-tolerant gates on logical qubits on a twenty-qubit trapped-ion quantum computer. This report also found that the fidelities of the two-logical-qubit gate exceeded those of a physical two-qubit gate.

Last, this arXiv paper from a Yale group demonstrates the use of quantum error correction to counteract decoherence, showing that they are able to remove errors faster than the rate at which these errors corrupt the stored quantum information.

Scalability concerns

One reason that recent reports typically demonstrate only one or two logical qubits is the limited number of physical qubits available on current quantum machines. Even the most advanced systems, such as IBM’s Seattle machine and QuEra Computing’s Aquila machine, possess only a few hundred qubits each.

Scaling the number of physical qubits in quantum computing presents a significant challenge: managing the multitude of control signals required to operate the millions of physical qubits needed to generate an adequate number of logical qubits for executing meaningful operations. For instance, Google’s impressive 72-qubit quantum computer requires three controls per qubit, leading to hundreds of high-performance control signals for the entire system. In stark contrast, a high-end classical computer, despite housing billions of transistors, only necessitates approximately 1000 external controls. Even a 4K television screen, with over eight million pixels, operates with just a few thousand control signals. So, how can we scale quantum computers without the need for millions of control signals?

A recent study published in Nature by researchers from Harvard and QuEra offers a promising approach using a neutral-atom computer. In this setup, each qubit is represented by a single atom. An array of laser beams traps multiple atoms and performs quantum operations on them by illuminating nearby atoms with light at specific wavelengths. The researchers introduced a method for transporting groups of qubits (for example, a logical qubit) between different zones, each serving a specific function—long-term memory, processing, or measurement and readout. Consequently, control signals for processing are only needed in the processing area, irrespective of the total number of qubits in the system. Moreover, because the light required for a quantum operation can simultaneously illuminate a group of qubits, quantum operations on logical qubits could potentially be performed concurrently by shining light on all the participating physical qubits. This innovative approach is illustrated in the following diagram and further explained in the accompanying video [see here from Harvard, originally published in Nature].

Source: Nature, Apr 20, 2022

Looking ahead

As we track the progress of several groups, it’s reasonable to expect significant advancements in the coming months and years. Some possible manifestations of such progress:

  • Additional improvements in the fidelity of single- and two-qubit gates and coherence time.
  • Innovative approaches to construct logical qubits more efficiently.
  • Demonstrations of simple algorithms that use more than two logical qubits.
  • Demonstration of larger code distances, showing logical qubits that are increasingly robust.
  • Additional demonstrations of the superior performance of logical qubits relative to their physical counterparts.


In the rapidly evolving landscape of quantum computing, quantum error correction (QEC) stands as an unsung hero, quietly underpinning the successful operation of these advanced machines. As we continue to push the boundaries of quantum computing, the role of QEC will only become more crucial. It is the silent guardian that will enable us to navigate the quantum landscape, ensuring the stability and reliability of our quantum computations. In the quest for quantum advantage, QEC is not just an unsung hero; it is an indispensable ally.

Yuval Boger is the chief marketing officer for QuEra, a leader in neutral atom quantum computers.

Quantum Enigma 006: The Train Challenge

By: TQI Admin
5 September 2023 at 14:47

Institut quantique recently uploaded its sixth Quantum Enigma: The Train Challenge.

You are on a train with your friend Bob, who is in another carriage. As soon as the train starts moving, you realize that it’s caught in an infinite loop and isn’t heading towards your destination. To get out of the loop, you’ll need to find a strategy to win a coin toss thrown randomly by Kettu, with no way of communicating with Bob.