Diagram: How Q-CTRL Embedded software is integrated with IBM Quantum.

Q-CTRL and IBM have announced that Q-CTRL's error suppression technology, named Q-CTRL Embedded, has been integrated into the IBM Qiskit Runtime system. This feature is currently available only to users on the Pay-As-You-Go plan. There is no additional cost for those users who utilize [...]

By Dr Chris Mansell, Senior Scientific Writer at Terra Quantum

Shown below are summaries of a few interesting research papers in quantum technology that we have seen over the past month.

Hardware
Title: Electron charge qubit with 0.1 millisecond coherence time
Organizations: Argonne National Laboratory; University of Chicago; Lawrence Berkeley National Laboratory; The NSF AI Institute [...]

Matt Versaggi, a senior leader in the AI space, is interviewed by Yuval Boger. Matt discusses his grassroots approach to integrating quantum technology in healthcare, emphasizing the need for educational programs and patenting initiatives. He also shares insights into the role of quantum business strategists, who can offer cost-effective, timely advice to organizations. Matt also [...]

In June 2023, we described how quantum computing must graduate through three implementation levels (Quantum Computing Implementation Levels, QCILs) to achieve utility scale: Level 1 Foundational, Level 2 Resilient, Level 3 Scale. All quantum computing technologies today are at Level 1. And while NISQ machines are all around us, they do not offer practical quantum advantage. True utility will only come from orchestrating resilient computation across a sea of logical qubits, something that, to the best of our current knowledge, can only be achieved with error correction and fault tolerance. Fault tolerance will be a necessary and essential ingredient in any quantum supercomputer and in any practical quantum advantage.

The first step toward the goal of reaching practical quantum advantage is to demonstrate resilient computation on a logical qubit. However, just one logical qubit will not be enough; ultimately the goal is to show that quantum error correction helps non-trivial computation instead of hindering it, and an important element of this non-triviality is the interaction between qubits and their entanglement. Demonstrating an error-corrected resilient computation, initially on two logical qubits, that outperforms the same computation on physical qubits will mark the first demonstration of a resilient computation in our field's history.

The race is on to demonstrate a resilient logical qubit, but what is a meaningful demonstration? Before our industry can declare victory on reaching Level 2 for a given quantum computing hardware and claim the demonstration of a resilient logical qubit, it's important to align on what this means.

Criteria of Level 2: resilient quantum computation

How should we define a logical qubit? The most meaningful definition of a logical qubit hinges on what one can do with that qubit. Demonstrating a qubit that can only remain idle, that is, be preserved in a memory, is not meaningful if one cannot demonstrate non-trivial operations as well. Therefore, it makes sense to define a logical qubit such that it allows some non-trivial, encoded computation to be performed on it.

Distinct hardware comes with distinct native operations. This presents a significant challenge in formally defining a logical qubit; for example, the definition should not favor one hardware over another. To address this, we propose a set of criteria that mark the entrance into the resilient level of quantum computation. In other words, these are the criteria for calling something a "logical qubit".

Entrance criteria to Level 2

Exiting Level 1 NISQ computing and entering Level 2 Resilient quantum computing is achieved when fewer errors are observed on the output of a logical circuit using quantum error correction than on the analogous physical circuit without error correction. We argue that a demonstration of the resilient level of quantum computation must satisfy the following criteria:

- involve at least 2 logical qubits
- demonstrate a convincingly large separation (ideally 5-10x) of logical error rate below physical error rate on the non-trivial logical circuit
- correct all individual circuit faults (the "fault distance" must be at least 3)
- implement a non-trivial logical operation that generates entanglement between logical qubits

The justification for these is self-evident: being able to correct errors is how resiliency is achieved, and demonstrating an improvement over physical error rates is precisely what we mean by resiliency. But we feel it is worth emphasizing the requirement for logical entanglement. Our goal is to achieve advantage with a quantum computer, and an important ingredient of advantage is entanglement across at least 2 logical qubits.

The distinction between the Resilient Level and the Scale Level is also important to emphasize: a proof-of-principle demonstration of resiliency must be convincing, but it does not require a fully scaled machine. For this reason, we find it important to allow some forms of post-selection, with the following requirements:

- post-selection acceptance criteria must be computable in real time (but may be implemented in post-processing for a demonstration);
- post-selection should be scalable (the rejection rate can be made vanishingly small);
- if post-selection is not scalable, it must at least correct all low-weight errors in the computation (with the exception of state preparation, since post-selection in state preparation is scalable).

In other words, post-selection must be either fully compatible with scalability, or it must still allow for demonstration of the key ingredients of error correction, not simply error detection.

Measuring progress across Level 2

Once a quantum computing hardware has entered the Resilient Level, it is important to also be able to measure continued progress toward Level 3. Not every type of quantum computing hardware will achieve Level 3 Scale, as the requirements to reach Scale include achieving upwards of 1000 logical qubits with logical error rates better than 10^-12, mega-rQOPS performance, and more.

Progress toward Scale may be measured along four axes: universality, scalability, fidelity, and composability. We offer the following ideas to the community on how to measure progress across these four axes, so that we as a community can benchmark progress in the resilient level of utility-scale quantum computation:

- Universality: Universality typically splits into two components: Clifford group gates and non-Clifford group gates. Does one have a set of high-fidelity Clifford-complete logical operations? Does one have a set of high-fidelity universal logical operations? A typical strategy is to design the former, which can then be used in conjunction with a noisy non-Clifford state to realize a universal set of logical operations. Of course, different hardware may employ different strategies.
- Scalability: At its core, the resource requirement for advantage must be reasonable (i.e., a small fraction of Earth's resources or a person's lifetime). More technically, the quantum resource overhead required should scale polynomially with the target logical error rate of any quantum algorithm. Note also that some systems may achieve very high fidelity but have limited numbers of physical qubits, so that improving the error correction code in the most obvious way (increasing its distance) may be difficult.
- Fidelity: Logical error rates of all operations improve with code size (sub-threshold). More strictly, one would like to see that the logical error rate is better than the physical error rate (sub-pseudothreshold). Progress on this axis can be measured with Quantum Characterization, Verification & Validation (QCVV) performed at the logical level, or with other operational tasks such as Bell inequality violations and self-testing protocols.
- Composability: Composable gadgets for all logical operations.

Criteria to advance from Level 2 to Level 3, a Quantum Supercomputer

The exit from the resilient level of logical computation, and the achievement of the world's first quantum supercomputer, will be marked by large-depth computations on high-fidelity circuits involving upwards of hundreds of logical qubits: for example, a logical circuit on ~100+ logical qubits with a universal set of composable logical operations hitting a fidelity of ~10^-8 or better.
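The sub-threshold behavior described under the fidelity axis is often summarized by a scaling ansatz of the form p_logical ≈ A (p_physical / p_threshold)^((d+1)/2) for a code of distance d. The constants below are illustrative placeholders, not figures for any particular hardware; the sketch only shows how the logical-to-physical separation grows with code distance:

```python
# Sub-threshold scaling ansatz for a distance-d error-correcting code:
#   p_logical ~ A * (p_physical / p_threshold) ** ((d + 1) // 2)
# A and p_threshold are illustrative placeholders, not values for any
# particular hardware or code.

def logical_error_rate(p_physical, d, A=0.1, p_threshold=0.01):
    return A * (p_physical / p_threshold) ** ((d + 1) // 2)

p_phys = 0.001                    # a 0.1% physical error rate
for d in (3, 5, 7):               # fault distance must be at least 3
    p_log = logical_error_rate(p_phys, d)
    print(f"d={d}: logical={p_log:.1e}, separation={p_phys / p_log:.0f}x")
```

With these placeholder constants, distance 3 sits right at break-even and each increase of the distance by 2 buys another 10x of separation; the actual numbers depend entirely on the hardware's A and p_threshold.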
Ultimately, a quantum supercomputer will be achieved once the machine is able to demonstrate 1000 logical qubits with a logical error rate of 10^-12 and a mega-rQOPS. Performance of a quantum supercomputer can then be measured by reliable quantum operations per second (rQOPS).

Conclusion

It's no doubt an exciting time to be in quantum computing. Our industry is at the brink of reaching the next implementation level, Level 2, which puts our industry on a path to ultimately achieving practical quantum advantage. If you have thoughts on these criteria for a logical qubit, or on how to measure progress, we'd love to hear from you.

Riverlane has been selected for the next phase of DARPA’s Quantum Benchmarking program.

The program’s aim is to design key quantum computing metrics.

Riverlane will be working with top tier universities such as the University of Southern California and the University of Sydney.

PRESS RELEASE — Riverlane has been selected for Phase 2 of the Quantum Benchmarking program funded by the Defense Advanced Research Projects Agency (DARPA).

The aim of the DARPA Quantum Benchmarking program is to design key quantum computing metrics for practically relevant problems and estimate the required quantum and classical resources needed to reach critical performance thresholds.

Steve Brierley, CEO and Founder of Riverlane, said: “Riverlane’s mission is to make quantum computing useful sooner, starting an era of human progress as significant as the industrial and digital revolutions. The DARPA Quantum Benchmarking program aligns with this goal, helping the quantum community measure progress and maintain momentum as we unlock quantum error correction and enable fault tolerance.”

Fault tolerance is increasingly seen as a requirement for reaching useful quantum advantage. To achieve this, the errors that quantum bits (qubits) are prone to must be corrected. Simply put, quantum error correction is the enabling technology for fault tolerance.

Hardware companies, academic groups and national labs have demonstrated significant progress with small quantum error-corrected systems, but there remain many challenges for controlling fault-tolerant devices at scale.

In the DARPA Quantum Benchmarking project, Riverlane is working with top-tier universities such as the University of Southern California and the University of Sydney to identify important benchmarks for practical problems, especially in the fields of plasma physics, fluid dynamics, condensed matter and high-energy physics. The team is building tools to estimate the quantum and classical resources needed to implement quantum algorithms that solve the benchmark problems at scale.

Hari Krovi, Principal Quantum Scientist at Riverlane, explained: “Fault tolerance will result in significant overheads, both in terms of qubit count and calculation time, and it is important to take this into consideration when comparing to classical techniques. It has been known for some time that mild speed-ups such as a quadratic speed-up can disappear when the fault tolerance overhead is considered. There are many different approaches to fault tolerance to consider, and each one leads to overheads that can vary by many orders of magnitude.”
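Krovi's point about quadratic speedups can be made concrete with a toy crossover estimate. All timings below are our illustrative assumptions, not figures from the program:

```python
import math

# Toy crossover estimate for a quadratic (Grover-style) speedup once a
# fault-tolerance overhead is charged per quantum operation. All timings
# are illustrative assumptions, not figures from the program.

def classical_time(n, t_op=1e-9):
    return n * t_op                          # one fast classical op per item

def quantum_time(n, t_logical=1e-6, overhead=1000.0):
    # sqrt(n) logical steps, each slowed by the fault-tolerance overhead
    return math.sqrt(n) * t_logical * overhead

# Smallest power of ten at which the quantum runtime wins
n = 10
while quantum_time(n) >= classical_time(n):
    n *= 10
print(f"quantum wins beyond roughly n = 10^{round(math.log10(n))}")
```

Under these assumptions the quantum approach only wins for problem sizes around 10^12 and beyond, which is why overheads that "vary by many orders of magnitude" dominate any comparison for mild speed-ups.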

Krovi added: “One area of consideration is the choice of quantum code to help identify and correct errors in the system. There are many different choices that lead to fault tolerance and each of these leads to different overheads. The Surface Code is a popular choice, and the team is focussing on estimates based on this approach.”

The work being done in this program provides a quantitative understanding of practical quantum advantage and can inform how disruptive quantum computing will be in various fields.

Amazon Web Services (AWS) has introduced a new quantum computer chip focused on enhancing error correction.

The company said that the chip, which is fabricated in-house, can suppress bit flip errors by 100x using a passive error correction approach.

By combining both passive and active error correction approaches, the chip could theoretically achieve quantum error correction six times more efficiently than standard methods.

Image: Peter DeSantis, senior vice president of AWS utility computing products. Credit: AWS

Amazon Web Services (AWS) has introduced a new quantum computer chip focused on enhancing error correction, a pivotal — if not the pivotal — aspect in the evolution of quantum computing. Peter DeSantis, Vice President of Global Infrastructure and Customer Support at AWS, detailed the features and implications of this development in a keynote address in Las Vegas at AWS’s re:Invent conference for the global cloud computing community.

The AWS team has been working on a custom-designed quantum device, a chip totally fabricated in-house, which takes an innovative approach to error correction, according to DeSantis.

“By separating the bit flips from the phase flips, we’ve been able to suppress bit flip errors by 100x using a passive error correction approach. This allows us to focus our active error correction on just those phase flips,” DeSantis stated.

He highlighted that combining both passive and active error correction approaches could theoretically achieve quantum error correction six times more efficiently than standard methods. This development represents an essential step towards creating hardware-efficient and scalable quantum error correction.

In a LinkedIn post, Simone Severini, general manager of quantum technologies at AWS, writes that AWS’s logical qubit is both hardware-efficient and scalable.

He writes that the chip uses a special oscillator-based qubit to suppress bit flip errors, while a much simpler outer error-correcting code protects against phase flip errors.

Severini added, “It is based on a superconducting quantum circuit technology that ‘prints’ qubits on the surface of a silicon microchip, making it highly scalable in the number of physical qubits. This scalability allows one to exponentially suppress the total logical error rate by adding more physical qubits to the chip. Other approaches based on similar oscillator-based qubits rely on large 3D resonant cavities that need to be manually pieced together.”
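The qubit-count advantage of this biased-noise strategy can be sketched with a toy comparison. The constants are illustrative only; the article does not specify AWS's actual code or parameters. If bit flips are passively suppressed, a one-dimensional repetition code correcting only phase flips can stand in for a two-dimensional surface code:

```python
# Toy comparison of error-correction footprints under biased noise.
# Assumptions (ours, not AWS's published parameters): with bit flips
# passively suppressed, only phase flips need active correction, so a
# 1-D repetition code of distance d (d physical qubits) can stand in
# for a 2-D surface code (~2 * d**2 physical qubits including
# measurement qubits).

def repetition_qubits(d):
    return d

def surface_code_qubits(d):
    return 2 * d ** 2

d = 11                            # an arbitrary illustrative distance
print(repetition_qubits(d), surface_code_qubits(d),
      surface_code_qubits(d) // repetition_qubits(d))
```

The quoted "six times more efficiently" figure depends on details not given in the article; this sketch only shows why a one-dimensional outer code is much cheaper than a two-dimensional one.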

Error Correction Progress

DeSantis said that the effort on error correction is important because, despite advancements, qubits remain too noisy for practical use in solving complex problems.

“15 years ago, the state of the art was one error every 10 quantum operations. Today, we’ve improved to about one error per 1,000 quantum operations. This 100x improvement in 15 years is significant. However, the quantum algorithms that excite us require billions of operations without an error,” DeSantis added.

DeSantis outlined the challenges in current quantum computing, noting that with a 0.1% error rate, each logical qubit requires thousands of physical qubits. He mentioned that quantum computers are not yet where they need to be to tackle big, complex problems. The potential for improvements through error correction represents the surest bet for more practical quantum computing.
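DeSantis's "thousands of physical qubits" figure can be sanity-checked with the standard sub-threshold scaling ansatz used across the field. The constants A and p_th below are illustrative placeholders, not AWS numbers:

```python
# Rough footprint check for "each logical qubit requires thousands of
# physical qubits" at a 0.1% physical error rate, using the standard
# sub-threshold scaling ansatz with illustrative constants A and p_th.

def distance_for_target(p_phys, p_target, A=0.1, p_th=0.01):
    d = 3                         # smallest odd distance correcting one fault
    while A * (p_phys / p_th) ** ((d + 1) / 2) > p_target:
        d += 2
    return d

def physical_qubits(d):
    return 2 * d ** 2             # data plus measurement qubits, roughly

d = distance_for_target(1e-3, 1e-12)   # target: one error in 10^12 ops
print(d, physical_qubits(d))
```

With these placeholder constants, a distance in the low twenties and on the order of a thousand physical qubits per logical qubit come out, consistent with the "thousands" scale DeSantis describes.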

“With a further improvement in physical qubit error rate, we can reduce the overhead of error correction significantly,” he said.

Early Stages

Although DeSantis cautioned that the journey to an error-corrected quantum computer is still in its early stages, he emphasized the importance of this development.

“This step taken is an important part of developing the hardware efficient and scalable quantum error correction that we need to solve interesting problems on a quantum computer,” DeSantis said.

DeSantis hopes this development could accelerate the progress towards practical and reliable quantum computing, potentially revolutionizing industries like pharmaceuticals, materials science, and financial services.

Multiverse Computing used a digital twin and quantum optimization to boost the efficiency of green hydrogen production.

The advance could lead to improving the economics of hydrogen production and reducing a significant source of greenhouse gas.

Multiverse’s partners include IDEA Ingeniería and AMETIC, Spain’s digital industry association.

PRESS RELEASE — Multiverse Computing, a global leader in value-based quantum computing and machine learning solutions, has used a digital twin and quantum optimization to boost the efficiency of green hydrogen production. This work could change the economics of hydrogen production and reduce a significant source of greenhouse gas.

Multiverse’s partners in this work are IDEA Ingeniería, an engineering firm that specializes in renewable projects and digital twins, and AMETIC, Spain’s digital industry association. IDEA developed the digital twin ecosystem for optimizing the generation of green hydrogen. AMETIC is coordinating the overall project.

The quantum digital twin numerically simulates a green hydrogen production plant, using the plant’s operating parameters as inputs. By using quantum algorithms to optimize the electrolysis process used for green hydrogen generation, the solution achieves a 5% increase in H2 production and associated revenue with the quantum solver compared to the classical solver.

“Electrolysers are currently deployed at a small scale, making hydrogen production costly, so they require significant scale up in an affordable way,” said Enrique Lizaso Olmos, CEO of Multiverse Computing. “This project demonstrates how quantum algorithms can improve the production of green hydrogen to make renewable energy more cost-effective today and in the future.”

Using a classical solver to optimize hydrogen production, the virtual plant delivered 62,579 kg of green H2 and revenue of 154,204 euros. By using quantum-inspired tensor networks with Multiverse’s Singularity, the quantum approach delivered 65,421 kg and revenue of 160,616 euros. This represents a 5% increase in hydrogen production and a 5% increase in revenues produced.

“Green hydrogen will play a significant role in the transition towards a more sustainable and ecological energy landscape,” said Emilio Sanchez, Founder and CEO of IDEA Ingeniería. “The consortium’s continued progress in developing quantum solutions alongside other green technologies can help alleviate the effects of global warming.”

Currently, it’s more expensive to produce green hydrogen than traditional grey hydrogen [1]. The traditional method uses electricity—usually generated by coal or natural gas—to separate water into hydrogen and oxygen. Green hydrogen is produced from renewable sources.

About 70 million tons of hydrogen are produced every year and used to refine oil and make ammonia-based fertilizer. The grey hydrogen production process generates between 9 and 12 tons of carbon dioxide for every ton of hydrogen produced [2]. Green hydrogen created from renewable sources is a clean-burning fuel that could reduce emissions from heating and industrial processes such as the production of steel, cement, and fertilizer.

Green hydrogen also could enable more efficient energy storage, as compressed hydrogen tanks can store energy for long periods of time and weigh less than lithium-ion batteries. In addition, it could make the transportation industry greener by decarbonizing shipping, aviation, and trucking.

Multiverse’s future plans for the initiative include increasing the number of input parameters to create a more realistic quantum digital twin, working with an energy company to validate the digital model, and continuing to improve the quantum solution developed.

A research team from the Vienna University of Technology has demonstrated that, due to finite energy or finite entropy generation, no clock can achieve both perfect resolution and perfect precision simultaneously.

This fundamental limitation impacts the potential capabilities of quantum computers.

This discovery implies natural limits for quantum computers, as the achievable resolution and precision in timekeeping restrict the speed and reliability of quantum computations.

UNIVERSITY RESEARCH NEWS — Vienna University of Technology/November 26, 2023 — There are different ideas about how quantum computers could be built. But they all have one thing in common: you use a quantum physical system — for example, individual atoms — and change their state by exposing them to very specific forces for a specific time. However, this means that in order to be able to rely on the quantum computing operation delivering the correct result, you need a clock that is as precise as possible.

But here you run into problems: perfect time measurement is impossible. Every clock has two fundamental properties: a certain precision and a certain time resolution. The time resolution indicates how small the time intervals are that can be measured — i.e., how quickly the clock ticks. Precision tells you how much inaccuracy you have to expect with every single tick.

The research team was able to show that since no clock has an infinite amount of energy available (or generates an infinite amount of entropy), it can never have perfect resolution and perfect precision at the same time. This sets fundamental limits to the possibilities of quantum computers.

Quantum calculation steps are like rotations

In our classical world, perfect arithmetic operations are not a problem. For example, you can use an abacus in which wooden balls are threaded onto a stick and pushed back and forth. The wooden beads have clear states, each one is in a very specific place, if you don’t do anything the bead will stay exactly where it was.

And whether you move the bead quickly or slowly does not affect the result. But in quantum physics it is more complicated.

“Mathematically speaking, changing a quantum state in a quantum computer corresponds to a rotation in higher dimensions,” says Jake Xuereb from the Atomic Institute at the Vienna University of Technology, a member of Marcus Huber’s team and first author of the first paper, published in Physical Review Letters. “In order to achieve the desired state in the end, the rotation must be applied for a very specific period of time. Otherwise, you turn the state either too little or too far.”

Entropy: Time makes everything more and more messy

Marcus Huber and his team investigated in general which laws must always apply to every conceivable clock. “Time measurement always has to do with entropy,” explains Marcus Huber. In every closed physical system, entropy increases and the system becomes more and more disordered. It is precisely this development that determines the direction of time: the future is where the entropy is higher, and the past is where the entropy is lower.

As can be shown, every measurement of time is inevitably associated with an increase in entropy: a clock, for example, needs a battery, the energy of which is ultimately converted into frictional heat and audible ticking via the clock’s mechanics — a process in which the fairly ordered state of the battery is converted into a rather disordered state of heat radiation and sound.

On this basis, the research team was able to create a mathematical model that basically every conceivable clock must obey. “For a given increase in entropy, there is a tradeoff between time resolution and precision,” says Florian Meier, first author of the second paper, now posted to the arXiv preprint server. “That means: Either the clock works quickly or it works precisely — both are not possible at the same time.”
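The tradeoff Meier describes can be illustrated with a toy Monte Carlo model. This is our illustration, not the model from the papers: let each tick of a clock complete after k events of a fixed-rate Poisson process, with the fixed event rate standing in for a fixed entropy budget. Faster ticking (small k) then necessarily means noisier ticks:

```python
import random

# Toy model of the resolution-vs-precision tradeoff (our illustration,
# not the model from the papers). A clock "ticks" after accumulating k
# events of a Poisson process with fixed rate R; the fixed event rate
# stands in for a fixed entropy budget.
#   resolution = mean tick interval = k / R   (smaller is better)
#   precision  = (mean / std-dev)**2 of the tick interval ~ k
# Fast ticking (small k) therefore comes with noisy ticks.

def simulate_ticks(k, R=1000.0, n_ticks=5000, seed=7):
    random.seed(seed)
    intervals = []
    for _ in range(n_ticks):
        intervals.append(sum(random.expovariate(R) for _ in range(k)))
    mean = sum(intervals) / n_ticks
    var = sum((t - mean) ** 2 for t in intervals) / n_ticks
    return mean, mean ** 2 / var          # (resolution, precision)

for k in (1, 16, 256):
    res, prec = simulate_ticks(k)
    print(f"k={k:3d}  resolution={res:.5f} s  precision~{prec:.1f}")
```

In this toy model precision grows linearly with k while resolution degrades linearly with k, a crude analogue of the entropy-constrained tradeoff derived in the papers.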

Limits for quantum computers

This realization now brings with it a natural limit for quantum computers: the resolution and precision that can be achieved with clocks limits the speed and reliability that can be achieved with quantum computers. “It’s not a problem at the moment,” says Huber.

“Currently, the accuracy of quantum computers is still limited by other factors, for example, the precision of the components used or electromagnetic fields. But our calculations also show that today we are not far from the regime in which the fundamental limits of time measurement play the decisive role.”

Therefore, if the technology of quantum information processing is further improved, one will inevitably have to contend with the problem of non-optimal time measurement. But who knows: Maybe this is exactly how we can learn something interesting about the quantum world.

Featured image: The oversampling regime of an exemplary clock — a pendulum in a weakly lit environment. The two sources of entropy production for this clock are: the friction within the clockwork itself, and the matter–light interaction necessary to track the position of the pendulum. The plot shows the elementary ticking events of this clock as a function of time, i.e., the photons reflected off the pendulum when it is close to its maximum deflection. In the oversampling regime, the average time between two such ticks is much shorter than that of the period of the TPC (continuous line), which in the case of this pendulum is 2 s. Due to technical limitations, one does not count photons, but rather the TPC cycles through the averaged light intensity. Credit: arXiv (2023). DOI: 10.48550/arxiv.2301.05173

BosonQ Psi’s Quantum-Inspired Design Optimization (QIDO) Solver has been validated as an effective solution for topology optimization in the aerospace and automotive industries, overcoming challenges faced by classical topology optimization methods in large-scale design problems. The study involved using the QIDO Solver to optimize airfoil structures, demonstrating its ability to efficiently handle complex design problems, such as weight minimization.

QIDO’s quantum-inspired approach, utilizing principles like superposition and entanglement, allows for simultaneous searching of larger solution spaces, resulting in better optimization than classical methods. This technology reduces the number of iterations and computing resources needed for topology optimization, offering more accurate and cost-effective solutions for airfoil structures in aircraft and automobiles.

The research highlighted the potential of QIDO Solver to dramatically improve aircraft and automobile performance and safety. Traditional topology optimization problems are typically solved using finite element analysis, but the QIDO Solver can handle complex design problems, such as minimization of the total weight of the structure, and finds global minima for obtaining optimal airfoil designs. This has implications for reducing manufacturing costs and enhancing efficiency in advanced aircraft and automobile airfoil structures.

RESEARCH NEWS — Buffalo, NY/November 15, 2023 — Design optimization finds the optimal material layout of a given structure by rearranging the material within the domain. It is classified into size, shape, and topology optimization based on the problem’s complexity. Topology optimization plays a significant role in achieving safer and more efficient designs for the aerospace and automotive industries. Different aircraft and automobile wing structures can be obtained with next-generation additive manufacturing technologies, departing from traditional rib-spar wing constructions. However, traditional topology optimization methods struggle when applied to aerospace structures because of the large scale of the design problems.

This article will discuss the topology optimization capabilities of the Quantum-Inspired Design Optimization (QIDO) Solver, its advantages over classical methods, and the future roadmap for maximizing efficiency in advanced aircraft and automobile airfoil structures.

Figure 1: Schematic of an airfoil section, with the internal domain as design space, the outer skin as non-design space, and the wing supports fixed.

Current Bottlenecks with Classical Topology Optimization Techniques in Engineering Optimization:

The shape and weight of an airfoil play a significant role in aircraft performance and safety. Topology optimization has become a priority within the aerospace and automotive industries to achieve safer and more efficient designs while reducing weight. However, computational challenges arise when dealing with high-aspect-ratio wings, which require conventional density-based topology optimization methods to discretize the problem domain uniformly.

Figure 1 shows the design space in blue, which must be discretized in the above optimization method. The complex geometry and boundary conditions turn the problem into a large-scale design optimization problem. Similarly, the high-aspect-ratio domains of wings in aircraft or automobiles create more complex and harder-to-model design spaces [1, 2]. This limits the effectiveness of traditional optimization algorithms on classical computers and calls for a more advanced solution.

Another limitation of the classical approach is that it converges to local minima instead of the global minimum, meaning that more efficient designs remain unexplored within the design process [3]. Additionally, classical optimization methods require more iterations to reach optimal results for a given airfoil design, which demands more computing resources, such as GPUs and CPUs. Classical algorithms on classical computers need to use these resources more efficiently while still delivering accurate topology optimization.

Figure 2: The figure illustrates how, in the real world, the origin of aerodynamic forces on an airfoil section arises from the combined effects of pressure distributions and shear stress on the boundary layer.

Quantum-Inspired Approach in Design Optimization:

The Quantum-Inspired approach utilizes the principles of quantum computing, such as interference, superposition, and entanglement, to process information. By emulating these principles, the Quantum-Inspired approach allows for simultaneous searching of a larger solution space, leading to better-optimized results over classical solutions, faster convergence speed, and minimizing the usage of computing resources.
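The article does not disclose BosonQ Psi's algorithm, so as a stand-in, the sketch below contrasts a greedy local search, which stalls in a local minimum of a rugged landscape (the classical failure mode noted in this article), with an annealing-style global search that accepts occasional uphill moves:

```python
import math
import random

# Illustrative contrast (not BosonQ Psi's actual algorithm): greedy local
# search stalls in a local minimum of a rugged 1-D landscape, while an
# annealing-style global search (a classical stand-in for a
# quantum-inspired optimizer) accepts occasional uphill moves and
# reaches the global basin.

def landscape(x):
    # Smooth bowl plus ripples: global minimum near x = 2.16
    return (x - 2.0) ** 2 + 1.5 * math.sin(8.0 * x)

def greedy(x, step=0.01, iters=5000):
    for _ in range(iters):
        for nxt in (x - step, x + step):
            if landscape(nxt) < landscape(x):
                x = nxt
    return x

def anneal(x, iters=20000, seed=3):
    random.seed(seed)
    best = x
    for i in range(iters):
        temp = max(1e-3, 2.0 * (1.0 - i / iters))     # cooling schedule
        nxt = x + random.gauss(0.0, 0.3)
        downhill = landscape(nxt) < landscape(x)
        if downhill or random.random() < math.exp(
                (landscape(x) - landscape(nxt)) / temp):
            x = nxt
            if landscape(x) < landscape(best):
                best = x
    return best

print(landscape(greedy(-2.0)), landscape(anneal(-2.0)))
```

Started from the same point, the greedy search gets trapped in a shallow ripple far from the global minimum, while the annealing-style search reaches the global basin: a toy version of the local-versus-global-minima distinction the article draws.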

BosonQ Psi’s QIDO Solver is a Quantum-Inspired Design Optimization solver that maximizes efficiency in design engineering. QIDO’s ability to search the global optima sets it apart from traditional optimization techniques, resulting in better airfoil designs with higher performance and efficiency. The QIDO solver also significantly reduces the number of iterations required to converge to the optimal design, saving substantial simulation time. Moreover, by harnessing the power of quantum algorithms, the QIDO Solver optimizes the design using fewer computing resources, enhancing the cost-effectiveness of the design optimization process.

In the context of volume minimization of airfoil structures, the QIDO solver brings a different optimization landscape than classical methods. The low volume fraction of aerospace and automobile structures and the considerations of slenderness, buckling, and strength contribute to the complexity of optimizing low-weight, high-performance airfoil designs. Through topology optimization, QIDO removes material where it is not structurally needed, meeting the demands for low-volume-fraction aerospace structures and increasing the efficiency of the component.

Traditional topology optimization problems are typically solved using finite element (FE) analysis, treating each element’s presence as a design variable and aiming to find the optimal distribution of elements in the design domain [4]. This approach formulates the problem with continuous design variables that take values from 0 to 1, which can produce optimal designs containing fictitious intermediate-density elements with no clear boundary for fabrication [4, 5].
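The continuous 0-to-1 design variables described above are commonly handled with the SIMP (Solid Isotropic Material with Penalization) interpolation. A minimal sketch of the generic textbook formulation, not BosonQ Psi's solver:

```python
# SIMP interpolation used in classical density-based topology optimization
# (a generic textbook formulation, not BosonQ Psi's QIDO solver).
# Each element's density rho in [0, 1] is a design variable; penalizing
# with exponent p > 1 makes intermediate "grey" densities structurally
# inefficient, pushing the optimizer toward crisp 0/1 layouts.

def simp_stiffness(rho, E0=1.0, Emin=1e-9, p=3.0):
    return Emin + rho ** p * (E0 - Emin)

# A half-dense element contributes only an eighth of full stiffness
# while still costing half of the material budget.
print(simp_stiffness(0.5), simp_stiffness(1.0))
```

The penalization is what discourages the fictitious intermediate-density elements mentioned above: grey material delivers poor stiffness per unit of material, so well-converged designs end up nearly black-and-white.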

Previous research has demonstrated that efficient topology optimization techniques can significantly enhance aircraft performance and safety. For example, Airbus’s method successfully reduced the weight of A380 components such as wingbox ribs by 10%, leading to increased stability, safety, and a 42% reduction in drag. These advancements in topology optimization have also led to cost reductions for aircraft manufacturing companies. However, for a middle-sized topology optimization problem on flexible wing structures, the number of design variables can reach approximately 70,000 to 100,000, making these problems incredibly complex for traditional optimization methods [2, 6].

With the QIDO (Quantum-Inspired Design Optimization) Solver from BosonQ Psi, topology optimization achieves highly optimized results for internal aircraft wing structures, improving efficiency and reducing manufacturing costs. The solver can handle complex design problems, such as minimization of the total weight of the structure, and finds global minima for obtaining optimal airfoil designs.

Figure 3: Optimal design of airfoil obtained using BQPhy’s QIDO solver

BQPhy’s topology optimization results for airfoil wings using QIDO have demonstrated remarkable outcomes. By treating the outer skin as a non-design domain, the weight of the airfoil structure can be reduced to 60% of its initial solid volume while maintaining its structural integrity (see Figure 3).

Conclusion:

QIDO presents a revolutionary approach to weight minimization in the design of airfoil structures. It harnesses the principles of quantum computing and integrates them into the optimization process. This nascent methodology enables engineers to reach global minima, significantly reduces the number of iterations required, and optimizes designs using fewer computing resources. These advancements improve efficiency, reduce manufacturing costs, and open the possibility of pushing the boundaries of performance and innovation in advanced aircraft and automobile airfoil structures. With QIDO, the goal of achieving safer, more efficient, and lighter designs comes within reach for companies in the aerospace and automotive industries.

List of references:

1. Zhu, Ji-Hong, Wei-Hong Zhang, and Liang Xia. “Topology optimization in aircraft and aerospace structures design.” Archives of computational methods in engineering 23 (2016): 595–622.

2. Félix, Luis, Alexandra A. Gomes, and Afzal Suleman. “Wing Topology Optimization with Self-Weight Loading.” World Congress on Structural and Multidisciplinary Optimization, May 19–24, 2013, Orlando, Florida, USA.

3. Stanford, Bret, and Peter Ifju. “Multi-objective topology optimization of wing skeletons for aeroelastic membrane structures.” International Journal of Micro Air Vehicles 1.1 (2009): 51–69.

4. Høghøj, Lukas C., et al. “Simultaneous shape and topology optimization of wings.” Structural and Multidisciplinary Optimization 66.5 (2023): 116.

5. Gomes, Pedro, and Rafael Palacios. “Aerodynamic-driven topology optimization of compliant airfoils.” Structural and Multidisciplinary Optimization 62 (2020): 2117–2130.

6. James, Kai. Aerostructural shape and topology optimization of aircraft wings. University of Toronto (Canada), 2012.

Alibaba Group donates quantum computing equipment to Zhejiang University.

The news comes after reports that DAMO Academy closed its quantum lab.

The move to halt quantum operations appears to have been abrupt, because the company was recruiting quantum experts just four months ago.

Alibaba Group’s DAMO Academy, the company’s deep tech research arm since its founding by former CEO Jack Ma in 2017, has chosen to contribute its quantum computing resources to the academic sphere, donating its laboratory and equipment to Zhejiang University, local Chinese sources are reporting.

Zhejiang University is home to a well-respected quantum information group that investigates several quantum computing approaches and architectures.

The move also suggests that Alibaba’s quantum efforts will not be absorbed by other units within the company, but will be completely scrapped.

According to the media outlet, Caixin, the decision aligns with Alibaba’s commitment to academic collaboration, providing Zhejiang University, along with other institutions, access to cutting-edge tools to continue quantum research.

The transition occurs shortly after layoffs were reported at the Quantum Lab, affecting over 30 employees amidst budget and profitability revisions.

The outlet reported that the closure of the lab was unexpected. The DAMO Academy had continued to recruit for quantum computing roles into July, suggesting the abruptness of the decision.

Alibaba’s move reflects a broader trend in the tech industry, particularly in deep tech, where commercial entities often partner with academic institutions to advance scientific research.

According to The Quantum Insider’s China’s Quantum Computing Market brief, Alibaba is a diverse tech conglomerate that has been active in quantum since 2015. The company’s Quantum Lab Academy teaches employees and students about the prospects of quantum computing. Alibaba’s Quantum Laboratory is a full-stack R&D service offering an 11-qubit quantum cloud platform. According to some reports, Alibaba invested about $15 billion into emerging technologies such as quantum.

PASQAL announced the launch of a $90 million quantum technology initiative over five years in Sherbrooke, Quebec.

The project includes quantum computer manufacturing and commercialization activities, as well as research and development.

Officials expect the creation of 53 jobs.

PRESS RELEASE — PASQAL, a leader in the development of neutral-atom quantum computers, announced the launch of a $90 million quantum technology initiative over five years in Sherbrooke, Quebec. The project aims to conduct manufacturing and commercialization activities for quantum computers, as well as research and development in collaboration with academic and industrial partners in quantum computing within DistriQ, a quantum innovation zone. The goal of this innovation zone is to establish Sherbrooke as an internationally renowned quantum hub. The Government of Quebec is providing a $15 million loan in connection with this investment project for the establishment of PASQAL SAS’s subsidiary in the quantum innovation zone, DistriQ, based in Sherbrooke. Moreover, the project is expected to create 53 permanent jobs over the course of five years.

Inauguration of Espace Quantique 1: A New Era for Quantum Computing

On November 24, during an official ceremony, the Premier of Quebec, François Legault, officially announced the opening of Espace Quantique 1 alongside Mr. Pierre Fitzgibbon, Minister of Economy, Innovation, and Energy, Minister responsible for Regional Economic Development, and Minister for the Metropolis and the Montreal Region. PASQAL CEO Georges-Olivier Reymond, Chief Technical Officer Loïc Henriet, and co-founders Christophe Jurczak and Nobel Prize laureate Alain Aspect were also present.

Strategic Collaboration between PASQAL and Investissement Québec

PASQAL will play a key role in this initiative, not only as a major partner of DistriQ within Espace Quantique 1, but also in the production, development of technological laboratories, training, and funding for new ventures in the quantum field. The initiative stands as one of the most ambitious endeavors in North America within the field of quantum computing.

An Ambitious Initiative for the Future of Quantum in North America

PASQAL’s presence in Sherbrooke represents a major step in the evolution of quantum computing. “Thanks to this unprecedented collaboration between the private and public sectors, we are creating an environment leading to major technological advancements, especially in terms of sustainable development,” emphasizes Georges-Olivier Reymond, CEO of PASQAL. “We aim to actively participate in the creation of a dynamic ecosystem that will serve as a catalyst for innovation in the quantum industry, while attracting talent and companies from all over the world.”

Investments in Infrastructure and Innovation: The Factory and Espace Quantique 1

In 2024, PASQAL will open a facility at the heart of DistriQ, within Espace Quantique 1, aimed at manufacturing neutral-atom quantum computers and the next generation of machines. Espace Quantique 1 will also provide a collaborative space of nearly 5,000 square meters dedicated to quantum innovation. Equipped with advanced quantum computers, it will be utilized, among other purposes, by PASQAL as an R&D center, for prototype testing, and for business activities in Canada.

Training and Talent Attraction: PASQAL’s Commitment to Education

DistriQ also focuses on training talent. In this context, PASQAL announced a contribution of $500,000 to the creation of a research chair within the Department of Electrical and Computer Engineering at the University of Sherbrooke, which will also benefit from federal and/or local grants.

Support for Startups: The DistriQ Ecosystem and Its Partners

Quantonation and the Quebec fund Quantacet will collaborate to fund QV Studio, which will support the transition to commercial quantum applications, creating a unique ecosystem within DistriQ for sector startups. This fund aims to invest in around fifteen Quebec-based or foreign companies, especially at the pre-seed or seed stage, that are active within the DistriQ innovation zone. It will foster the development of a strong and internationally competitive Quebec ecosystem in this future-oriented sector.

Christophe Jurczak, CEO of Quantonation and co-founder of PASQAL, states: “Espace Quantique 1 will become a leading center of innovation, facilitating the transition of quantum startups from concept to commercialization and forming a dynamic community around quantum technologies.”

Calculations show that there are fundamental limits to quantum computing – namely the quality of the clock used.

Scientists showed that since no clock has an infinite amount of energy available, it can never have perfect resolution and perfect precision at the same time.

Researchers from the Atomic Institute at the Vienna University of Technology led the study.

Image: Vienna University of Technology

PRESS RELEASE — There are different ideas about how quantum computers could be built. But they all have one thing in common: you use a quantum physical system – for example individual atoms – and change their state by exposing them to very specific forces for a specific time. However, this means that in order to be able to rely on the quantum computing operation delivering the correct result, you need a clock that is as precise as possible.

But here you run into problems: perfect time measurement is impossible. Every clock has two fundamental properties: a certain precision and a certain time resolution. The time resolution indicates how small the time intervals are that can be measured – i.e. how quickly the clock ticks. Precision tells you how much inaccuracy you have to expect with every single tick.

The research team was able to show that since no clock has an infinite amount of energy available (or generates an infinite amount of entropy), it can never have perfect resolution and perfect precision at the same time. This sets fundamental limits to the possibilities of quantum computers.

Quantum calculation steps are like rotations

In our classical world, perfect arithmetic operations are not a problem. For example, you can use an abacus in which wooden beads are threaded onto a rod and pushed back and forth. The beads have clear states: each one sits in a very specific place, and if you do nothing, it stays exactly where it is. Whether you move a bead quickly or slowly does not affect the result. But in quantum physics it is more complicated.

“Mathematically speaking, changing a quantum state in a quantum computer corresponds to a rotation in higher dimensions,” says Jake Xuereb from the Atomic Institute at the Vienna University of Technology in the team of Marcus Huber and first author of the first paper. “In order to achieve the desired state in the end, the rotation must be applied for a very specific period of time. Otherwise you turn the state either too short or too far.”

Entropy: Time makes everything more and more messy

Marcus Huber and his team investigated in general which laws must always apply to every conceivable clock. “Time measurement always has to do with entropy,” explains Marcus Huber. In every closed physical system, entropy increases and it becomes more and more disordered. It is precisely this development that determines the direction of time: the future is where the entropy is higher, the past is where the entropy was even lower.

As can be shown, every measurement of time is inevitably associated with an increase in entropy: a clock, for example, needs a battery, the energy of which is ultimately converted into frictional heat and audible ticking via the clock’s mechanics – a process in which the fairly ordered state of the battery is converted into a rather disordered state of heat radiation and sound.

On this basis, the research team was able to create a mathematical model that basically every conceivable clock must obey. “For a given increase in entropy, there is a tradeoff between time resolution and precision,” says Florian Meier, first author of the second paper. “That means: Either the clock works quickly or it works precisely – both are not possible at the same time.”

Limits for quantum computers

This realization now brings with it a natural limit for quantum computers: the resolution and precision that can be achieved with clocks limits the speed and reliability that can be achieved with quantum computers. “It’s not a problem at the moment,” says Marcus Huber. “Currently, the accuracy of quantum computers is still limited by other factors, for example the precision of the components used or electromagnetic fields. But our calculations also show that today we are not far from the regime in which the fundamental limits of time measurement play the decisive role.”

Therefore, if the technology of quantum information processing is further improved, one will inevitably have to contend with the problem of non-optimal time measurement. But who knows: Maybe this is exactly how we can learn something interesting about the quantum world.

SQE announced it will collaborate with Quantum Blockchains.

The partnership leverages SQE’s expertise in quantum security technologies with Quantum Blockchains’ specialized knowledge of blockchain security and advancing quantum cryptography.

Dr. Mirek Sopek, CEO of Quantum Blockchains, will also join SQE as a Scientific Advisor.

PRESS RELEASE — SQE, a revolutionary, quantum-secure blockchain platform powered by patent-pending technology, is pleased to announce its collaboration with Quantum Blockchains, an innovative European startup dedicated to enhancing blockchain security and advancing quantum cryptography.

The companies aim to leverage SQE’s expertise in quantum security technologies powered by Simulated Quantum Entanglement and Quantum Blockchains’ specialized knowledge of systems based on Quantum Key Distribution, Quantum Random Number Generators and Post-Quantum Cryptography to explore opportunities to further develop their respective technologies. Additionally, Dr. Mirek Sopek, CEO of Quantum Blockchains, will join SQE as a Scientific Advisor.

“Dr. Sopek is a recognized expert in quantum blockchain, quantum security, quantum key distribution and an authority in quantum computing. His knowledge will be invaluable in standardizing our technology to NIST standards, as well as in further developing our state-of-the-art platform,” said Hamid Pishdadian, SQE’s CEO and founder.

“SQE Holdings, led by renowned American inventor Hamid Pishdadian, holder of numerous United States Patents, is currently pioneering the development of a visionary blockchain technology based on simulated quantum entanglement. In a significant collaboration, Quantum Blockchains, our startup, sees an invaluable opportunity to rigorously test our methodology, which relies on Quantum Key Distribution (QKD), Post-Quantum Cryptography (PQC), and Quantum Random Number Generation (QRNG) technologies. This partnership allows us to benchmark our approach against SQE’s simulated entanglement technology,” said Dr. Mirek Sopek, CEO and founder of Quantum Blockchains.

The collaboration between these two companies and the shared strength of their technologies creates incredible innovation potential in the development of a quantum-secured blockchain system. SQE and Quantum Blockchains are excited to advance their cooperative efforts as they explore and develop these novel technologies.

Q-CTRL announced that its Q-CTRL Embedded software has been integrated as an option with IBM Quantum’s Pay-As-You-Go Plan.

The integration aims to provide user-friendly functionality to address unreliable results on hardware.

Q-CTRL’s software automatically addresses the problem of noise and hardware error.

PRESS RELEASE — Q-CTRL, a global leader in developing useful quantum technologies through quantum control infrastructure software, today announced that its Q-CTRL Embedded software has been integrated as an option with IBM Quantum’s Pay-As-You-Go Plan to deliver advancements in quantum computing utility and performance. This integration represents the first time a third-party independent software vendor’s technology solution will be available for users to select in the IBM Quantum Pay-As-You-Go Plan.

The integration aims to provide user-friendly functionality to address the primary challenge facing quantum computing end-users: Unreliable results from algorithms run on today’s hardware.

To get the most out of near-term quantum computers, you need to be an expert in an array of technical specializations – algorithms, compilers, error suppression strategies, and error mitigation – and without attention to each of these, it is difficult to get reliable results. The combination of Q-CTRL technology and IBM Quantum services reduces this burden, making it simpler to get useful results from real hardware by automatically addressing the problem of noise and hardware error.

Companies and end-users are seeking streamlined ways to integrate useful quantum computing into their workflows and to better leverage their existing IT expertise. Q-CTRL’s state-of-the-art performance-management infrastructure software, Q-CTRL Embedded, delivers these benefits to users and will now be available as an option within the IBM Quantum Pay-As-You-Go Plan.

Now, any IBM Quantum Pay-As-You-Go Plan user has the option to utilize Q-CTRL’s advanced technology using a single command within their Qiskit environment. And in great news for the community, accessing Q-CTRL’s performance-management software incurs no additional costs to the IBM Quantum Pay-As-You-Go Plan.
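As a hedged sketch of what this "single command" looks like in practice: at the time of writing, the qiskit-ibm-runtime package exposes a `channel_strategy` option on `QiskitRuntimeService` for this integration, but the exact parameter names and account setup should be confirmed against current IBM Quantum documentation. The helper below only assembles the keyword arguments; creating the actual service requires an IBM Cloud API key and the qiskit-ibm-runtime package.

```python
# Illustrative sketch of opting in to Q-CTRL performance management on the
# IBM Quantum Pay-As-You-Go Plan. The "channel_strategy" keyword reflects
# the qiskit-ibm-runtime API at the time of writing -- verify against the
# current documentation before use.

def runtime_service_options(token: str) -> dict:
    """Keyword arguments for QiskitRuntimeService with Q-CTRL enabled."""
    return {
        "channel": "ibm_cloud",        # Pay-As-You-Go runs via IBM Cloud
        "token": token,                # your IBM Cloud API key
        "channel_strategy": "q-ctrl",  # the single switch enabling Q-CTRL
    }

# With qiskit-ibm-runtime installed and valid credentials, the service
# would then be created as:
#   from qiskit_ibm_runtime import QiskitRuntimeService
#   service = QiskitRuntimeService(**runtime_service_options(my_api_key))
print(runtime_service_options("MY_API_KEY")["channel_strategy"])
```

Once the service is created this way, jobs submitted through it are routed through Q-CTRL's error-suppression pipeline with no further configuration, which is the "configuration-free" behavior the release describes.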

“Since we joined the IBM Quantum Network in 2018, we’ve been building the world’s most advanced infrastructure software for performance management in quantum computing,” said Q-CTRL CEO and Founder Michael J. Biercuk. “IBM has built a world-class quantum computing platform with the flexibility needed for experts like Q-CTRL to demonstrate new software able to dramatically improve the success of real quantum algorithms—detailed tests on a suite of benchmarking algorithms showed benefits up to thousands of times. We’re very excited to now bring these tools to the exceptional ecosystem of researchers and businesses building their quantum workflows on IBM hardware.”

Q-CTRL Embedded delivers enhancements in computational accuracy and efficiency through a simple, configuration-free setting. When the performance management option is selected, a fully configured autonomous toolchain is triggered in the background to suppress errors.

Based on recently peer-reviewed research on this topic and new tests on utility-scale quantum systems, benefits can reach up to:

10X increase in the complexity of quantum algorithms users can run (measured through circuit depth), up to intrinsic hardware limits;

100X cost reduction relative to alternative research-grade error-reduction strategies, achieved by reducing the number of experimental “shots” required to suppress errors;

>1,000X improvement in the success of quantum algorithms widely used in the field.

These functionalities, in combination with the IBM Quantum development roadmap, aim to accelerate the path toward quantum advantage and allow end users from research to enterprise to gain strategic advantages they’ve been seeking from their quantum applications.

“At IBM, our goal is to give our users the ability to run valuable quantum workloads beyond what can be simulated on classical computers. A core requirement to this is reducing noise. The noise suppression provided through Q-CTRL’s performance management makes exploring useful quantum circuits even easier. I very much look forward to what our users will be able to do with this newly added error-suppression technology,” said Jay Gambetta, IBM Fellow and Vice President, IBM Quantum.

QuantumDiamonds is a Munich, Germany-based company formed in 2022 by graduates of the Technical University of Munich. The first €3 million of the funding was led by IQ Capital and Earlybird with additional participation from Onsight Ventures, First Momentum, Creator Fund, UnternehmerTUM, and various angel investors from the semiconductor industry. In [...]

Oxford Quantum Circuits is a UK-based quantum processor manufacturer, formed in 2017, that uses superconducting technology. The $100 million Series B investment was led by SBI Investment with additional participation from previous investors Oxford Science Enterprises (OSE), University of Tokyo Edge Capital Partners (UTEC), Lansdowne Partners, and OTIF, acted by manager Oxford [...]

As part of a multi-client study on the technology used in the middle part of the quantum computing stack, GQI talked to Simon Fried from Classiq, who shared his view on how Classiq is democratizing quantum via their platform: Quantum Computing Midstack is an essential bridge between hardware and software, QPUs, and end-user requirements and plays a [...]

As part of a multi-client study on the technology used in the middle part of the quantum computing stack, GQI talked to Asif Sinay, CEO of QEDMA, who shared his vision on why they started the company and what their goals are to foster growth in the Quantum Computing ecosystem via their unique positioning in [...]

As part of a multi-client study on the technology used in the middle part of the quantum computing stack, GQI interviewed Vishal Chatrath, CEO of QuantrolOx about the role of the Quantum Computing Midstack in the quantum ecosystem and how his company, QuantrolOx plans to address the most critical issues in the sector. Here are his comments: At QuantrolOx, [...]