
What Is Quantum Advantage? The Moment Extremely Powerful Quantum Computers Will Arrive

Quantum advantage is the milestone the field of quantum computing is fervently working toward, when a quantum computer can solve problems that are beyond the reach of the most powerful non-quantum, or classical, computers.

Quantum refers to the scale of atoms and molecules where the laws of physics as we experience them break down and a different, counterintuitive set of laws apply. Quantum computers take advantage of these strange behaviors to solve problems.

There are some types of problems that are impractical for classical computers to solve, such as cracking state-of-the-art encryption algorithms. Research in recent decades has shown that quantum computers have the potential to solve some of these problems. If a quantum computer can be built that actually does solve one of these problems, it will have demonstrated quantum advantage.

I am a physicist who studies quantum information processing and the control of quantum systems. I believe that this frontier of scientific and technological innovation not only promises groundbreaking advances in computation but also represents a broader surge in quantum technology, including significant advancements in quantum cryptography and quantum sensing.

The Source of Quantum Computing’s Power

Central to quantum computing is the quantum bit, or qubit. Unlike classical bits, which can only be in states of 0 or 1, a qubit can be in any state that is some combination of 0 and 1. This state of neither just 1 nor just 0 is known as a quantum superposition. With every additional qubit, the number of states that can be represented by the qubits doubles.
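To make that doubling concrete, here is a minimal numpy sketch (my own illustration, not part of the original article): a single qubit is described by two complex amplitudes whose squared magnitudes give measurement probabilities, and the amplitude vector for n qubits has 2^n entries.

```python
# Hypothetical illustration (not from the article): the state of n qubits is a
# vector of 2**n complex amplitudes, so each extra qubit doubles the storage a
# classical simulator needs.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # classical-like |0>
ket1 = np.array([0, 1], dtype=complex)   # classical-like |1>
plus = (ket0 + ket1) / np.sqrt(2)        # equal superposition of 0 and 1

print(np.abs(plus) ** 2)                 # [0.5 0.5]: 50/50 measurement odds

for n in range(1, 6):
    state = ket0
    for _ in range(n - 1):
        state = np.kron(state, ket0)     # add one more qubit
    print(n, "qubits ->", state.size, "amplitudes")
```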

This property is often mistaken for the source of the power of quantum computing. Instead, it comes down to an intricate interplay of superposition, interference, and entanglement.

Interference involves manipulating qubits so that their states combine constructively during computations to amplify correct solutions and destructively to suppress the wrong answers. Constructive interference is what happens when the peaks of two waves—like sound waves or ocean waves—combine to create a higher peak. Destructive interference is what happens when a wave peak and a wave trough combine and cancel each other out. Quantum algorithms, which are few and difficult to devise, set up a sequence of interference patterns that yield the correct answer to a problem.
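As a rough illustration of interference (again my own sketch, not drawn from the article), applying the standard Hadamard gate twice to a qubit first splits it into a superposition and then recombines the two computational paths: the amplitudes leading to the 0 outcome add constructively, while those leading to the 1 outcome cancel.

```python
# Toy interference sketch (my own): split a qubit into two paths, then
# recombine them; phases make one outcome certain and the other impossible.
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
state = np.array([1, 0], dtype=complex)                       # start in |0>

state = H @ state        # split into an equal superposition
state = H @ state        # recombine: the two paths interfere

probs = np.abs(state) ** 2
print(probs)             # ~[1. 0.]: the "0" paths reinforce, the "1" paths cancel
```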

Entanglement establishes a uniquely quantum correlation between qubits: The state of one cannot be described independently of the others, no matter how far apart the qubits are. This is what Albert Einstein famously dismissed as “spooky action at a distance.” Entanglement’s collective behavior, orchestrated through a quantum computer, enables computational speed-ups that are beyond the reach of classical computers.
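A hedged numpy sketch of what that correlation looks like (mine, not the author's): the Bell state (|00> + |11>)/sqrt(2) only ever yields agreeing measurement results, and it cannot be written as a product of two independent single-qubit states.

```python
# My own minimal sketch of entanglement: a two-qubit Bell state with perfectly
# correlated outcomes that cannot be factored into two single-qubit states.
import numpy as np

rng = np.random.default_rng(0)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

probs = np.abs(bell) ** 2                                   # outcomes 00, 01, 10, 11
outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)   # only "00" and "11" ever appear: the two qubits always agree

rank = np.linalg.matrix_rank(bell.reshape(2, 2))
print(rank)       # 2: a non-entangled (product) state would have rank 1
```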

Applications of Quantum Computing

Quantum computing has a range of potential uses where it can outperform classical computers. In cryptography, quantum computers pose both an opportunity and a challenge. Most famously, they have the potential to decipher current encryption algorithms, such as the widely used RSA scheme.
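The threat to RSA comes from Shor's algorithm, whose quantum speed-up lies in finding the period of modular exponentiation; the rest is classical arithmetic. The toy sketch below is my own (the period is found by brute force rather than by a quantum computer) and only shows the classical post-processing step that turns a period into factors.

```python
# Hedged illustration (mine, not the article's): the quantum part of Shor's
# algorithm finds the period r of a**x mod N quickly. Here r is found by brute
# force just to show how classical post-processing converts it into factors.
from math import gcd

def order(a, N):
    """Smallest r > 0 with a**r % N == 1 (the step quantum computers speed up)."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7
r = order(a, N)                      # r = 4 for this toy example
if r % 2 == 0:
    p = gcd(pow(a, r // 2) - 1, N)
    q = gcd(pow(a, r // 2) + 1, N)
    print(N, "=", p, "*", q)         # 15 = 3 * 5
```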

One consequence of this is that today’s encryption protocols need to be reengineered to be resistant to future quantum attacks. This recognition has led to the burgeoning field of post-quantum cryptography. After a long process, the National Institute of Standards and Technology recently selected four quantum-resistant algorithms and has begun the process of readying them so that organizations around the world can use them in their encryption technology.

In addition, quantum computing can dramatically speed up quantum simulation: the ability to predict the outcome of experiments operating in the quantum realm. Famed physicist Richard Feynman envisioned this possibility more than 40 years ago. Quantum simulation offers the potential for considerable advancements in chemistry and materials science, aiding in areas such as the intricate modeling of molecular structures for drug discovery and enabling the discovery or creation of materials with novel properties.

Another use of quantum information technology is quantum sensing: detecting and measuring physical properties like electromagnetic energy, gravity, pressure, and temperature with greater sensitivity and precision than non-quantum instruments. Quantum sensing has myriad applications in fields such as environmental monitoring, geological exploration, medical imaging, and surveillance.

Initiatives such as the development of a quantum internet that interconnects quantum computers are crucial steps toward bridging the quantum and classical computing worlds. This network could be secured using quantum cryptographic protocols such as quantum key distribution, which enables ultra-secure communication channels that are protected against computational attacks—including those using quantum computers.
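As a very rough sketch of the idea behind quantum key distribution, here is a stripped-down, purely classical simulation in the spirit of the BB84 protocol (my own simplification; it omits the eavesdropping check that gives real QKD its security guarantee).

```python
# A toy BB84-style sketch (my own, heavily simplified): sender and receiver
# choose random bases and keep only the bits where their bases match.
import numpy as np

rng = np.random.default_rng(1)
n = 32

alice_bits  = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)          # 0 = rectilinear, 1 = diagonal
bob_bases   = rng.integers(0, 2, n)

# With no eavesdropper, Bob reads Alice's bit whenever his basis matches;
# otherwise his measurement result is random.
bob_bits = np.where(bob_bases == alice_bases,
                    alice_bits,
                    rng.integers(0, 2, n))

keep = alice_bases == bob_bases              # bases are announced publicly
shared_key = alice_bits[keep]
print("sifted key:", shared_key,
      "| matches Bob:", np.array_equal(shared_key, bob_bits[keep]))
```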

Despite a growing application suite for quantum computing, developing new algorithms that make full use of the quantum advantage—in particular in machine learning—remains a critical area of ongoing research.

a metal apparatus with green laser light in the background
A prototype quantum sensor developed by MIT researchers can detect any frequency of electromagnetic waves. Image Credit: Guoqing Wang, CC BY-NC-ND

Staying Coherent and Overcoming Errors

The quantum computing field faces significant hurdles in hardware and software development. Quantum computers are highly sensitive to any unintentional interactions with their environments. This leads to the phenomenon of decoherence, where qubits rapidly degrade to the 0 or 1 states of classical bits.

Building large-scale quantum computing systems capable of delivering on the promise of quantum speed-ups requires overcoming decoherence. The key is developing effective methods of suppressing and correcting quantum errors, an area my own research is focused on.
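The flavor of such error correction can be conveyed with a classical toy version of the three-qubit bit-flip repetition code (my own sketch, not the author's research): encode one logical bit redundantly and decode by majority vote. Real quantum codes cannot simply copy data and instead infer errors from parity (syndrome) measurements, but the suppression of the error rate from p to roughly 3p² is the same basic effect.

```python
# Classical toy of the idea behind the 3-qubit bit-flip code (my own sketch):
# one logical bit is encoded as three copies, independent flips hit each copy
# with probability p, and majority voting fails only when 2+ copies flip.
import numpy as np

rng = np.random.default_rng(2)
p, trials = 0.05, 100_000

flips = rng.random((trials, 3)) < p                 # which of the 3 copies flip
decoded_wrong = flips.sum(axis=1) >= 2              # majority vote fails on 2+ flips

print("raw error rate:    ", p)
print("encoded error rate:", decoded_wrong.mean())  # ~3p^2 - 2p^3, about 0.007
```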

In navigating these challenges, numerous quantum hardware and software startups have emerged alongside well-established technology industry players like Google and IBM. This industry interest, combined with significant investment from governments worldwide, underscores a collective recognition of quantum technology’s transformative potential. These initiatives foster a rich ecosystem where academia and industry collaborate, accelerating progress in the field.

Quantum Advantage Coming Into View

Quantum computing may one day be as disruptive as the arrival of generative AI. Currently, the development of quantum computing technology is at a crucial juncture. On the one hand, the field has already shown early signs of having achieved a narrowly specialized quantum advantage. Researchers at Google and later a team of researchers in China demonstrated quantum advantage for generating a list of random numbers with certain properties. My research team demonstrated a quantum speed-up for a random number guessing game.

On the other hand, there is a tangible risk of entering a “quantum winter,” a period of reduced investment if practical results fail to materialize in the near term.

While the technology industry is working to deliver quantum advantage in products and services in the near term, academic research remains focused on investigating the fundamental principles underpinning this new science and technology. This ongoing basic research, fueled by enthusiastic cadres of new and bright students of the type I encounter almost every day, ensures that the field will continue to progress.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image Credit: xx / xx

What a '2D' quantum superfluid feels like to the touch

2 November 2023 at 17:51
Researchers have discovered how superfluid helium-3 (3He) would feel if you could put your hand into it. The interface between the exotic world of quantum physics and the classical physics of human experience is one of the major open problems in modern physics, and nobody had been able to answer this question during the 100-year history of quantum physics.

Optical-fiber based single-photon light source at room temperature for next-generation quantum processing

2 November 2023 at 17:51
Single-photon emitters quantum mechanically connect quantum bits (or qubits) between nodes in quantum networks. They are typically made by embedding rare-earth elements in optical fibers at extremely low temperatures. Now, researchers have developed an ytterbium-doped optical fiber that works at room temperature. By avoiding the need for expensive cooling solutions, the proposed method offers a cost-effective platform for photonic quantum applications.

A Revolution in Computer Graphics Is Bringing 3D Reality Capture to the Masses

Destroying cultural heritage sites is a common method armed invaders use to deprive a community of its distinct identity. It was no surprise, then, as Russian troops swept into Ukraine in February of 2022, that historians and cultural heritage specialists braced for the coming destruction. So far in the Russia-Ukraine War, UNESCO has confirmed damage to hundreds of religious and historical buildings and dozens of public monuments, libraries, and museums.

While new technologies like low-cost drones, 3D printing, and private satellite internet may be creating a distinctly 21st century battlefield unfamiliar to conventional armies, another set of technologies is creating new possibilities for citizen archivists off the frontlines to preserve Ukrainian heritage sites.

Backup Ukraine, a collaborative project between the Danish UNESCO National Commission and Polycam, a 3D creation tool, enables anyone equipped with only a phone to scan and capture high-quality, detailed, and photorealistic 3D models of heritage sites, something only possible with expensive and burdensome equipment just a few years ago.

Backup Ukraine is a notable expression of the stunning speed with which 3D capture and graphics technologies are progressing, according to Bilawal Sidhu, a technologist, angel investor, and former Google product manager who worked on 3D maps and AR/VR.

“Reality capture technologies are on a staggering exponential curve of democratization,” he explained to me in an interview for Singularity Hub.

According to Sidhu, generating 3D assets had been possible, but only with expensive tools like DSLR cameras, lidar scanners, and pricey software licenses. As an example, he cited the work of CyArk, a non-profit founded two decades ago with the aim of using professional grade 3D capture technology to preserve cultural heritage around the world.

“What is insane, and what has changed, is today I can do all of that with the iPhone in your pocket,” he says.

In our discussion, Sidhu laid out three distinct yet interrelated technology trends driving this progress. First is a drop in the cost of the cameras and sensors that can capture an object or space. Second is a cascade of new techniques that use artificial intelligence to construct finished 3D assets. And third is the proliferation of computing power, largely driven by GPUs, capable of rendering graphics-intensive objects on devices widely available to consumers.

Lidar scanners are an example of the price-performance improvement in sensors. First popularized as the bulky spinning sensors on top of autonomous vehicles, and priced in the tens of thousands of dollars, lidar made its consumer-tech debut on the iPhone 12 Pro and Pro Max in 2020. The ability to scan a space in the same way driverless cars see the world meant that suddenly anyone could quickly and cheaply generate detailed 3D assets. This, however, was still only available to the wealthiest Apple customers.

Day 254: hiking in Pinnacles National Park and scanning my daughter as we crossed a small dry creek.

Captured with the iPhone 12 Pro + @Scenario3d. I can’t wait to see these 3D memories 10 years from now.

On @Sketchfab: https://t.co/mvxtOMhzS5

#1scanaday #3Dscanning #XR pic.twitter.com/9DX1Ltnmh8

— Emm (@emmanuel_2m) September 14, 2021

One of the industry’s most consequential turning points occurred that same year when researchers at Google introduced neural radiance fields, commonly referred to as NeRFs.

This approach uses machine learning to construct a credible 3D model of an object or space from 2D pictures or video. The neural network “hallucinates” how a full 3D scene would appear, according to Sidhu. It’s a solution to “view synthesis,” a computer graphics challenge seeking to allow someone to see a space from any point of view from only a few source images.

“So that thing came out and everyone realized we’ve now got state-of-the-art view synthesis that works brilliantly for all the stuff photogrammetry has had a hard time with like transparency, translucency, and reflectivity. This is kind of crazy,” he adds.
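Under the hood, NeRF-style view synthesis comes down to volume rendering: sample points along each camera ray, ask a trained network for a color and density at every sample, and alpha-composite the results into a pixel. The sketch below is a heavily simplified stand-in of my own; the `field` function here is hypothetical and merely plays the role of the trained network.

```python
# Rough sketch of the volume-rendering step behind NeRF-style view synthesis
# (my simplification): a hypothetical `field` stands in for the learned MLP,
# and samples along one camera ray are alpha-composited into a pixel color.
import numpy as np

def field(points):
    """Stand-in for the trained network: returns (rgb, density) per sample."""
    density = np.exp(-np.linalg.norm(points - np.array([0, 0, 2.0]), axis=1))
    rgb = np.tile([0.8, 0.3, 0.2], (len(points), 1))
    return rgb, density

origin, direction = np.zeros(3), np.array([0, 0, 1.0])
t = np.linspace(0.5, 4.0, 64)                        # sample depths along the ray
points = origin + t[:, None] * direction

rgb, sigma = field(points)
delta = np.diff(t, append=t[-1] + (t[1] - t[0]))     # spacing between samples
alpha = 1 - np.exp(-sigma * delta)                   # opacity of each segment
trans = np.cumprod(np.concatenate([[1.0], 1 - alpha]))[:-1]   # transmittance
pixel = np.sum((trans * alpha)[:, None] * rgb, axis=0)
print("rendered pixel color:", pixel)
```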

The computer vision community channeled their excitement into commercial applications. At Google, Sidhu and his team explored using the technology for Immersive View, a 3D version of Google Maps. For the average user, the spread of consumer-friendly applications like Luma AI and others meant that anyone with just a smartphone camera could make photorealistic 3D assets. The creation of high-quality 3D content was no longer limited to Apple’s lidar-elite.

Now, another potentially even more promising method of solving view synthesis is earning attention rivaling that early NeRF excitement. Gaussian splatting is a rendering technique that mimics the way triangles are used for traditional 3D assets, but instead of triangles, it’s a “splat” of color expressed through a mathematical function known as a gaussian. As more gaussians are layered together, a highly detailed and textured 3D asset becomes visible. The speed of adoption for splatting is stunning to watch.

It’s only been a few months but demos are flooding X, and both Luma AI and Polycam are offering tools to generate gaussian splats. Other developers are already working on ways of integrating them into traditional game engines like Unity and Unreal. Splats are also gaining attention from the traditional computer graphics industry since their rendering speed is faster than NeRFs, and they can be edited in ways already familiar to 3D artists. (NeRFs don’t allow this given they’re generated by an indecipherable neural net.)

For a great explanation of how gaussian splatting works and why it’s generating buzz, see this video from Sidhu.
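For a sense of the mechanics, here is a deliberately tiny 2D sketch of the splatting idea (my own toy, not production code): each splat is a colored Gaussian blob, and pixels accumulate the blobs' weighted colors. Real Gaussian splatting works with millions of anisotropic 3D Gaussians, projects them into the camera, and blends them in depth order.

```python
# Toy 2D "splatting" sketch (my own): each splat is a colored Gaussian blob,
# and the image accumulates the blobs' weighted colors per pixel. Real Gaussian
# splatting uses anisotropic 3D Gaussians, projection, and depth-ordered
# alpha blending; this only conveys the flavor.
import numpy as np

H, W = 64, 64
ys, xs = np.mgrid[0:H, 0:W]

# (center_x, center_y, radius, r, g, b) for a few hypothetical splats
splats = [(20, 20, 6, 1.0, 0.2, 0.2),
          (40, 30, 10, 0.2, 0.8, 0.3),
          (32, 48, 8, 0.2, 0.3, 1.0)]

image = np.zeros((H, W, 3))
weight = np.zeros((H, W))
for cx, cy, s, r, g, b in splats:
    w = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * s ** 2))  # isotropic blob
    image += w[..., None] * np.array([r, g, b])
    weight += w

image /= np.maximum(weight, 1e-6)[..., None]    # normalize where splats overlap
print(image.shape, image.min(), image.max())
```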

Regardless of the details, for consumers, we are decidedly in a moment where a phone can generate Hollywood-caliber 3D assets that not long ago only well-equipped production teams could produce.

But why does 3D creation even matter at all?

To appreciate the shift toward 3D content, it’s worth noting the technology landscape is orienting toward a future of “spatial computing.” While overused terms like the metaverse might draw eye rolls, the underlying spirit is a recognition that 3D environments, like those used in video games, virtual worlds, and digital twins have a big role to play in our future. 3D assets like the ones produced by NeRFs and splatting are poised to become the content we’ll engage with in the future.

Within this context, a large-scale ambition is the hope for a real-time 3D map of the world. While tools for generating static 3D maps have been available, the challenge remains finding ways of keeping those maps current with an ever-changing world.

“There’s the building of the model of the world, and then there’s maintaining that model of the world. With these methods we’re talking about, I think we might finally have the tech to solve the ‘maintaining the model’ problem through crowdsourcing,” says Sidhu.

Projects like Google’s Immersive View are good early examples of the consumer implications of this. While he wouldn’t speculate when it might eventually be possible, Sidhu agreed that at some point, the technology will exist which would allow a user in VR to walk around anywhere on Earth with a real-time, immersive experience of what is happening there. This type of technology will also spill into efforts in avatar-based “teleportation,” remote meetings, and other social gatherings.

Another reason to be excited, says Sidhu, is 3D memory capture. Apple, for example, is leaning heavily into 3D photo and video for their Vision Pro mixed reality headset. As an example, Sidhu told me he recently created a high-quality replica of his parents’ house before they moved out. He could then give them the experience of walking inside of it using virtual reality.

“Having that visceral feeling of being back there is so powerful. This is why I’m so bullish on Apple, because if they nail this 3D media format, that’s where things can get exciting for regular people.”

i’m convinced the killer use case for 3d reconstruction tech is memory capture

my parents retired earlier this year and i have immortalized their home forever more

photo scanning is legit the most future proof medium we have access to today

scan all the spaces/places/things pic.twitter.com/kmqX5FYaN6

— Bilawal Sidhu (@bilawalsidhu) November 3, 2023

From cave art to oil paintings, the impulse to preserve aspects of our sensory experience is deeply human. Just as photography once muscled in on still lifes as a means of preservation, 3D creation tools seem poised to displace our long-standing affair with 2D images and video.

Yet just as photography can only ever hope to capture a fraction of a moment in time, 3D models can’t fully replace our relationship to the physical world. Still, for those experiencing the horrors of war in Ukraine, perhaps these are welcome developments offering a more immersive way to preserve what can never truly be replaced.

Image Credit: Polycam

Late not great -- imperfect timekeeping places significant limit on quantum computers

30 October 2023 at 23:45
Quantum physicists show that imperfect timekeeping places a fundamental limit on quantum computers and their applications. The team claims that even tiny timing errors add up to have a significant impact on any large-scale algorithm, posing another problem that must eventually be solved if quantum computers are to fulfill the lofty aspirations that society has for them.
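A back-of-envelope way to see why timing matters (my own sketch, with an assumed per-gate jitter value): model each gate as adding a small random phase error to a superposition and track how the average fidelity with the ideal state drops as circuits get deeper.

```python
# My own illustration of accumulating timing errors: each gate adds a tiny
# random phase to a superposition, and fidelity with the ideal state decays
# as the circuit depth grows. The jitter value is an assumption, not a result.
import numpy as np

rng = np.random.default_rng(3)
sigma = 0.01                       # assumed per-gate phase error (radians)

for gates in (10, 100, 1000, 10000):
    phase = rng.normal(0, sigma, size=(20000, gates)).sum(axis=1)
    fidelity = (1 + np.cos(phase)) / 2          # overlap with the ideal |+> state
    print(f"{gates:>5} gates -> average fidelity {fidelity.mean():.4f}")
```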

Controlling waves in magnets with superconductors for the first time

26 October 2023 at 21:14
Quantum physicists have shown that it's possible to control and manipulate spin waves on a chip using superconductors for the first time. These tiny waves in magnets may offer an alternative to electronics in the future, interesting for energy-efficient information technology or connecting pieces in a quantum computer, for example. The breakthrough primarily gives physicists new insight into the interaction between magnets and superconductors.

New quantum effect demonstrated for the first time: Spinaron, a rugby ball in a ball pit

26 October 2023 at 17:16
Experimental physicists have demonstrated a new quantum effect aptly named the 'spinaron.' In a meticulously controlled environment and using an advanced set of instruments, they managed to prove the unusual state a cobalt atom assumes on a copper surface. This revelation challenges the long-held Kondo effect -- a theoretical concept developed in the 1960s that has been considered the standard model for the interaction of magnetic materials with metals since the 1980s.

Using sound to test devices, control qubits

25 October 2023 at 21:38
Researchers have developed a system that uses atomic vacancies in silicon carbide to measure the stability and quality of acoustic resonators. What's more, these vacancies could also be used for acoustically-controlled quantum information processing, providing a new way to manipulate quantum states embedded in this commonly-used material. 

Atom Computing Says Its New Quantum Computer Has Over 1,000 Qubits

The scale of quantum computers is growing quickly. In 2022, IBM took the top spot with its 433-qubit Osprey chip. Yesterday, Atom Computing announced they’ve one-upped IBM with a 1,180-qubit neutral atom quantum computer.

The new machine runs on a tiny grid of atoms held in place and manipulated by lasers in a vacuum chamber. The company’s first 100-qubit prototype was a 10-by-10 grid of strontium atoms. The new system is a 35-by-35 grid of ytterbium atoms (shown above). (The machine has space for 1,225 atoms, but Atom has so far run tests with 1,180.)

Quantum computing researchers are working on a range of qubits—the quantum equivalent of bits represented by transistors in traditional computing—including tiny superconducting loops of wire (Google and IBM), trapped ions (IonQ), and photons, among others. But Atom Computing and other companies, like QuEra, believe neutral atoms—that is, atoms with no electric charge—have greater potential to scale.

This is because neutral atoms can maintain their quantum state longer, and they’re naturally abundant and identical. Superconducting qubits are more susceptible to noise and manufacturing flaws. Neutral atoms can also be packed more tightly into the same space as they have no charge that might interfere with neighbors and can be controlled wirelessly. And neutral atoms allow for a room-temperature set-up, as opposed to the near-absolute zero temperatures required by other quantum computers.

The company may be onto something. They’ve now increased the number of qubits in their machine by an order of magnitude in just two years, and believe they can go further. In a video explaining the technology, Atom CEO Rob Hays says they see “a path to scale to millions of qubits in less than a cubic centimeter.”

“We think that the amount of challenge we had to face to go from 100 to 1,000 is probably significantly higher than the amount of challenges we’re gonna face when going to whatever we want to go to next—10,000, 100,000,” Atom cofounder and CTO Ben Bloom told Ars Technica.

But scale isn’t everything.

Quantum computers are extremely finicky. Qubits can be knocked out of quantum states by stray magnetic fields or gas particles. The more this happens, the less reliable the calculations. Whereas scaling got a lot of attention a few years ago, the focus has shifted to error-correction in service of scale. Indeed, Atom Computing’s new computer is bigger, but not necessarily more powerful. The whole thing can’t yet be used to run a single calculation, for example, due to the accumulation of errors as the qubit count rises.

There has been recent movement on this front, however. Earlier this year, the company demonstrated the ability to check for errors mid-calculation and potentially fix those errors without disturbing the calculation itself. They also need to keep errors to a minimum overall by increasing the fidelity of their qubits. Recent papers, each showing encouraging progress in low-error approaches to neutral atom quantum computing, give fresh life to the endeavor. Reducing errors may be, in part, an engineering problem that can be solved with better equipment and design.

“The thing that has held back neutral atoms, until those papers have been published, have just been all the classical stuff we use to control the neutral atoms,” Bloom said. “And what that has essentially shown is that if you can work on the classical stuff—work with engineering firms, work with laser manufacturers (which is something we’re doing)—you can actually push down all that noise. And now all of a sudden, you’re left with this incredibly, incredibly pure quantum system.”

In addition to error-correction in neutral atom quantum computers, IBM announced this year they’ve developed error correction codes for quantum computing that could reduce the number of qubits needed by an order of magnitude.

Still, even with error-correction, large-scale, fault-tolerant quantum computers will need hundreds of thousands or millions of physical qubits. And other challenges—such as how long it takes to move and entangle increasingly large numbers of atoms—exist too. Better understanding and working to solve these challenges is why Atom Computing is chasing scale at the same time as error-correction.

In the meantime, the new machine can be used on smaller problems. Bloom said if a customer is interested in running a 50-qubit algorithm—the company is aiming to offer the computer to partners next year—they’d run it multiple times using the whole computer to arrive at a reliable answer more quickly.
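A rough sense of why repetition helps (my own numbers, not Atom Computing's): if a single noisy run succeeds with some probability better than chance, running independent copies in parallel across the machine and taking a majority vote pushes the overall success rate much higher.

```python
# My own sketch (assumed numbers): if one noisy 50-qubit run is right with
# probability p, running ~23 copies in parallel across 1,180 qubits and
# taking a majority vote sharpens the answer considerably.
import numpy as np

rng = np.random.default_rng(5)
p_single = 0.7                       # assumed per-run success probability
copies = 1180 // 50                  # independent 50-qubit blocks
trials = 100_000

successes = rng.random((trials, copies)) < p_single
majority_correct = (successes.sum(axis=1) > copies / 2).mean()
print(f"single run: {p_single:.2f}, majority of {copies} runs: {majority_correct:.4f}")
```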

In a field of giants like Google and IBM, it’s impressive a startup has scaled their machines so quickly. But Atom Computing’s 1,000-qubit mark isn’t likely to stand alone for long. IBM is planning to complete its 1,121-qubit Condor chip later this year. The company is also pursuing a modular approach—not unlike the multi-chip processors common in laptops and phones—where scale is achieved by linking many smaller chips.

We’re still in the nascent stages of quantum computing. The machines are useful for research and experimentation but not practical problems. Multiple approaches making progress in scale and error correction—two of the field’s grand challenges—is encouraging. If that momentum continues in the coming years, one of these machines may finally solve the first useful problem that no traditional computer ever could.

Image Credit: Atom Computing

This Brain-Like IBM Chip Could Drastically Cut the Cost of AI

The brain is an exceptionally powerful computing machine. Scientists have long tried to recreate its inner workings in mechanical minds.

A team from IBM may have cracked the code with NorthPole, a fully digital chip that mimics the brain’s structure and efficiency. When pitted against state-of-the-art graphics processing units (GPUs)—the chips most commonly used to run AI programs—IBM’s brain-like chip triumphed in several standard tests, while using up to 96 percent less energy.

IBM is no stranger to brain-inspired chips. From TrueNorth to SpiNNaker, they’ve spent a decade tapping into the brain’s architecture to better run AI algorithms.

Project to project, the goal has been the same: how to build faster, more energy-efficient chips that allow smaller devices—like our phones or the computers in self-driving cars—to run AI on the “edge.” Edge computing can monitor and respond to problems in real time without needing to send requests to remote server farms in the cloud. Like switching from dial-up modems to fiber-optic internet, these chips could also speed up large AI models with minimal energy costs.

The problem? The brain is analog. Traditional computer chips, in contrast, use digital processing—0s and 1s. If you’ve ever tried to convert an old VHS tape into a digital file, you’ll know it’s not a straightforward process. So far, most chips that mimic the brain use analog computing. Unfortunately, these systems are noisy and errors can easily slip through.

With NorthPole, IBM went completely digital. Tightly packing 22 billion transistors onto 256 cores, the chip takes its cues from the brain by placing computing and memory modules next to each other. Faced with a task, each core takes on a part of a problem. However, like nerve fibers in the brain, long-range connections link modules, so they can exchange information too.

This sharing is an “innovation,” said Drs. Subramanian Iyer and Vwani Roychowdhury at the University of California, Los Angeles (UCLA), who were not involved in the study.

The chip is especially relevant in light of increasingly costly, power-hungry AI models. Because NorthPole is fully digital, it also dovetails with existing manufacturing processes—the packaging of transistors and wired connections—potentially making it easier to produce at scale.

The chip represents “neural inference at the frontier of energy, space and time,” the authors wrote in their paper, published in Science.

Mind Versus Machine

From DALL-E to ChatGPT, generative AI has taken the world by storm with its shockingly human-like text-based responses and images.

But to study author Dr. Dharmendra S. Modha, generative AI is on an unsustainable path. The software is trained on billions of examples—often scraped from the web—to generate responses. Both creating the algorithms and running them requires massive amounts of computing power, resulting in high costs, processing delays, and a large carbon footprint.

These popular AI models are loosely inspired by the brain’s inner workings. But they don’t mesh well with our current computers. The brain processes and stores memories in the same location. Computers, in contrast, divide memory and processing into separate blocks. This setup shuttles data back and forth for each computation, and traffic can stack up, causing bottlenecks, delays, and wasted energy.

It’s a “data movement crisis,” wrote the team. We need “dramatically more computationally-efficient methods.”

One idea is to build analog computing chips similar to how the brain functions. Rather than processing data using a system of discrete 0s and 1s—like on-or-off light switches—these chips function more like light dimmers. Because each computing “node” can capture multiple states, this type of computing is faster and more energy efficient.

Unfortunately, analog chips also suffer from errors and noise. Similar to adjusting a switch with a light dimmer, even a slight mistake can alter the output. Although flexible and energy efficient, the chips are difficult to work with when processing large AI models.
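A small, hypothetical sketch of why digital wins on noise: store the same value as one continuous analog level and as thresholded binary bits, perturb both, and read them back. The analog reading drifts, while thresholding restores the digital bits exactly.

```python
# Hypothetical sketch (mine): the same value stored as a continuous analog
# level versus as 8 binary bits, with identical noise added to both. Digital
# read-back survives because each bit is thresholded back to 0 or 1.
import numpy as np

rng = np.random.default_rng(4)
true_value = 0.62
noise = 0.03

# Analog: the value is a single continuous level; noise shifts it directly.
analog_read = true_value + rng.normal(0, noise)

# Digital: 8 bits held at 0.0 or 1.0; noise is removed by thresholding each
# bit at 0.5 before reassembling the number.
bits = np.array([int(b) for b in format(int(true_value * 255), "08b")], dtype=float)
noisy_bits = bits + rng.normal(0, noise, bits.shape)
digital_read = int("".join(str(int(b > 0.5)) for b in noisy_bits), 2) / 255

print("analog read :", round(analog_read, 4))    # drifts away from 0.62
print("digital read:", round(digital_read, 4))   # recovers 0.62 (to 8-bit precision)
```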

A Match Made in Heaven

What if we combined the flexibility of neurons with the reliability of digital processors?

That’s the driving concept for NorthPole. The result is a stamp-sized chip that can beat the best GPUs in several standard tests.

The team’s first step was to distribute data processing across multiple cores, while keeping memory and computing modules inside each core physically close.

Previous analog chips, like IBM’s TrueNorth, used a special material to combine computation and memory in one location. Instead of going analog with non-standard materials, the NorthPole chip places standard memory and processing components next to each other.

The rest of NorthPole’s design borrows from the brain’s larger organization.

The chip has a distributed array of cores like the cortex, the outermost layer of the brain responsible for sensing, reasoning, and decision-making. Each part of the cortex processes different types of information, but it also shares computations and broadcasts results throughout the region.

Inspired by these communication channels, the team built two networks on the chip to democratize memory. Like neurons in the cortex, each core can access computations within itself, but also has access to a global memory. This setup removes hierarchy in data processing, allowing all cores to tackle a problem simultaneously while also sharing their results—thereby eliminating a common bottleneck in computation.

The team also developed software that cleverly delegates a problem in both space and time to each core—making sure no computing resources go to waste or collide with each other.

The software “exploits the full capabilities of the [chip’s] architecture,” they explained in the paper, while helping integrate “existing applications and workflows” into the chip.

Compared to TrueNorth, IBM’s previous brain-inspired analog chip, NorthPole can support AI models that are 640 times larger, involving 3,000 times more computations. All that with just four times the number of transistors.

A Digital Brain Processor

The team next pitted NorthPole against several GPU chips in a series of performance tests.

NorthPole was 25 times more efficient when challenged with the same problem. The chip also processed data at lightning-fast speeds compared to GPUs on two difficult AI benchmark tests.

Based on initial tests, NorthPole is already usable for real-time facial recognition or deciphering language. In theory, its fast response time could also guide self-driving cars in split-second decisions.

Computer chips are at a crossroads. Some experts believe that Moore’s law—which posits that the number of transistors on a chip doubles every two years—is at death’s door. Although still in their infancy, alternative computing structures, such as brain-like hardware and quantum computing, are gaining steam.

But NorthPole shows semiconductor technology still has much to give. Currently, there are 37 million transistors per square millimeter on the chip. But based on projections, the setup could easily expand to two billion, allowing larger algorithms to run on a single chip.

“Architecture trumps Moore’s law,” wrote the team.

They believe innovation in chip design, like NorthPole, could provide near-term solutions in the development of increasingly powerful but resource-hungry AI.

Image Credit: IBM

Electrical control of quantum phenomenon could improve future electronic devices

19 October 2023 at 15:12
A new electrical method to conveniently change the direction of electron flow in some quantum materials could have implications for the development of next-generation electronic devices and quantum computers. A team of researchers has developed and demonstrated the method in materials that exhibit the quantum anomalous Hall (QAH) effect -- a phenomenon in which the flow of electrons along the edge of a material does not lose energy.

Quantum Computers in 2023: Where They Are Now and What’s Next

In June, an IBM computing executive claimed quantum computers were entering the “utility” phase, in which high-tech experimental devices become useful. In September, Australia’s chief scientist Cathy Foley went so far as to declare “the dawn of the quantum era.”

This week, Australian physicist Michelle Simmons won the nation’s top science award for her work on developing silicon-based quantum computers.

Obviously, quantum computers are having a moment. But—to step back a little—what exactly are they?

What Is a Quantum Computer?

One way to think about computers is in terms of the kinds of numbers they work with.

The digital computers we use every day rely on whole numbers (or integers), representing information as strings of zeroes and ones which they rearrange according to complicated rules. There are also analog computers, which represent information as continuously varying numbers (or real numbers), manipulated via electrical circuits or spinning rotors or moving fluids.

In the 16th century, the Italian mathematician Girolamo Cardano invented another kind of number called complex numbers to solve seemingly impossible tasks such as finding the square root of a negative number. In the 20th century, with the advent of quantum physics, it turned out complex numbers also naturally describe the fine details of light and matter.

In the 1990s, physics and computer science collided when it was discovered that some problems could be solved much faster with algorithms that work directly with complex numbers as encoded in quantum physics.

The next logical step was to build devices that work with light and matter to do those calculations for us automatically. This was the birth of quantum computing.

Why Does Quantum Computing Matter?

We usually think of the things our computers do in terms that mean something to us—balance my spreadsheet, transmit my live video, find my ride to the airport. However, all of these are ultimately computational problems, phrased in mathematical language.

As quantum computing is still a nascent field, most of the problems we know quantum computers will solve are phrased in abstract mathematics. Some of these will have “real world” applications we can’t yet foresee, but others will find a more immediate impact.

One early application will be cryptography. Quantum computers will be able to crack today’s internet encryption algorithms, so we will need quantum-resistant cryptographic technology. Provably secure cryptography and a fully quantum internet would use quantum computing technology.

A microscopic view of a square, iridescent computer chip against an orange background.
Google has claimed its Sycamore quantum processor can outperform classical computers at certain tasks. Image Credit: Google

In materials science, quantum computers will be able to simulate molecular structures at the atomic scale, making it faster and easier to discover new and interesting materials. This may have significant applications in batteries, pharmaceuticals, fertilizers, and other chemistry-based domains.

Quantum computers will also speed up many difficult optimization problems, where we want to find the “best” way to do something. This will allow us to tackle larger-scale problems in areas such as logistics, finance, and weather forecasting.

Machine learning is another area where quantum computers may accelerate progress. This could happen indirectly, by speeding up subroutines in digital computers, or directly if quantum computers can be reimagined as learning machines.

What Is the Current Landscape?

In 2023, quantum computing is moving out of the basement laboratories of university physics departments and into industrial research and development facilities. The move is backed by the checkbooks of multinational corporations and venture capitalists.

Contemporary quantum computing prototypes—built by IBM, Google, IonQ, Rigetti, and others—are still some way from perfection.

Today’s machines are of modest size and susceptible to errors, in what has been called the “noisy intermediate-scale quantum” phase of development. The delicate nature of tiny quantum systems means they are prone to many sources of error, and correcting these errors is a major technical hurdle.

The holy grail is a large-scale quantum computer that can correct its own errors. A whole ecosystem of research factions and commercial enterprises is pursuing this goal via diverse technological approaches.

Superconductors, Ions, Silicon, Photons

The current leading approach uses loops of electric current inside superconducting circuits to store and manipulate information. This is the technology adopted by Google, IBM, Rigetti, and others.

Another method, the “trapped ion” technology, works with groups of electrically charged atomic particles, using the inherent stability of the particles to reduce errors. This approach has been spearheaded by IonQ and Honeywell.

Illustration showing glowing dots and patterns of light.
An artist’s impression of a semiconductor-based quantum computer. Image Credit: Silicon Quantum Computing

A third route of exploration is to confine electrons within tiny particles of semiconductor material, which could then be melded into the well-established silicon technology of classical computing. Silicon Quantum Computing is pursuing this angle.

Yet another direction is to use individual particles of light (photons), which can be manipulated with high fidelity. A company called PsiQuantum is designing intricate “guided light” circuits to perform quantum computations.

There is no clear winner yet from among these technologies, and it may well be a hybrid approach that ultimately prevails.

Where Will the Quantum Future Take Us?

Attempting to forecast the future of quantum computing today is akin to predicting flying cars and ending up with cameras in our phones instead. Nevertheless, there are a few milestones that many researchers would agree are likely to be reached in the next decade.

Better error correction is a big one. We expect to see a transition from the era of noisy devices to small devices that can sustain computation through active error correction.

Another is the advent of post-quantum cryptography. This means the establishment and adoption of cryptographic standards that can’t easily be broken by quantum computers.

Commercial spin-offs of technology such as quantum sensing are also on the horizon.

The demonstration of a genuine “quantum advantage” will also be a likely development. This means a compelling application where a quantum device is unarguably superior to the digital alternative.

And a stretch goal for the coming decade is the creation of a large-scale quantum computer free of errors (with active error correction).

When this has been achieved, we can be confident the 21st century will be the “quantum era.”

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image Credit: A complex cooling rig is needed to maintain the ultracold working temperatures required by a superconducting quantum computer / IBM

Self-correcting quantum computers within reach?

12 October 2023 at 15:17
Quantum computers promise to reach speeds and efficiencies impossible for even the fastest supercomputers of today. Yet the technology hasn't seen much scale-up and commercialization largely due to its inability to self-correct. Quantum computers, unlike classical ones, cannot correct errors by copying encoded data over and over. Scientists had to find another way. Now, a new paper illustrates a quantum computing platform's potential to solve the longstanding problem known as quantum error correction.