Quantum computing might sound like the kind of sci-fi plot device used to explain how a character traveled through time or shrank to the size of an ant. But the technology is very real, and Silicon Valley is increasingly jumping on the quantum train.
Amazon (AMZN), Google (GOOG, GOOGL), and Microsoft (MSFT) have each announced their own quantum computing chips over the past few months. Nvidia (NVDA) is preparing to host its first-ever Quantum Day as part of its annual GTC developer conference on March 20. IBM (IBM), Intel (INTC), and a slew of quantum-focused companies also have their own chips.
Quantum computers are expected to complete calculations in minutes or hours that would take classical computers, the kind you and I use every day, thousands of years to finish. What does that mean in a practical sense? The potential for massive advancements in materials science, chemistry, medicine, and more. After all, scientists and researchers will be able to run calculations they could otherwise only dream of with today's computers.
But what exactly is a quantum computer, and what makes it so special? We’ve got the answers you’re looking for.
To better understand quantum computers, let’s talk about classical computers first. Your laptop, smartphone, heck, even your smartwatch have processors known as central processing units (CPUs). These are the brains of modern computers.
Every CPU has what are called transistors. Think of transistors as little switches inside your computer that react to an electrical signal. You’ve probably heard chipmakers brag about how many billions of transistors their chips have. Apple talks up the 28 billion transistors on its M4 chips for its Mac and iPad, while Nvidia says its Blackwell chips have 208 billion transistors.
Computer processors. (Mustafa Ciftci/Anadolu Agency via Getty Images)
CPUs run the apps and programs you use every day by interpreting those electrical pulses as binary code, a kind of machine language made up of 1s and 0s. Each 1 or 0 is called a bit.
Strings of bits make up the complex series of instructions that, after running through modern programming languages, are translated into the videos and images you see on your screen, the games you play, and the apps you use throughout the workday.
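The idea that everything on your screen boils down to strings of 1s and 0s can be made concrete with a short sketch. The Python snippet below is purely illustrative (a real CPU works with electrical signals, not text strings); it shows the 8-bit patterns behind two typed characters:

```python
# Every character you type is ultimately stored as a pattern of bits.
# format(..., "08b") renders a character's numeric code as 8 binary digits.
text = "Hi"
bits = [format(ord(ch), "08b") for ch in text]
print(bits)  # ['01001000', '01101001']
```

Those two bit patterns are the numbers 72 and 105, the standard character codes for "H" and "i".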
One series of electrical pulses travels through a chip's transistors, creating various 1s and 0s along the way. The chip then interprets those signals to access your computer's memory or storage. As the commands grow more complex, they can represent everything from a swipe of your finger on your phone's screen to what happens when you click a link on a website.
These binary systems are used throughout computing, whether that’s your home computer, medical equipment, cars, or virtually any other piece of technology that requires a CPU.
Quantum computers, however, use quantum bits, or qubits, rather than classical bits. This is a completely different way of storing and processing data. Rather than existing as a 1 or a 0 like a bit, a qubit can exist as a 1, a 0, or any combination of the two at the same time, thanks to a property called superposition.
“Think of a qubit as a sphere,” explained Oskar Painter, director of quantum hardware at Amazon Web Services. “There’s a north and south pole, and it can be any combination of 0 and 1 simultaneously.”
To make things even stranger, scientists don’t actually know whether a qubit is a 1 or a 0 until they observe it.
“In the quantum world, the system itself can simultaneously be in several different states,” NYU assistant professor of applied physics Rupak Chatterjee told Yahoo Finance. “And until we measure, we don’t really know, necessarily, which exact state it’s in.”
That ability to exist as a 1 and a 0 at once enables a kind of parallel processing, which allows quantum computers to solve certain calculations much faster than classical computers.
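Painter's sphere picture and the measurement rule Chatterjee describes can be mimicked, very loosely, on an ordinary computer. The Python sketch below is a classical simulation, not real quantum hardware: a qubit is represented as a pair of amplitudes, and "measuring" it picks 0 or 1 with the corresponding probabilities:

```python
import random

# A qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. A measurement returns 0 with probability
# |alpha|^2 and 1 with probability |beta|^2. Only after measuring
# do we "know" which value came out.
def measure(alpha, beta):
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition: the qubit is "both" 0 and 1 until measured,
# and repeated measurements split roughly 50/50.
alpha = beta = 2 ** -0.5
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly {0: 5000, 1: 5000}
```

Any other split of amplitudes, say 80/20, tilts the measurement odds the same way, which is the "any combination of 0 and 1" in Painter's description.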
There are multiple ways to manipulate qubits, whether that’s through superconducting systems that lower the temperature around the qubits to near absolute zero, using lasers to interact with atoms, or taking advantage of light particles — photons — in an extreme vacuum.
Those giant, gold, octopus-looking machines you see that are often referred to as quantum computers? They’re largely cooling systems working with superconducting qubits to ensure they’re sufficiently cold to operate properly.
A model of the Quantum System Two quantum computer displayed at the opening of IBM’s first quantum data center. (Marijan Murat/picture alliance via Getty Images)
What’s more, quantum computers don’t exist on their own. They connect to traditional computers, which control how the quantum system should interact with a qubit and provide a readout of its performance.
So quantum computers can use qubits' superposition to run calculations at incredible speeds compared with our classical computers. Why aren't we using them to solve the world's most complex problems yet? Because of quantum errors.
“[Qubits] are very, very fragile,” explained Sridhar Tayur, professor of operations management at Carnegie Mellon University’s Tepper School of Business. “You can’t have 0 and 1 at the same time if it interacts with the environment.”
See, because qubits exist at such a small scale, they're incredibly susceptible to interference from the outside world. Even a single stray atom can throw off a qubit, causing it to lose the information it holds.
According to Painter, today's quantum systems suffer an error roughly once every 300 quantum operations. For a quantum computer to be useful, though, it can afford an error only once every trillion operations. That's a gap of billions of times.
To address this, companies use what is known as quantum error correction. Microsoft, Google, and Amazon focused on this with their latest chips. The idea behind quantum error correction is to use redundant qubits.
“If I want to protect one qubit of information, I need to actually replicate it 1,000 times in order to get a single good, what we call logical qubit,” Painter said.
But that takes up a huge amount of resources. Researchers are working to address the issue, but until they can, quantum computers won’t be reliable enough to use for calculations related to chemistry, to help discover new compounds, or to understand how atoms interact with each other.
While quantum computers are helpful for running specific algorithms and solving incredibly complex problems, they won't replace classical supercomputers even after researchers overcome the quantum error problem.
That's because they're not meant to do things like run your favorite apps faster than your smartphone. What's more, there's simply no reason to use a quantum computer for such a task when classical computers are designed to handle exactly that.
In other words, don’t expect to come home, turn on your personal quantum computer, and start streaming Netflix. Your Roku can handle that just fine on its own.
Sign up for Yahoo Finance’s Week in Tech newsletter.
Email Daniel Howley at dhowley@yahoofinance.com. Follow him on Twitter at @DanielHowley.