Quantum Computers Run on Just the Right Amount of Connectivity
Scientists know that entanglement, a special connection that intertwines the fate of quantum particles, is a crucial ingredient for quantum computers. Without it, a quantum computer loses its ability to harness the fullness of quantum complexity—that special sauce that makes the quantum world impossible to emulate on ordinary computers. But whether entanglement is the only key, and exactly how much of it is needed, no one really knows.
Quantum Computers Are Starting to Simulate the World of Subatomic Particles
There is a heated race to make quantum computers deliver practical results. But this race isn't just about making better technology—usually defined in terms of having fewer errors and more qubits, which are the basic building blocks that store quantum information. At least for now, the quantum computing race requires grappling with the complex realities of both quantum technologies and difficult problems.
New Perspective Blends Quantum and Classical to Understand Quantum Rates of Change
There is nothing permanent except change. This is perhaps never truer than in the fickle and fluctuating world of quantum mechanics. The quantum world is in constant flux. The properties of quantum particles flit between discrete, quantized states without any possibility of ever being found in an intermediate state. How quantum states change defies normal intuition and remains the topic of active debate—for both scientists and philosophers.
Tug-of-War Unlocks Menagerie of Quantum Phases of Matter
Often when physicists study phases of matter they examine how a solid slab of metal or a cloud of gas changes as it gets hotter or colder. Sometimes the changes are routine—we’ve all boiled water to cook pasta and frozen it to chill our drinks. Other times the transformations are astonishing, like when certain metals get cold enough to become superconductors or a gas heats up and breaks apart into a glowing plasma soup. However, changing the temperature is only one way to transmute matter into different phases. Scientists also blast samples with strong electric or magnetic fields or place them in special chambers and dial up the pressure. In these experiments, researchers are hunting for a stark transition in a material’s behavior or a change in the way its atoms are organized. In a paper published recently in the journal Physical Review Letters, JQI Fellow Maissam Barkeshli and two colleagues continued this tradition of exploring how materials respond to their environment. But instead of looking for changes in conductivity or molecular structure, they focused on changes in a uniquely quantum property: entanglement, or the degree to which quantum particles give up their individuality and become correlated with each other.
Foundational Step Shows Quantum Computers Can Be Better Than the Sum of Their Parts
Pobody’s nerfect—not even the indifferent, calculating bits that are the foundation of computers. But JQI Fellow Christopher Monroe’s group, together with colleagues from Duke University, has made progress toward ensuring we can trust the results of quantum computers even when they are built from pieces that sometimes fail. They have shown in an experiment, for the first time, that an assembly of quantum computing pieces can be better than the worst parts used to make it. In a paper published in the journal Nature on Oct. 4, 2021, the team shared how they took this landmark step toward reliable, practical quantum computers. In their experiment, the researchers combined several qubits—the quantum version of bits—so that they functioned together as a single unit called a logical qubit. They created the logical qubit based on a quantum error correction code so that, unlike the individual physical qubits, it allows errors to be easily detected and corrected, and they made it fault-tolerant—capable of containing errors to minimize their negative effects. This is the first time that a logical qubit has been shown to be more reliable than the most error-prone step required to make it.
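The core idea—that an encoded unit can fail less often than the noisy parts it is built from—can be illustrated with a toy model. The sketch below is not the team's actual protocol or code; it uses a simple classical three-bit repetition code with majority-vote decoding, purely to show why redundancy plus error correction helps whenever the parts are good enough.

```python
# Toy illustration (not the experiment's error correction code):
# encode one bit as three copies and decode by majority vote.
# The encoded ("logical") bit is wrong only if at least two of
# the three copies flip, which is rarer than a single flip
# whenever the per-copy error rate p is below 1/2.

def logical_error_rate(p: float) -> float:
    """Probability that majority vote over 3 independent copies fails,
    given each copy flips with probability p."""
    # Failure cases: exactly two copies flip, or all three flip.
    return 3 * p**2 * (1 - p) + p**3

physical_p = 0.05
logical_p = logical_error_rate(physical_p)
print(f"physical error rate: {physical_p}")       # 0.05
print(f"logical error rate:  {logical_p:.5f}")    # 0.00725
```

For a 5% per-copy error rate, the encoded bit errs only about 0.7% of the time. Real quantum error correction is far subtler—qubits cannot simply be copied, and both bit-flip and phase errors must be caught—but the same threshold logic underlies the fault-tolerant logical qubit described above.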
Researchers Uncover a ‘Shortcut’ to Thermodynamic Calculations Using Quantum Computers
A collaboration between researchers at JQI and North Carolina State University has developed a new method that uses a quantum computer to measure the thermodynamic properties of a system. The team shared the new approach in a paper published August 18, 2021, in the journal Science Advances.
New Approach to Information Transfer Reaches Quantum Speed Limit
Even though quantum computers are a young technology and aren’t yet ready for routine practical use, researchers have already been investigating the theoretical constraints that will bound quantum technologies. One of the things researchers have discovered is that there are limits to how quickly quantum information can race across any quantum device. These speed limits are called Lieb-Robinson bounds, and, for several years, some of the bounds have taunted researchers: For certain tasks, there was a gap between the best speeds allowed by theory and the speeds possible with the best algorithms anyone had designed. It’s as though no car manufacturer could figure out how to make a model that reached the local highway speed limit. But unlike speed limits on roadways, information speed limits can’t be ignored when you’re in a hurry—they are the inevitable results of the fundamental laws of physics. For any quantum task, there is a limit to how quickly interactions can make their influence felt (and thus transfer information) a certain distance away. The underlying rules define the best performance that is possible. In this way, information speed limits are more like the max score on an old-school arcade game than traffic laws, and achieving the ultimate score is an alluring prize for scientists. Now a team of researchers, led by JQI Fellow Alexey Gorshkov, has found a quantum protocol that reaches the theoretical speed limits for certain quantum tasks. Their result provides new insight into designing optimal quantum algorithms and proves that there hasn’t been a lower, undiscovered limit thwarting attempts to make better designs. Gorshkov, who is also a Fellow of the Joint Center for Quantum Information and Computer Science (QuICS) and a physicist at the National Institute of Standards and Technology, and his colleagues presented their new protocol in a recent article published in the journal Physical Review X.
Proposal Shows How Noisy Qubits Might Correct Themselves
One of the chief obstacles facing quantum computer designers—correcting the errors that creep into a processor’s calculations—could be overcome with a new approach by physicists from JQI and the California Institute of Technology, who may have found a way to design quantum memory switches that would self-correct. The team’s theory paper, which was published Dec. 8, 2020, in the journal Physical Review Letters, suggests an easier path to creating stable quantum bits, or qubits, which ordinarily are subject to environmental disturbances and errors. Finding methods of correcting these errors is a major issue in quantum computer development, but the research team’s approach to qubit design could sidestep the problem.
New $115 Million Quantum Systems Accelerator to Pioneer Quantum Technologies for Discovery Science
The Department of Energy (DOE) has awarded $115 million over five years to the Quantum Systems Accelerator (QSA), a new research center led by Lawrence Berkeley National Laboratory (Berkeley Lab) that will forge the technological solutions needed to harness quantum information science for discoveries that benefit the world. It will also energize the nation’s research community to ensure U.S. leadership in quantum R&D and accelerate the transfer of quantum technologies from the lab to the marketplace. Sandia National Laboratories is the lead partner of the center.
Quantum Computers Do the (Instantaneous) Twist
Regardless of what makes up the innards of a quantum computer, its speedy calculations all boil down to sequences of simple instructions applied to qubits—the basic units of information inside a quantum computer.
Now, researchers at JQI have discovered ways to implement robust, error-resistant gates using just a constant number of simple building blocks—achieving essentially the best reduction possible in a parameter called circuit depth.