So you want to build a quantum computer?
Researchers at Argonne National Lab are leaning on multiple scientific disciplines to unlock the necessary components of a successful quantum computer.
LEMONT, Ill. — For all the hype, funding and policy around quantum computing, there is still a lot of basic scientific research to be done to bring a quantum information system to life. Leading researchers at Argonne National Laboratory in Illinois spoke with Nextgov/FCW about the integrative work that bridges the gap between theory and practice, with applications and scalability a distant but feasible goal.
The research is interdisciplinary, combining work in fields like materials science, nanoscale systems, photonic engineering and computer science.
Applying the laws of quantum mechanics to computing is still in its infancy. Classical computers process information in binary bits, the 1s and 0s that encode our current networks. A quantum computer, which is still in large part theoretical, would instead process qubits, which can exist in a state of superposition: rather than being locked into a discrete 0 or 1, a qubit can hold a weighted blend of both values at once. This is one of the key properties of quantum mechanics that promises advanced data computing and privacy.
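In more formal terms, and purely as a textbook illustration rather than anything specific to Argonne’s work, a qubit’s state is written as a weighted combination of the two classical values, where the complex weights set the probabilities of measuring a 0 or a 1:

```latex
% A qubit in superposition of the basis states |0> and |1>.
% Measurement yields 0 with probability |alpha|^2 and 1 with |beta|^2,
% so the amplitudes obey the normalization condition on the right.
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```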
Getting these qubits to function the way classical bits do is a critical challenge in the creation of a viable quantum computer. As Michael Norman, the director of the Argonne Quantum Institute, told Nextgov/FCW, qubits are “finicky” to wrangle into computation, thanks to the inherent delicacy of their superposition state, and they make up only one part of the quantum physics people are hoping to exploit for next-generation technologies.
The hurdles don’t stop at qubits. Scaling the infrastructure needed for any kind of quantum internet involves fiber optic cables, semiconductors, quantum repeaters and more to ensure resilience, or fault tolerance, at both the hardware and software layers.
“The problem with quantum networks is that it's a really tough thing,” Norman said. “[Computational researchers] will say: if it's not fault tolerant, it's not a computer.”
Material support
Finding the best system to support quantum-enabled innovation has now become an interdisciplinary effort. Researchers working at Argonne, which hosts one of the Department of Energy’s five quantum sciences research hubs, are focused on creating a scalable and sustainable network for a viable quantum internet.
The hub, called Q-NEXT, oversees work across several scientific fields, each contributing to different aspects of a quantum network.
“It's a very new, nascent field where everyone is making extreme progress,” Alan Dibos, a materials scientist within Q-NEXT, told Nextgov/FCW. Dibos is focused on finding the perfect “host” material, the hardware building block that supports qubits in a quantum network. That effort brings in materials science to examine specific substances’ atomic structures and how those structures interact with qubits.
One of Dibos’s favorite materials to work with is erbium, specifically the erbium three-plus ion, a rare earth element whose atomic structure suits qubit computation by offering long spin coherence times, a crucial property that determines how long a qubit can store information.
“We're using nanoscience to enable quantum information science by putting your atom in a special kind of cavity, confining it in a certain way, under certain parameters,” he said. “I try to engineer our qubits to be better.”
Aside from erbium, Dibos said that diamond, silicon carbide, simple binary oxides and ternary oxides are also being explored as host materials, depending on the specific application. The goal is to couple the qubits in his lab to photons, which act as the transmission vehicle.
“We actually want to transmit [qubits] over long distances, and photons, in the form of a ‘flying qubit,’ are the most natural way to transmit quantum information over long distances,” he said.
In a parallel to artificial intelligence models’ reliance on high-quality data, Dibos said that the quality of the host materials is just as critical to a viable quantum computer as the qubits themselves.
“You have crappy materials, you're never going to get good qubits,” he said. “We're all kind of converging on pushing this … material qubit and host material platform forward. And so it takes a lot of little pushes to get this over.”
The materials work Dibos is conducting serves as one pillar of a larger quantum technology ecosystem. Joe Heremans, a staff scientist within Q-NEXT, described the materials discovery work by Dibos and Argonne as part of the larger quantum transduction mission, the effort to convert quantum information between different physical carriers and keep improving quantum information systems and their connectivity.
“In many ways, materials is very much what's limiting –– materials development to try to find new qubits –– but more importantly, standardization of materials to start building or start thinking about scalable technologies based on these things,” Heremans said. “One thing that Q-NEXT kind of stood up are two of these foundries, [with] one here at Argonne that's specific to semiconducting materials for qubits.”
Innovation in this space has attracted attention from multiple leading companies, such as Intel, Microsoft and Boeing.
Infrastructure challenges
Other important pieces of the quantum networking puzzle step away from nanoscale technologies and move into larger infrastructure.
Rajkumar Kettimuthu, a senior scientist at Argonne, discussed the infrastructure overhaul likely needed to support quantum-enabled communications. A computer scientist by training, Kettimuthu is focused on bringing the elements of classical computing to a quantum network.
One of his research areas concerns the fragility of qubits and how they travel from one node to another through the fiber optic cables that serve as the backbone of modern communication.
“If you put both of them in the same fiber, then you need some level of protection for the quantum signals from the classical signal,” he told Nextgov/FCW.
This need stems from the inherent fragility of qubits. When information is stored in a state of superposition, external interference, often called noise, can easily disrupt and ruin the qubit’s computation. Noise can be any disturbance, such as a temperature change or the presence of excess photons.
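As a rough illustration of that fragility (a minimal sketch assuming NumPy, not a model of any Argonne experiment), the snippet below treats a qubit in an equal superposition as a density matrix and applies a simple dephasing, or phase-flip, noise channel: the off-diagonal term that encodes the superposition shrinks as the noise probability grows, even though the 0-versus-1 populations never change.

```python
# Minimal, illustrative sketch of dephasing noise acting on one qubit.
import numpy as np

def dephase(rho: np.ndarray, p: float) -> np.ndarray:
    """Apply a phase-flip (dephasing) channel with error probability p."""
    Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli-Z operator
    return (1 - p) * rho + p * (Z @ rho @ Z)

# An equal superposition (|0> + |1>)/sqrt(2), written as a density matrix.
plus = np.full((2, 2), 0.5, dtype=complex)

for p in (0.0, 0.1, 0.3, 0.5):
    noisy = dephase(plus, p)
    # The off-diagonal entry is the "coherence" that makes a qubit useful.
    print(f"noise p={p:.1f}  coherence={abs(noisy[0, 1]):.2f}")

# The printed coherence drops from 0.50 to 0.00 as p reaches 0.5, while
# the diagonal 0/1 populations stay fixed at 0.5; the superposition is lost.
```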
Ideally, Kettimuthu said, classical and future quantum network infrastructure could coexist. Until then, quantum signals require special fiber cables featuring ultra-low-noise technology.
“The fiber needs to be manufactured in a certain way, or we need fundamentally different ways of doing it,” he said. “From the networking infrastructure point of view, the fiber is what matters, whereas the qubit platforms, it could be from different things.”
For now, however, Argonne and research partners like the University of Chicago and Starlight in downtown Chicago use dark fiber, a cable that carries only the quantum signals sent between institutions, to test both the strength of those signals and the quantum repeaters, validating how a quantum network could work.
The experimentation here demands participation from experts with diverse scientific backgrounds. Kettimuthu noted that there are aspects of developing a quantum network that demand more work. He emphasized the need for quantum-enabled technologies to remain homogeneous, fostering interconnectivity between future and existing devices and expanding user access, an effort that would benefit from greater government support.
“The scalability is a huge challenge now, so with the demonstrations, we really need to get out of the physics experiment to a real network,” he said. “We should actually do some research on it so the funding federal agencies know this sooner rather than later.”
This type of partnership is going to help usher in the novel engineering techniques needed to scale any type of quantum information network. Salman Habib, the director of Argonne’s Computational Science Division, told Nextgov/FCW that bridging that gap will require teamwork.
“If you want a quantum computer to be as good in certain aspects of the classical machine –– or beat it, in some sense –– then you think in terms of investments in the billion dollar class, which simply means that…in my opinion, a public-private partnership,” he said.
Habib referenced the Defense Advanced Research Projects Agency’s ongoing contracts with leading private sector partners to develop a viable quantum computer as one example of the level of support national labs and academia need to scale new hardware and software. Habib estimates it will take roughly two more decades of work to deliver quantum machinery that can process information at anything close to the level of a classical equivalent.
“When you have a system that big, it has to be something that can run for an extended period of time,” he said. “The other thing that people don’t talk about that much is: quantum computers are not very good at data-intensive tasks. In fact, they are useless, almost.”
This is a considerable drawback given the rapid advances in, and demand for, data-intensive processing and AI computation. While Habib is a self-proclaimed optimist, he said that the near-term abilities of a quantum computer fall far short of what some headlines purport.
“Being able to feed a quantum computer a lot of data right now is not possible,” he said. “So quantum computers –– even the ones that would be fault-tolerant in the future –– would be ones that would work on problems that are very easy to specify.”
Habib clarified that near-term quantum computers will be more adept at solving problems within the materials science and chemistry realms, but even those applications hinge on a machine with sufficient memory and computing capability.
“Everyone is trying to figure out what’s going to be the first thing that will actually lead to a big scientific breakthrough,” he said. “The big question is going to be: how hard is it to scale [quantum computers] up.”