One of the most difficult problems in quantum computing is increasing the size of the quantum computer itself. Researchers around the world are working to solve this "challenge of scale." To bring quantum scaling closer to reality, researchers from 14 institutions collaborated through the Co-design Center for Quantum Advantage (C2QA), a Department of Energy (DOE) Office of Science National Quantum Information Science Research Center. Together, they constructed the ARQUIN framework, a pipeline for simulating large-scale distributed quantum computers as different layers. Their results were published in ACM Transactions on Quantum Computing.

The research team, led by Michael DeMarco from Brookhaven National Laboratory and the Massachusetts Institute of Technology (MIT), started with a standard computing strategy: combining multiple computing "nodes" into one unified computing framework. In theory, this multi-node approach could also be applied to quantum computers, but there is a catch. In superconducting quantum systems, qubits must be kept incredibly cold, which is usually done with a cryogenic device called a dilution refrigerator. The problem is that scaling a quantum computing chip to a sufficiently large size within a single fridge is hard, and even in larger fridges, the superconducting circuits on a single chip become difficult to maintain. To create a powerful multi-node quantum computer, researchers therefore need to connect nodes not only inside a single dilution refrigerator but also across multiple dilution refrigerators.
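To make the node-and-link picture concrete, below is a minimal, hypothetical sketch in Python of how such a distributed machine might be described as layered components. The class names, the fidelity parameter, and the overall structure are illustrative assumptions only; they do not reflect ARQUIN's actual simulation pipeline or API.

```python
# Hypothetical sketch: a multi-node quantum system modeled as layered components.
# Node, CryoLink, and DistributedMachine are illustrative names, not ARQUIN's API.
from dataclasses import dataclass, field

@dataclass
class Node:
    """A quantum chip housed in one dilution refrigerator."""
    name: str
    qubits: int

@dataclass
class CryoLink:
    """A communication channel connecting chips in different refrigerators."""
    a: Node
    b: Node
    fidelity: float  # assumed figure of merit for the inter-fridge link

@dataclass
class DistributedMachine:
    """Top layer: a collection of nodes stitched together by links."""
    nodes: list[Node] = field(default_factory=list)
    links: list[CryoLink] = field(default_factory=list)

    def total_qubits(self) -> int:
        return sum(n.qubits for n in self.nodes)

# Example: two fridges of 100 qubits each, joined by one inter-fridge link.
a, b = Node("fridge-A", 100), Node("fridge-B", 100)
machine = DistributedMachine(nodes=[a, b], links=[CryoLink(a, b, fidelity=0.95)])
print(machine.total_qubits())  # 200
```

The point of the layered framing is that each level (chip, inter-fridge link, overall network) can be simulated and evaluated separately before being composed into a single model of the distributed system.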
Full research: ARQUIN provides a framework for simulating distributed quantum computing systems.