
Quantum computing is entering a pivotal phase. A new study reveals that the field has moved beyond early demonstrations and is now facing its defining challenge: scale.
The researchers highlight a sobering benchmark. Many meaningful applications – from chemistry to cryptography – may require millions of physical qubits, far beyond what any platform can deliver today.
That reality reframes the conversation. The question is no longer whether quantum hardware works, but how engineers can build larger, more reliable systems that push past laboratory limits.
A team led by David Awschalom, a professor of molecular engineering and physics at the University of Chicago, compared six leading hardware platforms across computing, networking, simulation, and sensing.
They used the technology readiness level (TRL) scale, a nine-level framework developed by NASA that captures how close a technology is to working outside the lab.
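For readers who want the scale spelled out, here is a compact sketch of the nine levels, paraphrased from NASA's public definitions rather than taken from the study.

```python
# Illustrative sketch: NASA's nine technology readiness levels (TRLs).
# Descriptions are paraphrased, not NASA's official wording.
TRL_SCALE = {
    1: "Basic principles observed",
    2: "Technology concept formulated",
    3: "Experimental proof of concept",
    4: "Technology validated in the lab",
    5: "Technology validated in a relevant environment",
    6: "Prototype demonstrated in a relevant environment",
    7: "Prototype demonstrated in an operational environment",
    8: "System complete and qualified",
    9: "System proven in actual operation",
}

def describe_trl(level: int) -> str:
    """Return a one-line description for a TRL value from 1 to 9."""
    if level not in TRL_SCALE:
        raise ValueError("TRL must be an integer from 1 to 9")
    return f"TRL {level}: {TRL_SCALE[level]}"

print(describe_trl(4))  # -> "TRL 4: Technology validated in the lab"
```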
Superconducting qubits lead in computing prototypes, neutral atoms shine in simulation, photonic qubits anchor networking, and spin defects set the pace in sensing. Several are already accessible through public cloud services.
Wiring and control channels do not scale gracefully with qubit count. Engineers even have a name for the problem – “the tyranny of numbers” – a phrase with an IEEE-documented history, coined when hand-wired components threatened to overwhelm early circuit designers.
Power delivery, temperature, calibration, and automation become central design problems as systems scale. The review calls for foundry-level fabrication, better materials, and top-down system engineering.
The researchers argue for modular machines linked by quantum interconnects – hardware links that carry quantum states between modules. That approach caps the wiring and power load inside each module.
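A back-of-envelope comparison shows why that cap matters. In the sketch below, the lines-per-qubit figure is an illustrative assumption, not a number from the study.

```python
# Back-of-envelope sketch of "the tyranny of numbers" and why modularity helps.
# The lines-per-qubit value is an illustrative assumption, not from the study.
LINES_PER_QUBIT = 2  # assume ~2 control/readout lines per physical qubit

def control_lines(n_qubits: int, lines_per_qubit: int = LINES_PER_QUBIT) -> int:
    """Control channels grow at least linearly with qubit count."""
    return n_qubits * lines_per_qubit

# Monolithic: one million qubits in a single machine.
total = control_lines(1_000_000)
print(f"Monolithic machine: {total:,} control lines")  # 2,000,000

# Modular: the same million qubits split into 1,000-qubit modules
# linked by quantum interconnects. Each module's wiring load is capped.
module_size = 1_000
per_module = control_lines(module_size)
n_modules = 1_000_000 // module_size
print(f"Modular design: {n_modules:,} modules x {per_module:,} lines each")
```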
Partnerships sit at the center of quantum progress because no single group can handle the technical demands alone.
Universities bring theory and early prototypes, while national labs supply specialized tools that help measure fragile quantum behavior.
Companies push for designs that can survive real operating conditions – something academic labs alone cannot easily test.
Funding plays a similar role by determining which ideas move from theory to devices that people can actually use.
Government programs in the United States and Europe have poured money into shared facilities that give researchers access to advanced fabrication tools.
Private investment adds pressure to show steady progress, though it can also create mismatched expectations about timelines.
Standards are becoming a quiet source of momentum. Groups working on superconducting qubits need shared methods for measuring fidelity, while teams in photonics push for uniform fabrication rules.
These efforts help different research centers compare progress without wasting time reinventing the same tests.
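Randomized benchmarking is one example of such a shared test: labs fit the same exponential decay model to their data so fidelity numbers can be compared. The sketch below uses synthetic data and the standard single-qubit model, not a method prescribed by the study.

```python
# Minimal sketch of randomized benchmarking, a widely shared way to report
# gate fidelity. The data are synthetic; the model is the standard
# single-qubit form P(m) = A * p**m + B, not a method from the study.
import numpy as np
from scipy.optimize import curve_fit

def decay(m, A, p, B):
    """Survival probability after m random Clifford gates."""
    return A * p**m + B

# Synthetic survival probabilities for increasing sequence lengths.
lengths = np.array([1, 5, 10, 20, 50, 100])
survival = 0.5 * 0.995**lengths + 0.5  # pretend measurement results

(A, p, B), _ = curve_fit(decay, lengths, survival, p0=[0.5, 0.99, 0.5])

# For a single qubit (d = 2), average error per Clifford is (1 - p)(d - 1)/d.
error_per_clifford = (1 - p) * (2 - 1) / 2
print(f"Fitted p = {p:.4f}, error per Clifford ~ {error_per_clifford:.2e}")
```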
Education programs add another layer. Universities are building quantum engineering tracks that blend physics, materials science, and computer science.
The talent gap is wide today, and training new specialists will shape how quickly companies can build devices that scale.
High readiness does not mean high performance yet. “Quantum technologies today are transitioning from laboratory curiosities to technical reality,” said Awschalom.
TRLs capture maturity, not final capability. A platform can be ready for field tests while still years away from the fault-tolerant performance needed for heavy-duty computing.
There are signs of real traction in error correction. A 2023 experiment showed that increasing the size of an error-correcting code can reduce the logical error rate on a superconducting device.
That result rests on a key concept: a logical qubit – an error-protected qubit built from many physical qubits – trades hardware overhead for reliability.
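To make that tradeoff concrete, here is a rough sketch built on the textbook surface-code scaling relations. The error rate and threshold values are illustrative assumptions, not figures from the study or the 2023 experiment.

```python
# Illustrative surface-code arithmetic for the overhead-vs-reliability tradeoff.
# Formulas are the standard textbook scaling; the numbers are assumptions.

def physical_per_logical(d: int) -> int:
    """A distance-d surface code uses d^2 data qubits plus d^2 - 1 ancillas."""
    return 2 * d**2 - 1

def logical_error_rate(p: float, d: int, p_threshold: float = 1e-2) -> float:
    """Rough scaling: p_L ~ (p / p_th)**((d + 1) / 2), up to a prefactor."""
    return (p / p_threshold) ** ((d + 1) / 2)

p_physical = 1e-3  # assumed physical error rate, below threshold
for d in (3, 5, 11, 25):
    print(f"d={d:>2}: {physical_per_logical(d):>5} physical qubits/logical, "
          f"logical error ~ {logical_error_rate(p_physical, d):.1e}")
```

At distance 3 the code spends 17 physical qubits per logical qubit; at distance 25 the bill climbs past a thousand, which is how application estimates reach into the millions.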
The most demanding applications stretch the tally further. Factoring a 2048-bit RSA key has been estimated to require about 20 million physical qubits and hours of runtime on a future machine.
Chemistry is also demanding at scale. A separate study estimated that roughly four million physical qubits would be needed to model the FeMoco active site with fault-tolerant methods.
These are moving targets as algorithms and hardware improve. They still make one point plain: scaling is the field’s real challenge.
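A quick way to see how those totals arise is to divide a physical-qubit budget by the per-logical-qubit overhead at a plausible code distance. The distance used below is an assumption for illustration, not a figure from either estimate.

```python
# Illustrative arithmetic linking headline physical-qubit counts to the
# number of error-protected logical qubits they could support. The code
# distance is an assumed value, not one from the cited estimates.

def physical_per_logical(d: int) -> int:
    """Distance-d surface code: d^2 data qubits plus d^2 - 1 ancillas."""
    return 2 * d**2 - 1

d = 27  # assumed code distance for a large fault-tolerant machine
overhead = physical_per_logical(d)  # 1,457 physical qubits per logical qubit

for label, budget in [("RSA-2048 estimate", 20_000_000),
                      ("FeMoco estimate", 4_000_000)]:
    print(f"{label}: ~{budget // overhead:,} logical qubits "
          f"at {overhead:,} physical each")
```

Under those assumptions, the two budgets translate to a few thousand to roughly fourteen thousand logical qubits – the scale at which the headline numbers start to make sense.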
Materials and fabrication sit at the heart of progress. The report points to lithography, device uniformity, and low-loss photonics as make-or-break elements for mature systems.
“Patience has been a key element in many landmark developments,” said Awschalom.
A clear through-line appears: strong ties between universities, companies, and government labs built microelectronics, and the same coalition is now building quantum.
Standards and shared methods will matter more as systems grow. Open knowledge, broad training, and shared testbeds will determine whether quantum technology advances in sync – or splinters under its own complexity.
The study is published in the journal Science.
