A Lone Nvidia GPU Speeds Past The Physics-straining Might Of A Quantum Computer – In These Apps At Least
A group of researchers from Microsoft and the Scalable Parallel Computing Laboratory in Zurich has offered a harsh reality check to those hyping the world-altering potential of quantum computers, finding that off-the-shelf GPUs can sometimes outperform machines from the frontiers of physics.
Drug discovery, materials science, scheduling, traffic management, supply chain optimization, and weather forecasting are all commonly cited as applications to which vendors say quantum computing is well suited.
But in a paper published in Communications of the ACM, Torsten Hoefler, director of the Scalable Parallel Computing Laboratory, alongside former Microsoft researcher Thomas Häner and Microsoft's Matthias Troyer, concluded that, short of exceptional improvements in hardware and software, even future quantum systems are unlikely to achieve practical speedups in many of these workloads.
For a quantum system to be worthwhile, it needs to perform a task faster than a conventional system. To test this, the team pitted a hypothetical quantum machine with 10,000 error-corrected logical qubits, or roughly a million physical qubits, against a classical computer equipped with a single Nvidia A100 GPU.
To be clear, no such quantum system exists today. The most advanced quantum computers currently available top out at a few hundred physical qubits. IBM's Osprey system, for instance, packs 433 qubits. And while IBM says it's on track to deliver a 4,158-qubit system in 2025, even that's well short of the system envisioned by Hoefler and his co-authors. On the flip side, as high-performance computing (HPC) systems go, the conventional system considered in this paper is positively anemic.
"For our analysis, we set a break-even point of two weeks, meaning a quantum computer should be able to perform better than a classical computer on problems that would take a quantum computer no more than two weeks to solve," Troyer explained in a blog post published Monday.
The comparison, according to the authors, revealed a glaring problem with most quantum algorithms today. A quadratic speedup, like that enabled by Grover's algorithm, is insufficient to achieve an advantage over conventional systems. Instead, "super-quadratic or ideally exponential speedups" are needed.
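The arithmetic behind that conclusion can be sketched in a few lines. The rates below are illustrative assumptions, not figures from the paper: an A100 is taken to sustain on the order of 10^14 operations per second, and an error-corrected quantum computer on the order of 10^4 logical operations per second. Under those assumptions, a quadratic speedup only breaks even for problems that already take a classical machine close to the two-week budget.

```python
# Back-of-envelope crossover for a quadratic (Grover-style) speedup.
# Both rates are assumptions for illustration, not the paper's exact values.
R_CLASSICAL = 1e14   # classical ops per second (assumed A100-class throughput)
R_QUANTUM = 1e4      # quantum logical ops per second (assumed)

# Classical cost: N ops. Quantum cost with a quadratic speedup: sqrt(N) ops.
# The two runtimes tie when sqrt(N*) = R_CLASSICAL / R_QUANTUM.
n_star = (R_CLASSICAL / R_QUANTUM) ** 2
t_star_seconds = n_star / R_CLASSICAL
t_star_days = t_star_seconds / 86_400

print(f"crossover problem size: {n_star:.1e} classical ops")
print(f"break-even runtime: {t_star_days:.1f} days")
```

With these assumed rates the break-even runtime lands at around 11 to 12 days, right at the edge of the two-week window, which is why the authors argue quadratic speedups alone are not enough.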
That's not the only problem facing quantum architectures. Input and output (I/O) bandwidth is another limiting factor.
"Our research revealed that applications that rely on large datasets are better served by classical computing, because the bandwidth is too low on quantum systems to allow for applications such as searching databases or training machine learning models on large datasets," Troyer explained.
He added that this means workloads like drug design, protein folding, and weather and climate prediction are, given the current state of the tech, better run on conventional machines.
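A rough calculation shows how severe the bandwidth gap is. The figures here are assumptions for the sketch: an A100's HBM2e memory bandwidth is roughly 2 TB/s, while the classical-to-quantum input rate is taken as 1 Mbit/s, an optimistic placeholder rather than a measured value.

```python
# How long does it take just to *load* a 1 TB dataset?
# Bandwidth figures are illustrative assumptions, not the paper's numbers.
DATASET_BYTES = 1e12          # 1 TB of input data
GPU_BW = 2e12                 # bytes/s (approximate A100 HBM2e bandwidth)
QPU_BW = 1e6 / 8              # bytes/s (assumed 1 Mbit/s quantum input rate)

gpu_seconds = DATASET_BYTES / GPU_BW
qpu_days = DATASET_BYTES / QPU_BW / 86_400

print(f"GPU load time: {gpu_seconds:.1f} s")
print(f"quantum load time: {qpu_days:.0f} days")
```

Under these assumptions the GPU streams the dataset in half a second, while the quantum machine would spend roughly three months on input alone, before doing any computation.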
This doesn't mean that quantum computing is worthless; it simply means that, at least for the foreseeable future, the applications for quantum systems are likely to be narrower than the marketers would have you believe.
"Generally, quantum computers will be practical for 'big compute' problems on small data, not big data problems," the researchers wrote.
One field likely to benefit from quantum systems is chemistry and materials science, because many of these workloads rely on relatively small datasets.
"If quantum computers only benefited chemistry and material science, that would be enough," Troyer emphasized. "Many problems facing the world today boil down to chemistry and material science problems. Better and more efficient electric vehicles rely on finding better battery chemistries. More effective and targeted cancer drugs rely on computational biochemistry."
Cryptanalysis using Shor's algorithm presents similar opportunities, the researchers note. However, not every algorithm capable of an exponential speedup is necessarily well suited to quantum systems. The team notes that while quantum linear algebra offers an exponential speedup on paper, the advantage is negated by the I/O bottleneck of loading the matrix into the machine.
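The linear algebra case can be made concrete with the same style of estimate. The numbers below are assumptions: a modest 100,000 x 100,000 dense matrix, 8 bytes per entry, and the same assumed 1 Mbit/s quantum input rate. However fast the solve itself is, every entry still has to enter the machine.

```python
# Even an exponential algorithmic speedup can be erased by input cost.
# Matrix size and input rate are assumptions for illustration.
n = 10**5                        # matrix dimension (example)
entries = n * n                  # values that must be streamed in
ENTRIES_PER_SEC = 1e6 / 8 / 8    # entries/s at an assumed 1 Mbit/s,
                                 # 8 bytes per entry

load_days = entries / ENTRIES_PER_SEC / 86_400
print(f"days just to stream the matrix in: {load_days:.1f}")
```

Even for this mid-sized matrix, input alone takes about a week, swamping any exponential advantage in the solve step.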
"These considerations help with separating hype from practicality in the search for quantum applications and can guide algorithmic developments," the paper reads. "Our analysis shows it is necessary for the community to focus on super-quadratic speedups, ideally exponential speedups, and one needs to carefully consider I/O bottlenecks." ®