I am approaching this one cautiously, both out of concern for confirmation bias and because I know so little about the subject. Still, this pessimistic take by Mikhail Dyakonov on the short-term prospects of quantum computing raises troubling questions about the coverage of the field and about the way hype distorts the allocation of resources.
The pattern here is disturbingly familiar. We've seen it with AI, fusion reactors, maglev vactrains, and subliminal framing, to name just a few. Credulous reporters seek out optimistic sources. Theoretical possibilities are treated as just-around-the-corner developments. Decades of slow progress, false starts, and sometimes outright failure are ignored.
Those who can claim some association with the next big thing are richly rewarded. Entrepreneurs get enormous piles of venture capital. Business lines and academic departments get generous funding. Researchers who can pull off a slick TED Talk get six-figure book deals and fawning celebrity treatment.
Just to be clear, Dyakonov's is not the consensus opinion. Many of his colleagues are far more optimistic, but his concerns do seem valid. The fact that almost all of the coverage glosses over that part of the picture tells us something about the state of science journalism.
From The Case Against Quantum Computing [emphasis added]
Quantum computing is all the rage. It seems like hardly a day goes by without some news outlet describing the extraordinary things this technology promises. Most commentators forget, or just gloss over, the fact that people have been working on quantum computing for decades—and without any practical results to show for it.
We’ve been told that quantum computers could “provide breakthroughs in many disciplines, including materials and drug discovery, the optimization of complex manmade systems, and artificial intelligence.” We’ve been assured that quantum computers will “forever alter our economic, industrial, academic, and societal landscape.” We’ve even been told that “the encryption that protects the world’s most sensitive data may soon be broken” by quantum computers. It has gotten to the point where many researchers in various fields of physics feel obliged to justify whatever work they are doing by claiming that it has some relevance to quantum computing.
Meanwhile, government research agencies, academic departments (many of them funded by government agencies), and corporate laboratories are spending billions of dollars a year developing quantum computers. On Wall Street, Morgan Stanley and other financial giants expect quantum computing to mature soon and are keen to figure out how this technology can help them.
It’s become something of a self-perpetuating arms race, with many organizations seemingly staying in the race if only to avoid being left behind. Some of the world’s top technical talent, at places like Google, IBM, and Microsoft, are working hard, and with lavish resources in state-of-the-art laboratories, to realize their vision of a quantum-computing future.
In light of all this, it’s natural to wonder: When will useful quantum computers be constructed? The most optimistic experts estimate it will take 5 to 10 years. More cautious ones predict 20 to 30 years. (Similar predictions have been voiced, by the way, for the last 20 years.) I belong to a tiny minority that answers, “Not in the foreseeable future.” Having spent decades conducting research in quantum and condensed-matter physics, I’ve developed my very pessimistic view. It’s based on an understanding of the gargantuan technical challenges that would have to be overcome to ever make quantum computing work.
In the early 2000s, at the request of the Advanced Research and Development Activity (a funding agency of the U.S. intelligence community that is now part of Intelligence Advanced Research Projects Activity), a team of distinguished experts in quantum information established a road map for quantum computing. It had a goal for 2012 that “requires on the order of 50 physical qubits” and “exercises multiple logical qubits through the full range of operations required for fault-tolerant [quantum computation] in order to perform a simple instance of a relevant quantum algorithm….” It’s now the end of 2018, and that ability has still not been demonstrated.
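The excerpt doesn't spell out why "on the order of 50 physical qubits" is a meaningful threshold, so here is a standard back-of-the-envelope illustration (my addition, not Dyakonov's): simulating an N-qubit machine on classical hardware means storing 2^N complex amplitudes, which becomes infeasible right around that scale.

```python
# Back-of-the-envelope: memory needed to hold the full state vector of
# an N-qubit quantum computer on a classical machine. Each of the 2**N
# complex amplitudes takes 16 bytes (two 64-bit floats).

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes to hold 2**n_qubits complex128 amplitudes."""
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(f"{n:2d} qubits: {state_vector_bytes(n):,} bytes")

# 50 qubits already requires 2**54 bytes (16 pebibytes) just to write
# down the state, which is roughly why ~50 qubits marks the edge of
# brute-force classical simulation.
```

This is only a storage argument, not a claim about fault tolerance, but it conveys the scale of the goal that the 2012 road map set and that, as the article notes, had still not been met by the end of 2018.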