A consortium of researchers from MIT, ETH Zurich, and a leading quantum hardware company has announced a milestone that experts are calling the most significant advance in computing since the invention of the transistor: the achievement of stable 1000-qubit quantum computation with error rates below 0.1%.
The breakthrough, published simultaneously in Nature and Science, solves the central engineering challenge that has prevented quantum computers from moving beyond laboratory demonstrations — qubit decoherence, the tendency of quantum systems to lose their delicate quantum states through interaction with the environment.
What This Means in Practice
The implications are simultaneously thrilling and concerning. On the positive side, quantum computers of this capability class can simulate molecular interactions with unprecedented accuracy, potentially compressing decades of pharmaceutical research into years. Problems in materials science, climate modelling, and logistics optimisation that are computationally intractable on classical computers become tractable.
On the security side, the same computational power threatens current encryption standards. RSA-2048, which protects most internet traffic and financial transactions, rests on the difficulty of factoring large integers; Shor's algorithm, run on a sufficiently large error-corrected quantum computer, performs that factoring in polynomial time, making the scheme vulnerable to quantum attack at these qubit levels.
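To see why factoring is the whole ballgame, consider a toy RSA sketch with deliberately tiny, insecure primes (all values here are illustrative only; real RSA-2048 uses primes around 1024 bits):

```python
# Toy RSA with tiny primes, purely to illustrate why factoring breaks it.
# Real RSA-2048 uses ~1024-bit primes; these numbers are not secure.

p, q = 61, 53            # secret primes (what an attacker must recover)
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # Euler's totient, computable only if p and q are known
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent: modular inverse of e mod phi (Python 3.8+)

msg = 42
cipher = pow(msg, e, n)   # encrypt with the public key (e, n)
plain = pow(cipher, d, n) # decrypt with the private key d
assert plain == msg

def attack(n_public, e_public, ciphertext):
    # Brute-force trial division stands in for Shor's algorithm at this
    # toy scale: once n is factored, the private key falls out directly.
    f = next(i for i in range(2, n_public) if n_public % i == 0)
    phi_recovered = (f - 1) * (n_public // f - 1)
    d_recovered = pow(e_public, -1, phi_recovered)
    return pow(ciphertext, d_recovered, n_public)

print(attack(n, e, cipher))  # recovers 42 without ever seeing d
```

The attacker never touches the private key; factoring the public modulus alone is enough to recompute it. Classically, that factoring step takes longer than the age of the universe at 2048-bit sizes, which is exactly the assumption Shor's algorithm removes.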