A talk for the curious and the confused
A story about two old men arguing about whether God plays dice - and how we're building computers out of the answer.
"If you think you understand quantum mechanics, you don't understand quantum mechanics." - Richard Feynman
We're going to throw around some terms. Here's what they actually mean - click each one to go deeper.
The smallest possible unit of something. Nature comes in chunks, not smooth flows.
A particle can be in multiple states simultaneously - until you look at it.
Two particles linked so that measuring one instantly affects the other - anywhere in the universe.
Fire particles one at a time through two slits. When nobody's watching, they create an interference pattern - as if each particle went through both slits. Add a detector to watch which slit it goes through, and the pattern disappears. The particle knew it was being observed.
A quantum computer isn't a faster classical computer. It's a fundamentally different kind of machine - like comparing a car to teleportation.
This is not the story of engineers building a better computer. It's the story of two old men arguing about whether God plays dice - and how that argument accidentally produced one of the most powerful computational tools in history.
Google and IBM use superconducting qubits - tiny circuits made from superconducting materials cooled to near absolute zero. The machines look like chandeliers designed by a supervillain.
A qubit starts in a definite ground state - 0. A precisely timed microwave pulse is fired at it, nudging it into an equal superposition of 0 and 1. This operation is called a Hadamard gate. The timing must be accurate to nanoseconds. Too long and it flips to 1. The sweet spot in the middle is superposition.
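The maths behind that pulse fits in a few lines. A minimal numpy sketch of a Hadamard gate acting on a qubit that starts in 0 - state vectors on paper, not real hardware control:

```python
import numpy as np

# Qubit state vectors: |0> = [1, 0], |1> = [0, 1]
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
probabilities = np.abs(state) ** 2  # Born rule: probability = |amplitude|^2

print(probabilities)  # [0.5 0.5] - a 50/50 chance of measuring 0 or 1
```

The qubit isn't "0.5" of anything - it holds both amplitudes at once, and only measurement forces a choice.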
Two qubits are connected by a physical coupling element. A precisely timed sequence of microwave pulses causes them to interact - their quantum states become correlated. The operation is called a CNOT gate. After it, neither qubit has a definite state, but their states are locked together. Measure one and the other is instantly determined.
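On paper, that CNOT step is small linear algebra too. A sketch with numpy, building the textbook Bell state from two qubits - illustrative maths, not real pulse sequences:

```python
import numpy as np

# Two-qubit basis order: |00>, |01>, |10>, |11>
ket00 = np.array([1, 0, 0, 0], dtype=complex)

# Hadamard on the first qubit (identity on the second)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
H_on_first = np.kron(H, np.eye(2))

# CNOT: flips the second qubit exactly when the first is 1
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

bell = CNOT @ H_on_first @ ket00
print(np.abs(bell) ** 2)  # measurement gives 00 or 11, each with probability 0.5
```

Notice what's missing: there is no outcome where the qubits disagree. Measure one and you already know the other.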
The algorithm is a carefully choreographed sequence of quantum gates. Through interference - the same phenomenon that makes noise-cancelling headphones work - wrong answers cancel each other out and the right answer is amplified. When measured, the correct result emerges with overwhelming probability.
"It's like having every possible answer written on a piece of paper, then waving them so the wrong ones cancel out and the right one glows. When you look - you see the glowing one."
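The cancelling trick can be shown in four lines: apply a Hadamard twice and the two paths leading to the "wrong" answer interfere destructively, leaving the right one with certainty. A toy sketch, not a real algorithm:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)

# One Hadamard: equal superposition - both outcomes equally likely.
after_one = H @ ket0

# A second Hadamard makes the two paths to |1> interfere destructively
# (amplitudes +1/2 and -1/2 cancel), while the paths to |0> add up.
after_two = H @ after_one

print(np.abs(after_two) ** 2)  # ~[1. 0.] - the "wrong" answer has cancelled out
```

Real quantum algorithms choreograph exactly this cancellation, just across millions of paths at once.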
Qubits are destroyed by heat, vibration, stray electromagnetic fields, and cosmic rays. This is called decoherence - the qubit's quantum state leaks into the environment. You have microseconds to perform your entire computation. IBM's largest systems must manage heat loads measured in nanowatts. The machines are the most fragile objects humanity has ever deliberately constructed.
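To see why microseconds matter: if a qubit's state decays with coherence time T2, the chance it survives a computation falls off exponentially with the number of gates. The figures below are illustrative ballpark numbers, not any particular machine's specs:

```python
import math

# Rough sketch of decoherence: survival probability ~ exp(-t / T2).
# Assumed illustrative numbers - not measured values from any real device.
T2 = 100e-6        # 100 microseconds of coherence
gate_time = 25e-9  # 25 nanoseconds per gate

for n_gates in (100, 1_000, 10_000):
    survival = math.exp(-n_gates * gate_time / T2)
    print(f"{n_gates:>6} gates -> {survival:.3f} chance the qubit is still coherent")
```

A hundred gates is comfortable; ten thousand and the state has almost certainly leaked away. That gap is why error correction matters so much.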
"We have built the most sophisticated, expensive, and fragile thing humanity has ever constructed, and it loses its mind if someone nearby makes a cup of tea."
The most important milestone in quantum computing so far came in 2024. Google published results demonstrating that as they increased the size of their error-correction code, logical error rates decreased exponentially. This is called being below threshold.
More error-correction qubits → better error rates, not worse. The fundamental physics works. The remaining challenges are purely engineering. This is the Wright Brothers getting off the ground - not for long, not far, not carrying anything useful, but definitively proving the thing works in principle.
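The "below threshold" behaviour has a simple shape. In a toy model (an assumed textbook scaling form, not Google's actual data), once the physical error rate p sits below the threshold p_th, every increase in code distance d multiplies the logical error rate down:

```python
# Toy surface-code scaling: logical error rate ~ A * (p / p_th)^((d+1)/2).
# All numbers here are illustrative assumptions, not measured results.
def logical_error_rate(p, p_th, d):
    return 0.1 * (p / p_th) ** ((d + 1) // 2)

p, p_th = 0.001, 0.01  # physical error rate 10x below threshold (assumed)
for d in (3, 5, 7):
    print(f"distance {d}: logical error rate ~ {logical_error_rate(p, p_th, d):.0e}")
```

Each step up in distance cuts the error rate by another factor of ten. Above threshold, the same formula runs the other way - bigger codes make things worse. That sign flip is the whole milestone.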
You will encounter vendors selling "quantum-proof firewalls," "quantum-safe VPNs," and "quantum-resistant endpoint solutions" with great urgency and fear.
The quantum computers that could threaten today's encryption do not yet exist. Even optimistic estimates put cryptographically relevant quantum computers 10–20 years away. If someone is selling you quantum-proof security today as an emergency - ask them: "quantum-proof against what, exactly?"
"It's like someone in 1985 selling you Y2K insurance. The problem is real - eventually. But if they're charging you a premium to solve it today, check their pockets."
The one legitimate exception: "Harvest now, decrypt later" attacks - collecting encrypted data today to decrypt once quantum computers arrive. For highly sensitive, long-lived secrets (government, defence, medical), post-quantum cryptography migration is genuinely worth planning. For your Tuesday meeting notes - probably not.
IBM has publicly committed to verified quantum advantage by 2026 and fault-tolerant quantum computing by 2029. Whether they hit those dates is an open question - but the fact that a trillion-dollar company is publishing concrete timelines tells you the engineering trajectory is real.
The first practically useful thing a quantum computer will probably do isn't break encryption or revolutionise AI. It'll discover a molecule.
Simulating molecular interactions that classical computers cannot model.
The most transformative material discovery in human history - if we can find it.
The two most transformative technologies of our era are converging.
We are in the vacuum tube era of quantum computing. The trajectory is clear. The destination is not yet in sight. The people saying it'll change everything in 5 years and the people saying it's all hype are both wrong.
The wisest thing you can do is understand the shape of what's coming - not panic, not ignore it, and absolutely do not buy a quantum-proof firewall from someone who learned the word "qubit" last Tuesday.
Every few decades, a material discovery reshapes civilisation. Bronze. Iron. Steel. Silicon. But there is one material we haven't found yet that would make all previous discoveries look incremental. A superconductor that works at room temperature and ambient pressure - a material that conducts electricity with zero resistance under normal conditions - would be, by a wide margin, the most transformative discovery in the history of our species.
This isn't hyperbole. Let's walk through exactly what it would change.
Right now, the global power grid loses approximately 8–15% of all generated electricity as heat during transmission. That's hundreds of billions of kilowatt-hours - wasted. Every year. Burned off as heat in copper and aluminium wires because every conductor we have resists the flow of electrons.
A room-temperature superconductor turns every wire on Earth into a lossless channel. The energy we already generate becomes enough.
Electricity generated in the Sahara by solar farms could be transmitted to London, Tokyo, or São Paulo with zero loss. The constraint that power must be generated near where it's consumed vanishes overnight. Every developing nation gains access to cheap, abundant energy.
Superconductors enable magnetic levitation without continuous energy input. Maglev trains that float on superconducting magnets carrying persistent, lossless currents - near-silent, frictionless, and operating at aircraft speeds. Intercity travel measured in minutes, not hours.
MRI machines currently cost £1–3 million because they require liquid helium cooling for their superconducting magnets. Remove the cooling requirement and the cost collapses by 90%. Every rural hospital, every developing nation, gets access to advanced diagnostic imaging.
The single biggest engineering bottleneck in nuclear fusion is containing plasma at 150 million degrees. This is done with superconducting magnets - currently cooled to near absolute zero at staggering expense. Room-temp superconductors make fusion reactors dramatically simpler, cheaper, and closer to commercial viability.
Every computer processor generates heat because electricity meets resistance. Remove resistance and processors can run faster, denser, and cooler. Quantum computers - which currently need to operate at −273°C - could potentially work at room temperature. The chandelier becomes a desktop.
Combine lossless power transmission with cheaper fusion and solar. Energy becomes so abundant and cheap that carbon capture, desalination, and direct air capture become economically viable at planetary scale. The economics of decarbonisation flip from impossible to inevitable.
Here's the connection. The reason we haven't found a room-temperature superconductor isn't that it can't exist - it's that predicting whether a material will superconduct requires simulating quantum interactions between thousands of electrons simultaneously. Classical computers can't do this. The possible configurations grow exponentially - a system of a few hundred interacting electrons has more possible quantum states than there are atoms in the observable universe.
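The exponential wall is easy to make concrete. Storing the full quantum state of n two-level particles on a classical machine takes 2^n complex amplitudes, at 16 bytes each in double precision:

```python
# Memory needed to hold a full n-qubit state vector classically:
# 2**n complex amplitudes, 16 bytes each (double-precision complex).
def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (10, 30, 50, 100):
    print(f"{n:>3} qubits -> {state_vector_bytes(n):.2e} bytes")
# 30 qubits: ~17 GB, a beefy laptop.
# 50 qubits: ~18 petabytes, a national supercomputing centre.
# 100 qubits: ~2e31 bytes - far beyond any conceivable classical hardware.
```

A quantum computer doesn't pay this cost, because the qubits themselves *are* the state vector.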
A quantum computer simulates these interactions natively - using the same quantum rules the electrons themselves follow. This is exactly what Feynman proposed in 1981. He wasn't imagining a faster calculator. He was imagining a tool that could find the material that changes everything.
And this is where it gets truly extraordinary. A sufficiently powerful quantum computer discovers a room-temperature superconductor. That superconductor makes quantum computers cheaper, more reliable, and operable without cryogenic cooling. Better quantum computers discover better materials. Better materials make better quantum computers. A positive feedback loop between two civilisation-scale technologies, each accelerating the other.
This is not a linear trajectory. This is the ignition point for exponential progress in materials science, energy, medicine, and computation - simultaneously.
Here's the thought that keeps certain physicists (and quite a few game developers) staring at the ceiling at 3am. Superposition tells us that particles don't have definite properties until they're observed. The universe doesn't seem to bother resolving the state of something until something needs to know.
If that sounds familiar, it should. It's exactly how a modern game engine works.
In a video game like Red Dead Redemption 2 or Cyberpunk 2077, the game engine doesn't render the entire world at once. That would be absurdly expensive. Instead, it uses techniques called frustum culling and level-of-detail (LOD) rendering: only what the player's camera can see gets fully rendered. Objects behind you? They don't exist in detail. Turn around, and they pop into high resolution. Look away, and they simplify again.
The game only computes what's being observed.
Objects outside the player's view exist as simplified data - bounding boxes, low-poly placeholders. The moment the camera turns toward them, the engine loads full geometry, textures, and lighting in real time.
Why? Because computing everything everywhere all the time is computationally impossible. You optimise by only computing what's being observed.
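A toy version of that optimisation - frustum culling simplified to a 2D field-of-view check, with made-up object positions. Anything outside the camera's view cone is skipped and never computed:

```python
# Toy frustum culling: is an object inside the camera's field of view?
# Angles are in degrees around the player; all values are hypothetical.
def visible(camera_angle, fov, object_angle):
    half = fov / 2
    # Signed angular difference, wrapped into (-180, 180]
    delta = (object_angle - camera_angle + 180) % 360 - 180
    return abs(delta) <= half

objects = {"tree": 10, "rock": 90, "house": 350}
drawn = [name for name, angle in objects.items()
         if visible(camera_angle=0, fov=60, object_angle=angle)]
print(drawn)  # ['tree', 'house'] - the rock off to the side is never rendered
```

The rock still "exists" as a cheap placeholder; full geometry only loads the instant the camera swings toward it.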
A particle in superposition has no definite state - it exists as a probability cloud across all possible values. The moment a measurement is made (an "observer" interacts with it), the wavefunction collapses to a single definite outcome.
Why? Nobody knows. But the mathematical structure is strikingly similar: don't compute the result until something needs to read it.
The double-slit experiment is the most direct parallel. Fire an electron at two slits with no detector - it passes through both simultaneously and creates an interference pattern. Add a detector to observe which slit it goes through - the interference vanishes. The electron goes through one slit, like a normal object.
The universe appears to be doing exactly what a well-written game engine does: deferring expensive calculations until the result is actually needed. In computer science, this is called lazy evaluation. In quantum mechanics, they call it the measurement problem.
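Lazy evaluation in miniature: a Python generator describes a computation without performing it, and the work only happens when a value is actually requested. A toy parallel, not a claim about how physics is implemented:

```python
# Lazy evaluation: define the whole "world", compute only what's observed.
def render(tile):
    return tile * tile           # stand-in for an expensive calculation

def world_tiles(n):
    for tile in range(n):
        yield render(tile)       # deferred - nothing runs at definition time

world = world_tiles(1_000_000)   # instant: no tiles have been computed yet
first = next(world)              # only now is tile 0 actually "rendered"
second = next(world)             # and now tile 1 - observation drives computation
```

Defining a million tiles costs nothing; each `next()` call is the "measurement" that forces one result into existence.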
There's another strange coincidence. Reality has hard-coded limits. The Planck length (~1.6 × 10⁻³⁵ metres) is the smallest meaningful distance. The Planck time (~5.4 × 10⁻⁴⁴ seconds) is the smallest meaningful interval of time. Nothing can be smaller. These are, in a sense, the pixel size and frame rate of the universe - exactly the kind of hard limits you'd expect in a simulation running on finite hardware.
Even the speed of light starts to look like a maximum data transfer rate - a bandwidth cap baked into the engine.
This is philosophy, not physics. The simulation hypothesis is not falsifiable with current tools - which means it's not, strictly speaking, a scientific theory. The resemblance between quantum mechanics and computational optimisation might be a deep clue about the nature of reality, or it might be a coincidence shaped by the human tendency to see patterns everywhere. What it definitively is: one of the most interesting questions we can't yet answer.
"The universe is under no obligation to make sense to you." - Neil deGrasse Tyson. But it is suspiciously well-optimised.
Google Quantum AI's beginner-friendly guide to qubits, superposition, and why quantum computers are fundamentally different.
quantumai.google →
Verified quantum advantage by 2026. Fault-tolerant quantum computing by 2029. IBM's public timeline and what it means.
forbes.com →
MIT Technology Review on Google's Bristlecone chip, the race for quantum supremacy, and what it actually means for computing.
technologyreview.com →