Is Quantum Computing a Bubble?

Tens of billions of public and private dollars are betting on a claim about reality that has never been verified.

The Quantum Computing Bubble

There’s a question nobody in physics seems to ask:

Have you ever experienced any moment other than right now?

Not remembered the past. Not imagined the future. Actually been there.

No.

And neither has any instrument, experiment, or observation in the entire history of physics. Every measurement that has ever occurred has occurred now. Whatever “the past” is, it’s a record. Whatever “the future” is, it’s a set of constrained possibilities. The present is the only thing science has ever directly verified.

And yet tens of billions of dollars, public and private, are currently betting on a claim about reality that has never been directly verified:
that physical systems literally exist in multiple states at once until something causes them to collapse into a measurable result.

Read that again.

Governments and companies like Google, IBM, and Microsoft are pouring money into machines whose foundational promise depends on a metaphysical claim about reality that physics has never settled, cannot currently test, and has never shown to be possible.

The machines look very impressive. Lots of tubes. Very cold. Extremely expensive.

What They Promise

The pitch goes like this: classical computers use bits, which are ones and zeros. Quantum computers use qubits that can be one and zero simultaneously. This superposition supposedly lets them explore all possible solutions at once, delivering computational leverage classical computers can’t match.

Cryptography cracked. Drug discovery transformed. AI achieving unprecedented intelligence. All thanks to exploiting this amazing property of quantum weirdness.

Working physicists may hedge on the metaphysics. But the investment decks don’t. The popular narrative, the one in press releases, TED talks, and funding pitches, sells a revolutionary break from classical computing, grounded in the idea that superposition is a real physical state.

It sounds like science fiction because it is built on a premise that has never been empirically verified.

The Problem Is Deeper Than You Think

Most critiques of quantum computing hype focus on the Many Worlds Interpretation (MWI), the idea that every quantum event spawns a new universe. That’s an easy target. It’s unfalsifiable by design. You can’t visit these other universes. You can’t even detect them.

But the MWI isn’t the real problem.

The real problem is an assumption that every mainstream interpretation of quantum mechanics quietly shares:

Superposition is treated as physically real.

The Copenhagen Interpretation says particles literally exist in multiple states until measurement collapses the wave function. MWI says they exist in multiple states and never collapse because every possibility plays out somewhere else. Decoherence says environmental interaction gradually destroys the superposition.

They disagree about what happens to superposition.
But they all base their theories on it being physically there.

And that matters, because the entire quantum computing value proposition rests on it.

If a qubit is physically in both states simultaneously, you can argue for computational parallelism. If a qubit is not physically in both states, if superposition is a mathematical description of a set of possibilities rather than a real physical multiplicity, then the scaling promises built on that story need to be reexamined.
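
To see the distinction concretely, here is a minimal, interpretation-neutral sketch of the textbook qubit formalism (illustrative only; nothing beyond standard quantum mechanics is assumed). The state is a pair of complex amplitudes, the Born rule turns those amplitudes into probabilities, and any single measurement yields exactly one outcome. Whether the amplitudes describe a physical multiplicity or a constrained potential is exactly the question at stake.

```python
# Minimal, interpretation-neutral sketch of the textbook qubit formalism:
# a qubit "in superposition" is described by two complex amplitudes, and a
# measurement yields exactly one outcome with Born-rule probabilities.
import numpy as np

rng = np.random.default_rng(0)

# Equal superposition of |0> and |1>: psi = (|0> + |1>) / sqrt(2)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Born rule: outcome probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2          # -> [0.5, 0.5]

# Any single measurement actualizes exactly one outcome, never both.
outcome = rng.choice([0, 1], p=probs)
print(probs, outcome)
```

Nothing in that code, or in the formalism it mirrors, settles which reading is correct.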

The Assumption Nobody Examines

There’s a simpler question physics has been avoiding:

What if only the present moment actually exists?

Your breakfast isn’t still sitting somewhere in the past, uneaten. The energy it contained has been transformed. The past is not a place where things still exist, just as you can’t jump into a photograph and relive it. The future is not a place where things are waiting for their moment to become actual. Only now is where anything becomes actual.

This isn’t mysticism. It’s the most empirically grounded statement available. Every observation happens now. Every experiment happens now. Every measurement happens now.

Superposition is not particles existing in multiple physical states. The wave function is not a physical thing spread across reality. It is a mathematical description of constrained possibility: a lawful landscape of what could actualize next.

A weather forecast that says “60% chance of rain” doesn’t mean it is already 60% raining. It means the system’s constraints make rain more likely than not. When tomorrow arrives, it either rains or it doesn’t. The forecast was never a physical state; it was math describing constrained potential.
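
The same point in code, kept deliberately classical (no quantum mechanics involved, just a hypothetical forecast number): the probability describes constrained potential; the sampled outcome is the only thing that ever actualizes.

```python
# Classical illustration only: a forecast probability describes constrained
# potential; it is not a partially-actual state of the weather.
import random

random.seed(1)
p_rain = 0.60                      # "60% chance of rain"

# When tomorrow arrives, exactly one outcome actualizes.
rains = random.random() < p_rain
print("it rained" if rains else "it stayed dry")
```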

When measurement occurs, one possibility actualizes. Not because parallel universes branch, not because observation collapses reality, not because we destroyed a physical superposition, but because that is what the present does:

It is where potential becomes actual.

No collapse required. There was never a physical superposition waiting to collapse. There is only reality continuously actualizing, one present moment to the next.

DPΦ in One Paragraph

Dynamic Present Theory (DPΦ) takes the primacy of the present seriously and formalizes it.

Reality actualizes through Continuous Present Actualization (CPA), governed by the Principle of Efficient Actualization (PEA): actualization follows the path of least informational cost under constraint.

The CPA + C Rate Law (where C = Constraint Load) quantifies how the local rate and stability of coherence vary with energy density and informational constraint.

A Framework With Receipts

This might sound like philosophy. It’s not. It’s a physics framework with four independent domains of empirical support, each showing the same functional form.

The same equation. Different substrates. Verified results:

  • Glass Transition
    For a century, the Vogel–Fulcher–Tammann equation has described how liquids become glass, but nobody knew why it worked. The CPA + C rate law derives it from first principles, with independently verified fit quality (e.g., R² > 0.99 across ~14 orders of magnitude); a sketch of the VFT functional form itself appears below.

  • AI Coherence (LLMs under constraint)
    Large language models exhibit entropy reduction with the same functional form: as constraints increase, uncertainty decreases along the same curve. Correlation: r = 0.957, independently verified across 240,313 data points.

  • Cosmology
    Joint analyses across benchmark datasets show a >10σ preference for the DPΦ framework over standard cosmology. The Hubble tension, one of physics’ most significant unresolved crises, resolves naturally. The cosmological constant emerges from first principles, with the data preferring the CPA + C Rate Law over dark energy at 7.18σ.

  • Number Theory
    Even in pure mathematics, the energy trajectory of the Chowla cosine problem follows the same CPA + C form to high precision (R² = 0.973).

When one equation predicts behavior in glass, language models, cosmology, and pure mathematics, coincidence stops being a satisfactory explanation.
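
For the glass-transition entry, here is a minimal sketch of the Vogel–Fulcher–Tammann functional form itself, fit to synthetic data. It is illustrative only: the parameters and the data are invented for the example, and the CPA + C derivation claimed by DPΦ is not reproduced here.

```python
# Sketch of the Vogel-Fulcher-Tammann (VFT) functional form referenced in the
# glass-transition bullet. Illustrative only: synthetic data and made-up
# parameters; the DPhi / CPA + C derivation is not reproduced here.
import numpy as np
from scipy.optimize import curve_fit

def vft(T, log_eta0, B, T0):
    """VFT on a log scale: log10(viscosity) = log_eta0 + B / (T - T0)."""
    return log_eta0 + B / (T - T0)

# Synthetic "measurements" generated from assumed VFT parameters.
true_params = (-4.0, 600.0, 150.0)          # log_eta0, B [K], T0 [K]
T = np.linspace(200.0, 400.0, 50)           # temperatures well above T0
log_eta = vft(T, *true_params) + np.random.default_rng(0).normal(0, 0.05, T.size)

fitted, _ = curve_fit(vft, T, log_eta, p0=(-3.0, 500.0, 120.0))
print("fitted (log_eta0, B, T0):", fitted)
```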

What This Means for Quantum Computing

To be clear: quantum computers are real. They exploit real quantum effects: interference patterns, correlations, coherence phenomena. Nobody disputes that.

What’s disputed is the interpretation, and interpretation determines what you believe is possible at scale.

Mainstream narratives assume these effects are powered by manipulating physical superpositions: states that are really there in multiple forms simultaneously.

DPΦ says something different: these effects come from manipulating the structure of constrained potential before actualization occurs. That single distinction predicts radically different scaling limits. And it explains the industry’s most persistent problem.

Why Decoherence Won’t Be “Solved”

For decades, decoherence has been treated as the engineering obstacle standing between today’s quantum prototypes and a scalable revolution. The assumption is:

  • superposition is the natural physical state, and

  • decoherence is noise that can eventually be engineered away.

The industry keeps telling us that with enough engineering, and many more billions of dollars spent on better isolation, colder temperatures, and smarter error correction, coherence can eventually be maintained at massive scale.

But if DPΦ is right, decoherence isn’t simply noise.

It is reality doing what reality does: turning potential into actual, continuously, everywhere.

What quantum engineers call decoherence is simply Continuous Present Actualization (CPA) asserting itself.

The CPA + C rate law predicts that maintaining coherence has genuine informational costs that scale with system complexity. Not merely because of engineering limits, but because of physics.

You’re not fighting noise. You’re fighting the universe’s default behavior.

That’s like spending thirty years trying to prevent ice from melting by building better freezers, when the real problem is that you built the lab inside an active volcano.

This is why, decades into the quantum computing revolution, practical large-scale applications remain perpetually on the horizon. It may not be an engineering problem. It may be that the premise being sold, physical superposition as scalable computation, was wrong from the start.

If coherence is structured potential rather than an ontic physical state, then there are hard limits that no amount of cooling, shielding, or error correction can overcome. The cost of constraint rises with the scale of the system you’re trying to keep in potential. Not linearly, exponentially.
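
To make that scaling intuition concrete without invoking DPΦ at all, here is a textbook-level sketch under standard assumptions (independent dephasing with a shared T2, a GHZ-type entangled register; the T2 value is made up). The description of an n-qubit state grows as 2^n, and the fragile multi-qubit coherence decays roughly n times faster than a single qubit’s. This is not the CPA + C rate law, which is not reproduced here; it only shows that, even in the standard picture, the cost of holding a large system "in potential" worsens steeply with size.

```python
# Textbook-level scaling sketch (NOT the CPA + C rate law):
#  - a general n-qubit state needs 2**n complex amplitudes to describe, and
#  - under independent dephasing, an n-qubit GHZ state's coherence decays
#    as exp(-n * t / T2), i.e. n times faster than a single qubit's.
import numpy as np

def state_space_dimension(n_qubits: int) -> int:
    return 2 ** n_qubits

def ghz_coherence(n_qubits: int, t: float, T2: float = 100e-6) -> float:
    # T2 = 100 microseconds is an arbitrary illustrative value.
    return float(np.exp(-n_qubits * t / T2))

for n in (1, 10, 50, 100):
    print(n, state_space_dimension(n), ghz_coherence(n, t=10e-6))
```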

What We Actually Need

Here’s the uncomfortable truth: while billions flow into quantum computing based on unexamined metaphysics, more practical advances are receiving less attention than they deserve:

  • Photonic computing (using light instead of electrons) offers massive efficiency gains without requiring temperatures colder than outer space.

  • Neuromorphic chips mimic the neural architectures that already work in nature.

  • Advanced classical algorithms are already closing the gap on many problems quantum machines were supposed to dominate.

The hype machine favors the exotic over the practical. “Quantum supremacy” makes a better headline than “efficient constraint-based computation.”

If DPΦ is right, the path forward isn’t harnessing imaginary parallel states. It’s building systems that work with the physics of continuous actualization rather than fruitlessly fighting against it.

The Invitation

Over $54 billion in government commitments. Billions more in private capital. Stock surges of thousands of percent. A bubble forms when money prices a captivating story as fact.

Quantum computing’s story, physical superposition as computation, is not a verified description of physical reality. It’s an interpretation. A profitable one. A seductive one. But still an interpretation.

Meanwhile, a framework that treats the most obvious fact seriously, that only now is measurably or observably real, predicts consistent behavior across multiple substrates with a single shared equation.

You don’t have to take anyone’s word for it. The papers and datasets are publicly available. The math is testable. And the present keeps doing what it has always done:

Not branching. Not waiting. Not collapsing.

Becoming.

DynamicPresentTheory.com
The math abides.
