Although World Quantum Day has now passed, it remains a valuable moment to pause and reflect on recent developments and shifting market dynamics.
In just the past few months, we have seen several SPAC announcements, advances in error-correction theory, use-case demonstrations, and new government programs.
Co-founder and CEO of IQM Quantum Computers.
Developing technology
Overall, the sector has developed the technology, production capabilities, and business models to support growing demand for quantum computers. In these early days of the industry, true ownership and ecosystem models will be the key drivers toward quantum advantage.
This is why the choices being made right now, by institutions, by governments, and by companies like ours, will determine whether quantum computing fulfills its potential or simply produces more years of impressive demonstrations.
Every transformative technology goes through the same challenging middle phase. The technology works in science labs. The papers are impressive. The product roadmaps look credible. But the gap between what is demonstrated and what is deployed stays wide open.
We have been in that phase with quantum computing for some years. And the honest reason it persists is not only the hardware or the algorithms. The hardware is improving and the algorithms are getting more efficient. A strong reason is that the industry has not yet agreed on what it means for quantum advantage to actually arrive.
We have been too focused on qubit counts, fidelities, and connectivity while ignoring fundamental questions like adoption models, ownership, and deployment. If customers want to solve real-world problems, they must be able to integrate the new technology into their existing workflows and technology stacks.
This matters more than reaching any particular qubit milestone. But not every business model allows you to fully integrate a new technology into your workflow.
The solution is not only the technology. It is the model.
Cloud access
A widely used commercial model in quantum computing today is cloud access: you pay to run jobs on hardware you do not own, accessed remotely, at a cost that scales with usage. This model serves a real purpose, enabling experimentation, algorithm testing, and early adoption without capital commitment. That is genuinely valuable, and we support it because it lowers the entry barrier for early quantum adoption. It is also essential for quantum education, a topic that is close to my heart.
But cloud access is an entry point, not a destination. And this is not just an argument about where quantum computing is today. It is an argument about where it is going. Even as the technology matures, even as fault-tolerant systems arrive, most serious institutions will want to run their most sensitive and strategically important workloads on IT infrastructure they control themselves. Intellectual property, data security, regulatory compliance, sovereign capability: these considerations do not disappear as the hardware gets better. If anything, they become more pressing as quantum becomes more powerful.
Think about how the internet scaled. The protocols that built it, TCP/IP most importantly, were open standards. Not proprietary. Not controlled by any single vendor. That openness created the conditions for an ecosystem where anyone could build on top of it and no single company could capture all the value or determine who was allowed to participate. The internet did not succeed because one company rented access to the network. It succeeded because the network became infrastructure that institutions could own, operate, and build upon independently.
Quantum computing is approaching the same fork in the road. The question being decided right now, in procurement decisions at national labs and supercomputing centers and, most importantly, among enterprise customers, is whether quantum follows the internet model or a different one entirely.
Who pays the price for slow adoption? Everyone!
The importance of ownership
When institutions stop at cloud access and never move to ownership, several things fail to happen: they do not build internal expertise. They do not develop the operational capability to run workloads at low latency. They do not generate the feedback loop between hardware and applications that drives real progress. And they remain dependent on a vendor’s uptime, pricing, and continued interest in serving them.
The result is that quantum adoption looks wide but runs shallow. Many organizations have run experiments. Very few have built capabilities. And the cost of that gap is not abstract. Drug discovery timelines stay longer than they need to be. Supply chain optimization remains approximate rather than precise. Energy grid modeling stays computationally limited at exactly the moment when the energy transition demands more from it. These are not future problems. They are present ones, and quantum computing is one of the most credible tools we have for addressing them at scale. Every year that adoption stalls is a year those problems compound.
This pattern is well known from other deep-tech cycles. Early industrial computing ran on time-sharing models. You booked time on a mainframe you did not own. It worked, until organizations realized that owning compute gave them fundamentally different capabilities: control over their data, the ability to customize, and the compounding advantage of building institutional knowledge over time. The transition from time-sharing to owned infrastructure was not just a procurement decision. It was what made computing a real industry.
We believe quantum is at that inflection point.
Own the machine. Own the outcome. Build an ecosystem.
We believe in building full-stack quantum computers for supercomputing environments. On-premises quantum systems, where the hardware sits in the customer’s facility, runs on their infrastructure, and operates under their control, enable quantum computing to become a permanent, operational part of an institution’s technical capability. We call this Production Quantum, because it reflects not a remote service you subscribe to but infrastructure you command, with the security, latency, and sovereign control that serious institutions require, today and long after the technology reaches its full potential.
Full-stack matters here. Customers need one partner, with one accountability structure, one support relationship, and one roadmap conversation.
Vertical integration also means supply chain resilience. Quantum hardware depends on specialized components, specialized fabrication processes, and materials that are not widely available. Companies that rely on external chip suppliers inherit that supplier’s bottlenecks, lead times, and strategic priorities. This means chip fabrication ownership is important too. It gives customers a more predictable path to delivery, upgrade, and long-term system evolution, which matters enormously when they are building a capability rather than running an experiment.
What’s also important is co-designing systems with customers from the beginning, aligning the hardware architecture with the actual workloads they need to run. These systems empower customers to build their own solutions and ecosystems around them.
A rising tide lifts all boats. We intend to be the tide.
The second thing we believe, just as strongly, is that no single company can build the quantum industry alone. The question is whether the companies at the center of the field act like platforms or gatekeepers. We have made a clear choice to provide platforms and enable ecosystems around us and around our customers.
We believe the quantum ecosystem grows faster when its foundations are shared, and that a broader ecosystem ultimately creates more demand for what we build, not less. The evidence supports this. In Finland, the quantum ecosystem grew from one company in 2018 to eleven by 2024. External funding attached to the ecosystem grew from zero to hundreds of millions of dollars over the same period.
In Bavaria, a similar pattern has emerged: more companies, more employees, more capital, more activity than in comparable regions where quantum development has stayed more closed. On-premises quantum systems act as seeds. They attract software developers, algorithm researchers, and application companies. They train the engineers the industry needs. They create the feedback loops that accelerate hardware development.
Open standards and tooling amplify adoption by enabling hands-on use, capability building, and full control over quantum infrastructure. This is how quantum computing moves from specialized research to general-purpose industrial capability.
What this means for humanity, not just the industry
I am a physicist. I came to entrepreneurship through science, not through business school. And I think my scientific background shapes how I look at what we are building and why it matters beyond the commercial case.
There are problems that classical computers will never solve. Not because we have not tried hard enough, but because the computational complexity of certain problems scales in ways that make them structurally intractable on classical hardware.
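To make that concrete, here is a back-of-the-envelope sketch (illustrative numbers only, not tied to any specific system or workload) of why exact classical simulation of quantum systems hits a wall: the memory needed to store a full n-qubit statevector doubles with every added qubit.

```python
# Back-of-the-envelope: memory required to hold a full n-qubit
# statevector in classical RAM, assuming one complex128 amplitude
# (16 bytes) per basis state. Illustrative only.

def statevector_bytes(n_qubits: int) -> int:
    """Bytes needed for 2**n_qubits complex amplitudes at 16 bytes each."""
    return (2 ** n_qubits) * 16

for n in (30, 40, 50):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"{n} qubits -> {gib:,.0f} GiB")

# Output:
# 30 qubits -> 16 GiB          (fits on a laptop)
# 40 qubits -> 16,384 GiB      (~16 TiB, a large cluster)
# 50 qubits -> 16,777,216 GiB  (~16 PiB, beyond any single machine)
```

The exponent, not a lack of engineering effort, is what makes exhaustive classical approaches to these problems structurally intractable.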
Molecular simulation for drug discovery. Optimization at industrial scale in logistics, energy, and finance. Materials design for next-generation batteries and semiconductors. These are problems where meaningful progress could reduce human suffering, lower the cost of energy, and accelerate scientific discovery in ways that compound over decades.
Quantum computing is the most credible path to making those problems tractable. The mathematics is clear. What remains hard is the engineering, the deployment, and the ecosystem development required to turn that mathematical potential into operational reality. Production Quantum is not just the right commercial strategy. It is also the model most likely to produce the breadth of deployment that makes those broader applications possible.
There is also a talent dimension that does not get enough attention. There is a shortage of quantum physicists and engineers today. Real hardware in institutions is what educates the next generation. You cannot build a quantum workforce on cloud access alone. You need systems in labs, in universities, in national facilities, operated by people who develop genuine hands-on expertise. Every system we deliver is, in that sense, also an investment in the human capital the field needs to fulfill its potential.
What we have built. What comes next.
Quantum hardware is still technically fragile. The road to fault-tolerant systems at scale is long. I have no interest in pretending otherwise.
But I am confident in the direction. The quantum era does not begin when the technology works perfectly in a lab. It begins when institutions own it, operate it, and build on it. That is what Production Quantum means, and that is the progress worth measuring.
This article was produced as part of TechRadar Pro Perspectives, our channel to feature the best and brightest minds in the technology industry today.
The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/pro/perspectives-how-to-submit
