Google just issued a warning with serious implications for the cybersecurity world: “Q-Day” — the moment when a quantum computer becomes powerful enough to crack today’s best encryption — could arrive as soon as 2029. That’s not the mid-2030s timeline most experts had been citing. That’s three years from now. Google Quantum AI also released a new whitepaper indicating that not even elliptic curve cryptography is safe from the quantum computers of the future.
Recent research from Bain & Company found that 90% of organizations don’t yet have systems in place to defend against quantum security threats, despite the fact that 71% of those same organizations expect quantum-enabled attacks within five years. Only one in ten had a roadmap in place. Most were waiting to see what happened, hoping someone else would solve the problem first.
The combination of a dramatically compressed timeline and a near-total absence of organizational readiness is what makes this moment so consequential for the enterprise C-suite. Because the uncomfortable truth is that your organization’s encrypted data may already have been stolen, stored patiently, cheaply, and indefinitely, waiting for the moment it can be cracked open like a safe whose combination has finally been figured out.
This threat has a name: harvest now, decrypt later. And if your organization handles any information that needs to remain confidential for the next five to ten years, you are already exposed. Everything from patient records and accounting data to financial transactions, legal communications, intellectual property, and strategic plans is potentially at risk.
To understand why this matters, you need to understand, at least conceptually, how today’s internet security works.
Almost everything we do online is protected by encryption that relies on extraordinarily difficult math problems. The security of your bank, your email, your cloud systems, your VPN — all of it depends on the fact that solving problems like factoring a very large number into its prime components, or solving the discrete logarithm problem on an elliptic curve, is, for a conventional computer, practically impossible. These are the locks. And for decades, they have held.
Quantum computers break these locks. They operate on fundamentally different principles, exploiting the unique behavior of quantum physics to solve certain mathematical problems exponentially faster than any classical machine. One of the earliest algorithms ever written for quantum computers is Shor’s algorithm, developed in 1994, and it targets precisely the mathematical problems that underlie RSA encryption and the elliptic curve cryptography that protects your banking system, your medical records, and your communications infrastructure.
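The trapdoor at the heart of RSA can be shown in a few lines of Python. This is a toy sketch with textbook-sized primes, purely to illustrate the principle; real keys use primes hundreds of digits long, and the specific numbers below are illustrative assumptions, not anything resembling a secure key.

```python
# Toy RSA, solely to illustrate why its security rests on factoring.
# Never use numbers this small (or textbook RSA at all) in practice.

def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y == g.
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

p, q = 61, 53                  # secret primes (tiny, for demonstration)
n = p * q                      # public modulus: 3233
e = 17                         # public exponent
phi = (p - 1) * (q - 1)        # known only to whoever knows p and q
d = egcd(e, phi)[1] % phi      # private exponent: e*d ≡ 1 (mod phi)

msg = 42
cipher = pow(msg, e, n)        # anyone can encrypt with the public key
plain = pow(cipher, d, n)      # only the private key decrypts
assert plain == msg

# What Shor's algorithm does at scale: recover p and q from the public
# n alone, which lets an attacker rebuild d. Trivial by brute force
# here; classically infeasible for a 2048-bit n, but not for a
# sufficiently large quantum computer.
p_found = next(i for i in range(2, n) if n % i == 0)
q_found = n // p_found
assert {p_found, q_found} == {p, q}
```

The asymmetry is the whole point: multiplying `p` and `q` is instant, while recovering them from `n` is what keeps the private key private. Shor’s algorithm erases that asymmetry.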
The quantum computers powerful enough to exploit this don’t fully exist yet. In my view as someone with experience in software security, the trajectory is moving faster than most people expected, much like AI did. Nation-states, particularly China, are investing enormous sums to get there first. And here’s the critical insight everyone needs to internalize: adversaries don’t need a quantum computer today to begin exploiting this vulnerability. They just need cheap storage and patience.
They are collecting your encrypted data now, warehousing it, and waiting. When quantum capability arrives, everything harvested today becomes readable plaintext. Imagine the entire internet reverting to the 1990s, when Telnet transmitted everything in the clear and anyone on the network could read your communications as they passed over the wire. That is the future being prepared for, unless we act.
The good news is that cryptographers have been working on this problem for nearly a decade, and the solution exists. It is called Post-Quantum Cryptography, or PQC, and it represents a new generation of encryption algorithms designed to be secure against both conventional and quantum computers.
Instead of basing security on problems like prime factoring that quantum computers are good at breaking, PQC uses entirely different mathematical structures, such as lattices, hash functions, and error-correcting codes, that resist quantum attacks. Think of it as replacing every lock on the internet with a completely different style of lock, one the quantum computer’s tools simply aren’t designed to pick, giving it no advantage over a conventional machine.
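To make the hash-based branch of that family concrete, here is a minimal Lamport one-time signature, the 1979 ancestor of modern hash-based schemes like SLH-DSA. It is a sketch for intuition only, not the SLH-DSA algorithm itself, and it carries the classic Lamport caveat that each key pair may sign exactly one message. Its security rests solely on the hash function, which quantum computers do not break efficiently.

```python
import hashlib
import secrets

# Illustrative Lamport one-time signature. Not production code:
# each key pair must sign ONE message only, and keys are large.

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[H(a), H(b)] for a, b in sk]
    return sk, pk

def digest_bits(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg: bytes, sk):
    # Reveal one secret from each pair, chosen by the digest bits.
    return [sk[i][bit] for i, bit in enumerate(digest_bits(msg))]

def verify(msg: bytes, sig, pk) -> bool:
    bits = digest_bits(msg)
    return all(H(sig[i]) == pk[i][bits[i]] for i in range(256))

sk, pk = keygen()
sig = sign(b"quarterly results", sk)
assert verify(b"quarterly results", sig, pk)
assert not verify(b"tampered message", sig, pk)
```

The design choice worth noticing is that no number theory appears anywhere: forging a signature requires inverting the hash function, a problem against which Shor’s algorithm is useless.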
In 2024, after an eight-year global competition that invited the world’s cryptographers to submit their best candidates and then try to break each other’s work, the US National Institute of Standards and Technology (NIST) finalized the first official PQC standards. Three algorithms are now approved: ML-KEM (FIPS 203) for securing key exchanges, ML-DSA (FIPS 204) for digital signatures, and SLH-DSA (FIPS 205) as a hash-based backup. Major technology companies (Apple, Google, Cloudflare, Signal and my own company, CIQ, to name a few) have already begun integrating them.
NIST, the Cybersecurity and Infrastructure Security Agency (CISA), and the U.K.’s National Cyber Security Centre (NCSC) have all published a migration timeline that every organization should be planning against: active migration of sensitive systems by 2030, with full deprecation of vulnerable legacy algorithms by 2035. That 2030 date is rapidly becoming the regulatory forcing function across federal agencies, financial services, healthcare, and defense. The decisions your organization makes in 2026 determine what you will actually be able to deploy in 2028 and 2029. Budget cycles, procurement timelines, and infrastructure refresh schedules mean the runway is shorter than it looks, and as Google has warned, it is getting shorter every day.
PQC migration cannot be delegated to the security team and forgotten. It is a business continuity, regulatory, and strategic risk issue that must get the attention of corporate leadership as a matter of urgency.
Consider the scope. Every organization that uses digital technology has what security professionals call cryptographic assets: every place where encryption is used to protect data, verify identity, or establish trust. Encrypted databases, cloud storage, authentication systems, digital certificates, payment terminals, medical devices, and industrial control systems all qualify. The average large enterprise has thousands of these assets spread across systems, vendors, applications, and hardware, many of which are invisible to central IT.
The first step for every organization is a cryptographic inventory: a systematic effort to find every algorithm, identify where it is used, and determine if it is safe to use in a post-quantum world. Most organizations will be surprised by what they find. Legacy systems running deprecated algorithms nobody knew about. Third-party dependencies with no PQC roadmap. Embedded hardware that cannot be patched and must be physically replaced.
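A first pass at that inventory is often a simple triage: gather (system, algorithm) pairs from scans, certificate stores, and vendor questionnaires, then sort them by quantum exposure. The sketch below assumes a flat list of assets with hypothetical `system` and `algorithm` fields; real inventories are larger and messier, but the classification logic is the same.

```python
# First-pass triage for a cryptographic inventory. The asset list and
# field names are illustrative assumptions, not a real tool's schema.

QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "DH", "DSA"}  # broken by Shor's algorithm
PQC_SAFE = {"ML-KEM", "ML-DSA", "SLH-DSA"}                  # NIST FIPS 203/204/205

inventory = [
    {"system": "vpn-gateway",   "algorithm": "RSA"},
    {"system": "code-signing",  "algorithm": "ECDSA"},
    {"system": "new-tls-stack", "algorithm": "ML-KEM"},
    {"system": "legacy-plc",    "algorithm": "proprietary"},
]

def triage(assets):
    report = {"migrate": [], "ok": [], "review": []}
    for asset in assets:
        if asset["algorithm"] in QUANTUM_VULNERABLE:
            report["migrate"].append(asset["system"])
        elif asset["algorithm"] in PQC_SAFE:
            report["ok"].append(asset["system"])
        else:
            report["review"].append(asset["system"])  # unknown: investigate
    return report

report = triage(inventory)
assert report["migrate"] == ["vpn-gateway", "code-signing"]
assert report["ok"] == ["new-tls-stack"]
assert report["review"] == ["legacy-plc"]
```

The “review” bucket is usually the most revealing one: it is where the deprecated legacy algorithms, opaque vendor firmware, and unpatchable embedded hardware described above tend to surface.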
I know this firsthand. My own recent work involved taking PQC algorithm code that was mathematically correct and fully functional, then spending months on the unglamorous engineering work required to make it pass certification so it is actually deployable in regulated environments. That meant hunting down intermediate copies of sensitive data left in memory by an automated code conversion process, writing self-test routines that prove the algorithm behaves correctly on every load, and adding key validation procedures that produce no visible features and impress nobody at a demo. That work is representative of what the migration requires at every layer of the stack. Working code and deployable, compliant code are very different things, and the distance between them is measured in months of painstaking engineering.
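The self-test routines mentioned above are typically known-answer tests (KATs): at load time, the module runs each primitive against a fixed published test vector and refuses to operate if the output differs. Sketched here with SHA-256 and the well-known FIPS 180 “abc” vector; a certified PQC module performs the equivalent check for every approved algorithm it ships, but the structure is no more glamorous than this.

```python
import hashlib

# Known-answer test (KAT) sketch. The SHA-256 digest of b"abc" is the
# standard FIPS 180 test vector; a real certified module runs a KAT
# like this for every algorithm, on every load, before doing any work.

KNOWN_ANSWER = bytes.fromhex(
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"
)

def power_on_self_test() -> None:
    # On mismatch, a real module enters an error state and refuses
    # all further cryptographic operations.
    if hashlib.sha256(b"abc").digest() != KNOWN_ANSWER:
        raise RuntimeError("cryptographic self-test failed; module disabled")

power_on_self_test()  # called once at module load, before any crypto use
```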
Hardware deserves special attention in this context. Inexpensive IoT devices like connected sensors, cameras, and industrial controllers often contain software implementations that cannot be upgraded. They must be physically replaced. For organizations with large operational technology estates, this is potentially the most expensive and time-consuming element of the entire migration, and it requires capital budget decisions that need to start now.
Every organization sits somewhere on an urgency spectrum. At the highest end are defense contractors, financial institutions, healthcare systems, and critical infrastructure operators. Here, the threat is existential, and regulatory pressure is already arriving in RFP language, audit questionnaires, and compliance frameworks.
The National Security Agency’s Commercial National Security Algorithm Suite 2.0 (CNSA 2.0) sets formal transition milestones beginning in 2027, with full migration targeted by 2035, specifically for National Security Systems. The underlying logic of those timelines applies equally to any organization handling sensitive long-lived data, regardless of whether they are subject to national security classification requirements.
Mid-market companies face a different but equally real pressure. Their enterprise customers will increasingly require demonstrated PQC readiness as a condition of doing business, and vendor questionnaires are already appearing. Supply chain cryptographic risk flows downstream, and mid-market organizations sitting in regulated supply chains should treat their customers’ compliance timelines as their own. A 2022 Linux Foundation study found that 70-90% of any given software code base is made up of open source components, many of which are implementations of cryptographic algorithms. Unless your organization has experience in managing and patching the open source code used in this software, you may inadvertently be using non-compliant code.
Smaller organizations running standard cloud platforms will find that much of their migration happens automatically as vendors upgrade their infrastructure. Their primary obligation is to stay current and ensure nothing in their environment relies on outdated, unsupported systems.
As for cost: industry analysts estimate that full PQC migration programs for large enterprises will run into the tens to hundreds of millions of dollars over five to ten years, depending heavily on legacy system complexity. Mid-market organizations should plan for a materially smaller but still significant investment. The single most important cost variable is timing. The cryptographic expertise required to execute these migrations is already scarce, and demand will intensify sharply as regulatory deadlines approach. Organizations that begin now will pay significantly less in consulting fees, implementation corrections, and regulatory exposure than those that wait until urgency forces their hand.
There is one final dimension of this issue that deserves executives’ attention, because it reframes PQC from a compliance exercise into a strategic risk.
The race to quantum computing capability is a geopolitical competition with asymmetric stakes. The nation or bloc that achieves cryptographically relevant quantum computing first gains an intelligence and economic advantage with no historical parallel: the ability to retroactively read the encrypted communications of governments, militaries, financial systems, and corporations going back years or decades. Western intelligence agencies, including the NSA and CISA, along with their Five Eyes partners, have publicly assessed that adversarial nation-states are actively conducting harvest-now-decrypt-later operations against high-value targets today. This is stated, on-the-record intelligence assessment, and leadership should treat it accordingly.
The right question management needs to ask is: “If a credible quantum breakthrough were announced tomorrow, what is our exposure, and what is our response plan?”
The internet is in the middle of the largest planned security transition in its history. The standards exist. The timeline is published. The regulatory direction is clear. What remains is execution and the organizational will to begin.
For executives, the immediate priorities are straightforward: commission a cryptographic inventory, assign ownership at the leadership level, press vendors for their PQC roadmaps, and budget for the hardware that will have to be physically replaced.
Google’s revised Q-Day estimate of 2029 should serve as a wake-up call. The organizations that respond to it with urgency and discipline will complete their migrations with time to spare. Those that wait will find themselves scrambling against compressed timelines, tighter talent markets, and regulators with limited patience for organizations that had ample warning.
The algorithms are being replaced. The only question is whether your organization will lead that transition or be caught unprepared when quantum computers break the existing security we have all depended on until now.