There are often exaggerated timelines for the arrival of quantum computers that pose a real threat to existing cryptographic systems, leading to calls for an immediate, large-scale migration to post-quantum cryptography.
However, these calls often overlook the costs and risks of premature migration, and also ignore the fact that different cryptographic primitives face completely different risk profiles:
Even though post-quantum encryption is costly, it must be deployed immediately: Harvest-now-decrypt-later (HNDL) attacks are already underway because sensitive data protected by encryption today will still be valuable when quantum computers truly arrive, even decades from now. Despite the performance overhead and implementation risks of post-quantum encryption, HNDL attacks mean there is no alternative for data requiring long-term confidentiality.
The considerations for post-quantum signatures are entirely different. They are not affected by HNDL attacks, and their costs and risks (larger sizes, performance overhead, immature implementations, and potential vulnerabilities) mean that migration should proceed cautiously rather than be rushed.
These distinctions are crucial. Misconceptions can distort cost-benefit analyses, causing teams to overlook more critical security risks, such as ordinary implementation vulnerabilities.
The real challenge in successfully moving towards post-quantum cryptography is aligning "urgency" with "real threats." Below, I will clarify common misconceptions about the quantum threat and its implications for cryptography—including encryption, signatures, and zero-knowledge proofs—with a particular focus on the impact of these issues on blockchain.
Where are we on the timeline right now?
The likelihood of a “quantum computer that poses a real threat to cryptography (CRQC)” emerging in the 2020s is extremely low, despite some high-profile claims that have attracted attention.
Note: in the rest of this article, "CRQC" (cryptographically relevant quantum computer) will be used as shorthand for a quantum computer that poses a real threat to cryptography.
The “quantum computer that poses a real threat to cryptography” mentioned here refers to a fault-tolerant, error-corrected quantum computer capable of running Shor’s algorithm at a sufficient scale to attack elliptic curve cryptography or RSA within a reasonable timeframe (e.g., breaking secp256k1 or RSA-2048 within a maximum of one month of continuous computation).
Based on publicly available milestones and resource assessments, we are still a long way from achieving this kind of quantum computer. Although some companies claim that CRQC is likely to appear before 2030 or even 2035, publicly available progress does not support these claims.
In the context of current architectures—ion traps, superconducting qubits, and neutral atom systems—no quantum computing platform comes close to the hundreds of thousands to millions of physical qubits (the exact number depends on the error rate and error correction scheme) required to run Shor's algorithm to attack RSA-2048 or secp256k1.
The limiting factors are not only the number of qubits, but also gate fidelity, qubit connectivity, and the depth of the error-correcting circuitry necessary for executing deep quantum algorithms. While some systems now have more than 1,000 physical qubits, it is misleading to look at the number alone: these systems lack the connectivity and gate fidelity required to perform cryptographic computations.
While recent systems have come close to the physical error levels that make quantum error correction feasible, no one has yet demonstrated more than a few logical qubits with sustainable error-correcting circuit depth—let alone the thousands of high-fidelity, deep-circuit, fault-tolerant logical qubits actually required to run Shor's algorithm. A huge gap remains between theoretically proving the feasibility of quantum error correction and actually reaching the scale required for cryptographic breaking.
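To make the scale gap concrete, here is a rough back-of-the-envelope calculation, using my own illustrative numbers and the common surface-code rule of thumb; it is a sketch, not a rigorous resource estimate:

```python
# Rough, illustrative arithmetic (surface-code rule of thumb, not a rigorous
# resource estimate): a distance-d surface-code logical qubit uses roughly
# 2*d^2 physical qubits, and published estimates for factoring RSA-2048 call
# for a few thousand logical qubits. The distance needed depends heavily on
# physical error rates; d = 27 is just an assumed example value.
code_distance = 27                        # assumed; depends on physical error rates
physical_per_logical = 2 * code_distance ** 2
logical_qubits_needed = 6000              # order-of-magnitude ballpark for Shor on RSA-2048

total_physical = physical_per_logical * logical_qubits_needed
print(f"{physical_per_logical} physical qubits per logical qubit")
print(f"~{total_physical:,} physical qubits in total")   # millions, vs ~10^3 today
```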
In short: unless the number of qubits and fidelity increase by several orders of magnitude at the same time, a "quantum computer that poses a real threat to cryptography" remains a distant dream.
However, corporate press releases and media reports are easy to misread. Common misconceptions include:
Demonstrations claiming to achieve "quantum advantage" often target artificially constructed problems. These problems are not chosen for their practicality, but rather because they can run on existing hardware while seemingly exhibiting significant quantum speedups—a point often deliberately downplayed in the publicity.
A company claims to have achieved thousands of physical qubits. However, this typically refers to a quantum annealer, not the gate-model quantum computer required to run Shor's algorithm against public-key cryptography.
Companies misuse the concept of "logical qubits." Physical qubits are inherently noisy, and quantum algorithms require logical qubits; as mentioned earlier, Shor's algorithm requires thousands of logical qubits. With quantum error correction, a single logical qubit typically consists of hundreds to thousands of physical qubits (depending on the error rate). Yet some companies have stretched the term absurdly. For example, one company recently claimed to have achieved 48 logical qubits using only two physical qubits per logical qubit through a distance-2 encoding. This is clearly unsound: a distance-2 code can only detect errors, not correct them. Fault-tolerant logical qubits capable of actual cryptographic attacks require hundreds to thousands of physical qubits each, not two.
More generally, many quantum computing roadmaps use "logical qubit" to refer to qubits that only support Clifford operations. These operations can be simulated efficiently by classical algorithms and are therefore insufficient to run Shor's algorithm, which requires enormous numbers of error-corrected T gates (or, more generally, non-Clifford gates).
Therefore, even if a roadmap claims to "reach thousands of logical qubits in a certain year X", it does not mean that the company expects to be able to run Shor's algorithm to break classical cryptography in the same year X.
These practices have severely distorted the public's (and even industry professionals') perception of "how close we are to a true CRQC".
Nevertheless, some experts are indeed excited about the progress. For example, Scott Aaronson recently wrote that, given "the astonishing speed of current hardware development," he now believes it is a real possibility that we will have a fault-tolerant quantum computer running Shor's algorithm before the next US presidential election.
However, Aaronson later clarified that his statement did not imply a cryptographically capable quantum computer: even if a fully fault-tolerant run of Shor's algorithm only succeeded in factoring 15 = 3 × 5, a number you could factor far faster with pen and paper, he would still count his prediction as fulfilled. The bar here is still a micro-scale execution of Shor's algorithm, not a cryptographically significant one; previous quantum factorizations of 15 used simplified circuits, not a fully fault-tolerant Shor's algorithm. The persistent choice of 15 in quantum experiments is also no accident: arithmetic modulo 15 is extremely simple, while factoring even slightly larger numbers (such as 21) is much harder, which is why experiments claiming to factor 21 often rely on hints or shortcuts.
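For intuition, the number-theoretic core of Shor's algorithm for N = 15 can be sketched entirely classically; the only step a quantum computer accelerates is finding the period r, which is trivial here precisely because 15 is so small. A minimal sketch:

```python
# A toy, purely classical illustration of the number-theoretic core of Shor's
# algorithm for N = 15. A real quantum computer would find the period r with a
# quantum Fourier transform; here we brute-force it, which is only feasible
# because 15 is tiny. This is a sketch for intuition, not a quantum algorithm.
from math import gcd

def shor_classical_toy(N: int, a: int) -> tuple[int, int] | None:
    if gcd(a, N) != 1:
        return gcd(a, N), N // gcd(a, N)   # lucky guess already shares a factor
    # Find the period r of f(x) = a^x mod N (the step a quantum computer speeds up).
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    if r % 2 == 1:
        return None                        # odd period: retry with a different a
    x = pow(a, r // 2, N)
    p, q = gcd(x - 1, N), gcd(x + 1, N)
    if p * q == N and p not in (1, N):
        return p, q
    return None

print(shor_classical_toy(15, 7))   # (3, 5): the famous "factor 15" demo
```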
In short, there is no publicly available evidence to support the expectation that a quantum computer capable of breaking RSA-2048 or secp256k1 will emerge within the next 5 years (which is what cryptography is really concerned with).
Even 10 years is still an aggressive prediction. Considering how far we are from a truly cryptographically relevant quantum computer, a timeline of well over ten years is entirely compatible with remaining excited about the progress.
So what does it mean that the US government has set 2035 as the target year for migrating its entire government system to a post-quantum cryptography system? I believe this is a reasonable timeline for completing such a large-scale migration. However, this is not a prediction that "CRQC will emerge at that time."
Where do HNDL attacks apply (and where do they not)?
A "Harvest Now, Decrypt Later" (HNDL) attack refers to an attacker storing all encrypted communications now, waiting to decrypt them someday when a quantum computer, posing a real threat to cryptography, emerges. It's certain that state-sponsored attackers are already archiving encrypted communications of the US government on a large scale, preparing to decrypt them when quantum computers become available. This is why encryption systems must begin migrating today—at least for those entities that need to maintain confidentiality for 10–50 years or more.
But digital signatures—the technology upon which all blockchains rely—are different from encryption: they lack the "confidentiality" that can be compromised after the fact.
In other words, while the arrival of quantum computers will indeed make it possible to forge digital signatures from that moment on, past signatures do not "hide" any secrets the way encrypted messages do. A digital signature that can be confirmed to have been generated before the advent of CRQC cannot have been forged.
Therefore, the migration to post-quantum digital signatures is not as urgent as that of encryption systems.
This is reflected in the actions of major platforms: Chrome and Cloudflare have deployed hybrid X25519+ML-KEM in TLS (Transport Layer Security) encryption on the web. [In this article, I will call these "encryption schemes" for readability, although strictly speaking, secure communication protocols such as TLS use key exchange or key encapsulation mechanisms, not public-key encryption.]
The term "hybrid" here means simultaneously using a post-quantum security solution (ML-KEM) and an existing solution (X25519) to achieve the security of both. This approach aims to prevent HNDL attacks through ML-KEM, while providing traditional security guarantees through X25519 should ML-KEM prove insecure even for current computers.
Apple's iMessage also deploys a similar hybrid post-quantum encryption in its PQ3 protocol, and Signal has implemented this mechanism in its PQXDH and SPQR protocols.
In contrast, the migration of critical web infrastructure to post-quantum digital signatures will be postponed until "truly close to the advent of CRQC" because current post-quantum signature schemes introduce significant performance degradation (discussed later in this article).
zkSNARKs (zero-knowledge succinct non-interactive arguments of knowledge) are central to the future scalability and privacy of blockchains, and with respect to the quantum threat they resemble digital signatures. Even if a zkSNARK is not post-quantum secure as a whole (because it relies on the same elliptic curve cryptography as current encryption and signatures), its zero-knowledge property remains secure against quantum attackers; what a CRQC could eventually break is its soundness.
The zero-knowledge property guarantees that no information about the secret witness will be leaked—even in the face of a quantum attacker—therefore there is no confidential information that can be "collected" in advance and decrypted in the future.
Therefore, zkSNARKs are unaffected by HNDL attacks. Just as non-post-quantum digital signatures generated today are secure, a zkSNARK proof is credible (i.e., the statements in the proof must be true) as long as it was generated before CRQC—even if the zkSNARK uses elliptic curve cryptography. Only after the advent of CRQC could an attacker construct a proof that "appears valid but is actually flawed."
What does this mean for blockchain?
Most blockchains are not vulnerable to HNDL attacks: Most non-privacy chains—such as Bitcoin and Ethereum today—primarily use non-post-quantum cryptography in transaction authorization, meaning they use digital signatures rather than encryption.
To reiterate, digital signatures are not susceptible to HNDL attacks: "collect first, decrypt later" attacks only apply to encrypted data. For example, the Bitcoin blockchain is public; the quantum threat lies in forging signatures (deriving private keys to steal funds), not in decrypting publicly available transaction data. This means that HNDL attacks do not present an immediate cryptographic urgency for current blockchains.
Unfortunately, some credible institutions (including the Federal Reserve) still erroneously claim in their analyses that Bitcoin is vulnerable to HNDL attacks, an error that exaggerates the urgency of a post-quantum cryptography migration.
However, this "reduced urgency" does not mean Bitcoin can wait indefinitely: Bitcoin faces different time pressures due to the enormous social coordination required for protocol upgrades. (Bitcoin's unique challenges will be discussed in more detail below.)
One current exception is privacy chains, many of which hide the recipient and amount through encryption or other means. This type of confidential information can be "collected" in advance and potentially deanonymized afterward once a quantum computer can break elliptic curve cryptography.
For such privacy chains, the severity of an attack varies depending on the chain's design. For example, with Monero's elliptic curve-based ring signatures and key images (a unique, chainable tag for each output used to prevent double-spending), the public ledger alone could be sufficient to reconstruct the entire transaction flow graph in the future. However, in other privacy chains, the extent of damage would be more limited—see the discussion by Sean Bowe, a cryptographic engineer and researcher at Zcash.
If users consider it crucial that "transactions will not be exposed by the advent of quantum computers in the future," then privacy chains should migrate to post-quantum cryptography primitives (or hybrid schemes) as soon as possible. Alternatively, they should adopt an architecture that completely avoids placing decryptable secrets on the chain.
Bitcoin's unique challenges: governance mechanisms + abandoned coins
For Bitcoin, two realities make the migration to post-quantum digital signatures urgent, and neither has anything to do with quantum technology itself. The first is the speed of governance: Bitcoin evolves extremely slowly, and any contentious change on which the community cannot agree risks a destructive hard fork.
The second concern is that Bitcoin's transition to quantum signatures cannot be accomplished through passive migration: coin holders must actively migrate their funds. This means that coins that have been abandoned but are still exposed to quantum threats will not be protected. Some estimates suggest that the number of quantum-fragile and potentially abandoned BTC is in the millions, worth hundreds of billions of dollars at current prices (as of December 2025).
However, the quantum threat won't cause a sudden, catastrophic overnight collapse of Bitcoin; it is more likely to manifest as selective, gradual attacks. Quantum computers won't break every key at once: Shor's algorithm must be run against public keys one target at a time, and early quantum attacks will be extremely costly and slow. Once a quantum computer can break a single Bitcoin signing key, attackers will therefore prioritize the most valuable wallets.
Furthermore, as long as users avoid address reuse and do not use Taproot addresses (which expose public keys directly on-chain), they are essentially protected even if the protocol itself has not been upgraded: their public keys remain hidden behind a hash function until they spend. The public key only becomes public when they finally broadcast a spending transaction, at which point there is a brief race window: honest users need their transactions confirmed as quickly as possible, while a quantum attacker tries to recover the private key before confirmation and spend the coins first. The truly vulnerable coins are therefore those whose public keys have been exposed for years: early P2PK outputs, reused addresses, and Taproot holdings.
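To see why an unexposed public key is shielded for now, here is a small sketch of the hashing behind a pay-to-public-key-hash output; the chain stores only the hash, and Shor's algorithm needs the public key itself. (Base58Check encoding and script details are omitted, and the key bytes are dummy values.)

```python
# The chain stores HASH160(pubkey) = RIPEMD160(SHA256(pubkey)); the public key
# is revealed only when the owner spends. Note: ripemd160 availability depends
# on the OpenSSL build backing hashlib.
import hashlib

def hash160(pubkey: bytes) -> bytes:
    sha = hashlib.sha256(pubkey).digest()
    return hashlib.new("ripemd160", sha).digest()

# A 33-byte compressed secp256k1 public key (dummy bytes for illustration).
pubkey = bytes.fromhex("02" + "11" * 32)
print(hash160(pubkey).hex())   # what the chain sees before the coins are spent
```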
For quantum-fragile coins that have already been abandoned, there are no easy solutions. The main options include:
The Bitcoin community reaches consensus on a "flag day," after which all unmigrated coins are treated as destroyed.
Abandoned coins exposed to quantum risk are left to be seized by whoever controls a CRQC.
The second option raises serious legal and security issues. Using quantum computers to access funds without a private key—even if claimed to be for legitimate ownership or in good faith—would fall under theft and computer fraud laws in many jurisdictions.
Furthermore, the concept of "abandonment" is based on the assumption of inactivity, but no one can know for sure whether these coins have truly lost an active holder with the key. Even if someone can prove they once held these coins, they may not have the legal right to break the cryptographic protections to "reclaim" them. This legal ambiguity makes these abandoned coins, exposed to quantum risks, highly likely to fall into the hands of malicious attackers who disregard legal constraints.
Another unique problem with Bitcoin is its extremely low transaction throughput. Even if the migration plan is finalized, it will still take months at Bitcoin's current transaction rate to move all funds exposed to quantum threats to post-quantum-secure addresses.
These challenges necessitate that Bitcoin begin planning for its post-quantum migration now—not because CRQC is likely to emerge before 2030, but because coordinating governance, reaching consensus, and the technical logistics of actually migrating hundreds of billions of dollars will take many years to complete.
The quantum threat facing Bitcoin is real, but the time pressure stems from Bitcoin's own structural constraints, rather than the approaching quantum computers. Other blockchains also face the problem of quantum-fragile funds, but Bitcoin is particularly unique: its earliest transactions used pay-to-public-key (P2PK) outputs, directly exposing public keys on-chain, leaving a significant proportion of BTC exposed to the quantum threat. Its technological history, coupled with its long chain age, high value concentration, low throughput, and rigid governance, makes the problem exceptionally severe.
It's important to note that the vulnerabilities described above only apply to the cryptographic security of Bitcoin's digital signatures—they do not affect the economic security of the Bitcoin blockchain. Bitcoin's economic security stems from its Proof-of-Work (PoW) consensus mechanism, which is not as vulnerable to quantum attacks as signature schemes, for three reasons:
PoW relies on hash functions, so a quantum attacker can at best obtain the quadratic speedup of Grover's search algorithm, not the exponential speedup of Shor's algorithm (a back-of-the-envelope comparison follows this list).
The actual overhead of implementing Grover's search is so great that it is extremely unlikely that any quantum computer could achieve even a limited real speedup on Bitcoin's PoW.
Even if quantum computers can achieve significant speedups, the effect will only give large miners with quantum computing power a relative advantage, rather than fundamentally undermining the Bitcoin economic security model.
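A back-of-the-envelope comparison, with illustrative numbers of my own; the difficulty, the network hash rate, and the Grover iteration rate below are assumptions, not measurements:

```python
# Grover turns a search over 2^n possibilities into roughly 2^(n/2) iterations,
# but each iteration is a slow, serial, error-corrected quantum circuit, while
# classical mining hardware runs an enormous number of hashes in parallel.
difficulty_bits = 78                 # assume the PoW target needs ~2^78 hashes (illustrative)
classical_hashes = 2 ** difficulty_bits
grover_iterations = 2 ** (difficulty_bits / 2)

classical_rate = 1e21                # assumed network-wide hash rate, hashes/second
quantum_rate = 1e6                   # assumed (optimistic) Grover iterations/second

print(f"classical: {classical_hashes / classical_rate:.2e} s per block search")
print(f"grover:    {grover_iterations / quantum_rate:.2e} s per block search")
# Even granting the quantum machine a huge per-iteration rate, the serial
# Grover search does not obviously out-mine the classical network.
```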
Costs and risks of post-quantum signatures
To understand why blockchains should not rush into deploying post-quantum signatures, we need to consider both the efficiency costs and the fact that our confidence in post-quantum security is still evolving.
Most post-quantum cryptography is based on one of five families of methods: hash functions, error-correcting codes, lattices, multivariate quadratic (MQ) equations, and isogenies.
Why five different families? Because the security of any post-quantum primitive rests on the assumption that quantum computers cannot efficiently solve some particular mathematical problem, and the more structure that problem has, the more efficient the cryptographic protocols we can build on it.
However, this is a double-edged sword: more structure also means a larger attack surface, making the scheme easier to break. This creates a fundamental tension: stronger assumptions yield better performance, but at the cost of a higher chance that the assumptions turn out to be wrong.
Overall, from a security perspective, hash-based methods are the most conservative and robust, since we are most confident that quantum computers cannot attack them efficiently. They are also the least efficient: the NIST-standardized hash-based signature scheme (SLH-DSA, formerly SPHINCS+) produces 7–8 KB signatures even with its smallest parameter sets. Current elliptic curve signatures, by contrast, are only 64 bytes, roughly 100 times smaller.
Lattice schemes are the focus of current deployments: the only key-encapsulation mechanism NIST has standardized (ML-KEM) is lattice-based, as are two of its three selected signature algorithms. One lattice signature, ML-DSA (formerly Dilithium), has a signature size of 2.4 KB at roughly 128-bit security and 4.6 KB at roughly 256-bit security, about 40–70 times larger than current elliptic curve signatures. Another lattice scheme, Falcon, has even smaller signatures (666 bytes for Falcon-512 and 1.3 KB for Falcon-1024) but relies on complex floating-point operations, which NIST itself regards as a major implementation challenge. Thomas Pornin, one of Falcon's designers, called it "the most complex cryptographic algorithm I've ever implemented."
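Putting the sizes quoted above side by side (approximate, parameter-set-dependent figures, with the commonly cited byte counts assumed for each scheme):

```python
# Signature-size overhead relative to a 64-byte elliptic-curve signature,
# using the approximate sizes discussed in the text.
sizes_bytes = {
    "ECDSA/EdDSA (secp256k1/Ed25519)": 64,
    "Falcon-512": 666,
    "ML-DSA-44 (~128-bit)": 2420,
    "ML-DSA-87 (~256-bit)": 4627,
    "SLH-DSA, smallest parameters": 7856,
}
baseline = sizes_bytes["ECDSA/EdDSA (secp256k1/Ed25519)"]
for name, size in sizes_bytes.items():
    print(f"{name:35s} {size:6d} B  ({size / baseline:5.1f}x)")
```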
In terms of implementation security, lattice signatures are much harder to implement correctly than elliptic curve schemes: ML-DSA contains more sensitive intermediate values and complex rejection-sampling logic, all of which must be protected against side-channel and fault attacks. Falcon adds the further complexity of constant-time floating-point arithmetic, and several side-channel attacks against Falcon implementations have already successfully recovered private keys.
The risks posed by these problems are immediate, unlike the distant threat of "quantum computers that pose a real threat to cryptography."
There are good reasons to be cautious about the more efficient post-quantum schemes. Formerly leading candidates such as Rainbow (an MQ-based signature) and SIKE/SIDH (isogeny-based encryption) have been broken classically, that is, by today's computers rather than by quantum computers.
This occurred at a stage where the NIST standardization process was already well underway. This certainly reflects a healthy scientific process, but it also illustrates that premature standardization and deployment can be counterproductive.
As mentioned earlier, web infrastructure is proceeding with the signature migration cautiously. This is noteworthy because once a cryptographic transition on the web begins, it often takes years to complete. Even though hash functions such as MD5 and SHA-1 were officially deprecated by web standards bodies many years ago, the actual migration has dragged on for years and is still not complete in some scenarios; and those algorithms have been outright broken, not merely "potentially breakable someday."
The unique challenges of blockchain vs. web infrastructure
Fortunately, blockchains maintained by open-source communities (such as Ethereum and Solana) are easier and faster to upgrade than traditional web infrastructure. On the other hand, web infrastructure benefits from frequent key rotation, meaning its attack surface changes far faster than early quantum computers could keep up with; blockchains lack this property, because coins and their keys can stay exposed indefinitely. Overall, however, blockchains should still adopt the cautious approach the web is taking toward the signature migration. Both are unaffected by HNDL attacks on signatures, and the costs and risks of prematurely migrating to an immature post-quantum scheme remain significant regardless of how long keys live.
Furthermore, blockchain presents several challenges that make premature migration particularly dangerous and complex: for example, blockchain has unique requirements for signature schemes, especially the need for "rapid aggregation of large numbers of signatures." Currently popular BLS signatures are favored for their efficient aggregation capabilities, but they lack post-quantum security. Researchers are exploring SNARK-based post-quantum signature aggregation schemes. While progress is promising, it remains in its early stages.
Regarding SNARKs themselves, the community currently focuses mainly on hash-based post-quantum constructions. However, a significant shift is on the horizon: I am confident that lattice-based schemes will become a highly attractive alternative in the coming months and years, offering better performance along several dimensions, such as shorter proofs (lattice-based proofs are smaller than hash-based ones).
A more serious problem right now: operational security
In the years to come, practical vulnerabilities will be far more real and serious than quantum computers that truly threaten cryptography. For SNARKs, the primary concern is bugs.
Vulnerabilities have been a major challenge in digital signature and encryption algorithms, and SNARKs are far more complex. In fact, a digital signature scheme can be viewed as an extremely simplified zkSNARK used to prove "I know the private key corresponding to the public key, and I authorized this message."
For post-quantum signatures, the truly pressing risks also include implementation attacks, such as side-channel attacks and fault injection attacks. These types of attacks have been extensively demonstrated and can extract private keys from real-world systems. Their threat is far more immediate than "quantum attacks in the distant future."
The community will continue to identify and fix SNARK vulnerabilities over the next few years and strengthen the implementation of post-quantum signatures to defend against side-channel and fault injection attacks. Migrating prematurely before post-quantum SNARK and signature aggregation solutions are fully developed risks locking the blockchain into a suboptimal solution—which may be forced to migrate again once a better solution emerges or the current solution exposes significant implementation vulnerabilities.
What should we do? Seven suggestions
Based on the realities discussed above, I offer the following advice to various stakeholders—from developers to policymakers. The general principle is: take the quantum threat seriously, but do not act under the premise that "quantum computers posing a real threat to cryptography will inevitably emerge before 2030." Current technological advancements do not support this premise. However, there is still much preparatory work we can and should undertake:
1. Deploy hybrid encryption immediately
Deploy it at least in scenarios where long-term confidentiality is critical and the performance cost is acceptable; many browsers, CDNs, and messaging applications (such as iMessage and Signal) already have. These hybrid solutions (post-quantum + classical cryptography) protect against HNDL attacks while also hedging against weaknesses in the post-quantum schemes themselves.
2. Use hash-based signatures immediately in scenarios where large signature sizes can be tolerated.
Hybrid hash signatures should be adopted immediately for low-frequency, size-insensitive scenarios such as software/firmware updates. (The hybrid approach is to prevent implementation vulnerabilities in new schemes, not because the security assumptions of hash signatures are questionable.) This is a conservative and prudent approach, providing society with a clear "lifeboat" in case quantum computers arrive unexpectedly. Without a deployed post-quantum signature software update mechanism, we will face a bootstrapping problem after the advent of CRQC: we will be unable to securely distribute the cryptographic updates needed to defend against quantum threats.
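A minimal sketch of what "hybrid" means for update verification, under assumed APIs: both signatures must verify before an image is accepted. The two verifier functions are hypothetical wrappers for whatever classical and hash-based signature libraries a build system actually uses:

```python
# Hybrid firmware-update verification sketch: accept an image only if BOTH a
# classical Ed25519 signature and a hash-based SLH-DSA signature verify, so a
# bug in the newer post-quantum code alone cannot admit a malicious update.
def verify_ed25519(pubkey: bytes, message: bytes, signature: bytes) -> bool:
    """Placeholder for a classical Ed25519 verifier (e.g., via an existing library)."""
    raise NotImplementedError

def verify_slh_dsa(pubkey: bytes, message: bytes, signature: bytes) -> bool:
    """Placeholder for an SLH-DSA (hash-based) verifier."""
    raise NotImplementedError

def accept_firmware(image: bytes, sig_classical: bytes, sig_pq: bytes,
                    pk_classical: bytes, pk_pq: bytes) -> bool:
    return (verify_ed25519(pk_classical, image, sig_classical)
            and verify_slh_dsa(pk_pq, image, sig_pq))
```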
3. Blockchain does not need to rush into deploying post-quantum signatures—but planning should begin now.
Blockchain developers should learn from the Web PKI's approach and proceed with post-quantum signature deployment prudently. This gives post-quantum signature schemes time to mature further in both performance and security understanding, and gives developers time to redesign systems to accommodate larger signatures and to develop better aggregation techniques. For Bitcoin and other chains, the community needs to develop migration paths and policies for quantum-fragile and abandoned funds. Passive migration is not an option, so planning is crucial. The fact that Bitcoin's challenges are mostly non-technical (slow governance and a large number of high-value, potentially abandoned, quantum-fragile addresses) further underscores the need for the Bitcoin community to start planning early.
At the same time, research on post-quantum SNARKs and aggregated signatures needs to continue maturing (which may take several more years). Again, premature migration could lead to being locked into suboptimal solutions or being forced to migrate again after practical vulnerabilities are discovered.
A note about the Ethereum account model: Ethereum supports two account types that are affected differently by a post-quantum migration: externally owned accounts (EOAs), controlled by a secp256k1 private key, and smart contract wallets with programmable authorization logic.
In non-emergency scenarios, when Ethereum adds support for post-quantum signatures, upgradable smart contract wallets can switch to post-quantum verification via contract upgrades—while EOAs may need to transfer assets to a new post-quantum secure address (although Ethereum may also provide a dedicated migration mechanism for EOAs). In quantum emergency scenarios, Ethereum researchers have proposed a hard fork solution: freezing vulnerable accounts and allowing users to recover their assets using a post-quantum secure SNARK by proving they possess the seed phrase. This mechanism applies to both EOAs and unupgraded smart wallets.
The practical impact on users is that a well-audited and upgradable smart wallet might offer a slightly smoother migration path—but the difference is minor, and it comes with a trade-off in trust regarding wallet providers and upgrade governance. More important than account types is the Ethereum community's continued push for post-quantum primitives and contingency plans.
A broader design lesson: Many blockchains tightly couple account identity to specific cryptographic primitives—for example, Bitcoin and Ethereum are both bound to secp256k1, while other chains are bound to EdDSA. The difficulties of post-quantum migration highlight the value of decoupling account identity from specific signature schemes. Ethereum's evolution towards smart accounts, and the trend towards account abstraction on other chains, exemplify this direction: allowing accounts to upgrade their authentication logic while preserving on-chain history and state. This doesn't make post-quantum migration simpler, but it significantly increases flexibility compared to pinning accounts to a single signature scheme. (This also enables other functionalities such as proxy transactions, social recovery, multi-signature, etc.)
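A conceptual sketch (plain Python, not any particular chain's API) of what decoupling identity from the signature scheme looks like: the account's address and state stay fixed while its authorization check is a swappable component:

```python
# Conceptual sketch of account abstraction: identity is stable, the
# authorization logic is replaceable, so a migration from secp256k1 to a
# post-quantum verifier does not change who the account is.
from typing import Callable

Verifier = Callable[[bytes, bytes], bool]    # (message, signature) -> valid?

class AbstractAccount:
    def __init__(self, address: str, verifier: Verifier):
        self.address = address               # identity stays fixed
        self.verifier = verifier             # auth logic is replaceable

    def authorize(self, tx: bytes, signature: bytes) -> bool:
        return self.verifier(tx, signature)

    def upgrade_verifier(self, new_verifier: Verifier) -> None:
        # e.g., swap an ECDSA check for an ML-DSA or hash-based check later,
        # subject to whatever upgrade governance the wallet enforces.
        self.verifier = new_verifier
```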
4. For privacy chains, prioritize migration as long as performance allows.
These chains encrypt or otherwise hide transaction details and therefore expose user privacy to HNDL attacks, although the severity varies by design; chains whose public ledger alone suffices for complete after-the-fact deanonymization are at the highest risk. They can adopt hybrid solutions (post-quantum + classical) to hedge against the post-quantum scheme itself proving insecure even against classical attacks, or modify their architecture to avoid placing decryptable secrets on-chain.
5. Prioritize practical security over quantum threat mitigation in the near term.
Especially for complex primitives like SNARKs and post-quantum signatures, vulnerabilities and exploits (side-channel attacks, fault injection) will be far more realistic and urgent than CRQC in the coming years. We should invest now in auditing, fuzzing, formal verification, and multi-layered defenses—don't let concerns about quantum threats overshadow the more pressing real threat of vulnerabilities!
6. Support the development of quantum computing
From a national security perspective, we must continue to invest in quantum computing research, development, and talent. If major adversaries achieve a CRQC before the United States does, it would pose a serious national security risk to the United States and the world.
7. Keep announcements about quantum computing in perspective.
As quantum hardware matures, the coming years will see a plethora of milestone announcements. Paradoxically, the very frequency of these announcements is evidence that we are still quite far from CRQC: each milestone is merely one of many bridges to the ultimate goal, and each crossing generates a wave of media attention and excitement. Press releases should be viewed as progress reports requiring critical evaluation, not as signals demanding immediate action.
Of course, there may be unexpected breakthroughs in the future that accelerate the timeline; or there may be serious bottlenecks that slow down the timeline.
I want to emphasize that I don't believe a CRQC within five years is "absolutely impossible," only "extremely unlikely." The recommendations above are robust to this uncertainty and help us avoid the more immediate, realistic risks: vulnerabilities, hasty deployments, and the various errors common to cryptographic migrations.