Bitcoin encryption isn’t at risk from quantum computers for one simple reason: it doesn’t actually exist

Contrary to popular belief, quantum computers will not “crack” Bitcoin encryption; instead, any realistic threat would focus on exploiting digital signatures tied to exposed public keys.

Quantum computers cannot decrypt Bitcoin because it stores no encrypted secrets on-chain.

Ownership is enforced by digital signatures and hash-based commitments, not ciphertext.

The quantum risk that matters is the risk of authorization forgery.

If a cryptographically relevant quantum computer can run Shor’s algorithm against Bitcoin’s elliptic-curve cryptography, it could derive a private key from an on-chain public key and then produce a valid signature for a competing spend.
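The relationship Shor's algorithm would invert can be sketched with a toy discrete-log example. Bitcoin's real keys satisfy Q = d·G on the secp256k1 elliptic curve; a small multiplicative group stands in here (an assumption for the demo) so the brute-force search actually terminates:

```python
# Toy illustration of the discrete-log relation a quantum attacker would
# invert. Real Bitcoin keys live on secp256k1; this demo-sized group is
# a stand-in, not the actual curve arithmetic.
P = 2**31 - 1   # a Mersenne prime modulus (demo-sized, not secp256k1)
G = 7           # primitive root mod P, standing in for the curve generator

def public_from_private(d: int) -> int:
    """'Public key' = G^d mod P (analogue of Q = d*G on secp256k1)."""
    return pow(G, d, P)

def classical_discrete_log(Q: int) -> int:
    """Exhaustive search: exponential in key size. Shor's algorithm does
    the same inversion in polynomial time on a fault-tolerant machine."""
    acc = 1
    for d in range(1, P):
        acc = acc * G % P
        if acc == Q:
            return d
    raise ValueError("not found")

d_secret = 123456                           # what the wallet keeps private
Q_public = public_from_private(d_secret)    # what the chain may reveal
assert classical_discrete_log(Q_public) == d_secret
```

Classically, the search cost grows exponentially with key size, which is why 256-bit keys are safe today; Shor collapses that gap, which is why an exposed public key is the attack surface.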

Much of the “quantum breaks Bitcoin encryption” framing is a terminology error. Adam Back, longtime Bitcoin developer and Hashcash inventor, summed it up in a post on X.

A separate post made the same distinction more explicitly, noting that a quantum attacker would not “decrypt” anything, but would instead use Shor’s algorithm to derive a private key from an exposed public key.

Why public-key exposure, not encryption, is Bitcoin’s real security bottleneck

Bitcoin’s signature systems, ECDSA and Schnorr, are used to prove control over a keypair.

In that model, coins are taken by producing a signature that the network will accept.

That is why public-key exposure is the pivot.

Whether an output is exposed depends on what appears on-chain.

Many address formats commit to a hash of a public key, so the raw public key is not revealed until the transaction is spent.

That narrows the window for an attacker to compute a private key and publish a conflicting transaction.

Other script types expose a public key earlier, and address reuse can turn a one-time reveal into a persistent target.
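The commit-then-reveal pattern behind hash-based address formats can be sketched in a few lines. Bitcoin actually uses HASH160 (RIPEMD-160 of SHA-256) for P2PKH/P2WPKH; plain SHA-256 stands in here since RIPEMD-160 is not available in every Python `hashlib` build:

```python
import hashlib

def address_commitment(pubkey: bytes) -> bytes:
    """What appears on-chain when coins are *received*: a hash, not the key.
    (Bitcoin uses HASH160; SHA-256 stands in for this sketch.)"""
    return hashlib.sha256(pubkey).digest()

def spend_reveals_key(pubkey: bytes, commitment: bytes) -> bool:
    """Spending publishes the raw pubkey; nodes check it matches the hash.
    Only from this moment is the key visible to a would-be Shor attacker."""
    return hashlib.sha256(pubkey).digest() == commitment

pubkey = bytes.fromhex("02" + "11" * 32)   # placeholder 33-byte compressed key
onchain = address_commitment(pubkey)       # exposure window: closed
assert spend_reveals_key(pubkey, onchain)  # exposure window: open after spend
```

Reusing the address after that spend means the already-revealed key keeps guarding new funds, which is exactly the persistent target the article describes.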

Project Eleven’s open-source “Bitcoin Risq List” query defines exposure at the script and reuse level.

It maps where a public key is already available to a would-be Shor attacker.

Why quantum risk is measurable today, even if it isn’t imminent

Taproot changes the exposure pattern in a way that matters only if large fault-tolerant machines arrive.

Taproot outputs (P2TR) include a 32-byte tweaked public key in the output program, rather than a pubkey hash, as described in BIP 341.

Project Eleven’s query documentation includes P2TR alongside pay-to-pubkey and some multisig forms as categories where public keys are visible in outputs.

That does not create a new vulnerability today.

However, it changes what becomes exposed by default if key recovery becomes feasible.

Because exposure is measurable, the vulnerable pool can be tracked today without pinning down a quantum timeline.

Project Eleven says it runs an automated weekly scan and publishes a “Bitcoin Risq List” concept intended to cover every quantum-vulnerable address and its balance, detailed in its methodology post.

Its public tracker shows a headline figure of about 6.7 million BTC that meet its exposure criteria.

| Quantity | Order of magnitude | Source |
| --- | --- | --- |
| BTC in “quantum-vulnerable” addresses (public key exposed) | ~6.7M BTC | Project Eleven |
| Logical qubits for 256-bit prime-field ECC discrete log (upper bound) | ~2,330 logical qubits | Roetteler et al. |
| Physical-qubit scale example tied to a 10-minute key-recovery setup | ~6.9M physical qubits | Litinski |
| Physical-qubit scale reference tied to a 1-day key-recovery setup | ~13M physical qubits | Schneier on Security |

On the computational side, the key distinction is between logical qubits and physical qubits.

In the paper “Quantum resource estimates for computing elliptic curve discrete logarithms,” Roetteler and co-authors give an upper bound of at most 9n + 2⌈log2(n)⌉ + 10 logical qubits to compute an elliptic-curve discrete logarithm over an n-bit prime field.

For n = 256, that works out to about 2,330 logical qubits.
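The arithmetic behind that figure is straightforward to reproduce from the bound as stated:

```python
import math

# Evaluates the Roetteler et al. upper bound
# 9n + 2*ceil(log2(n)) + 10 logical qubits
# for an n-bit prime-field discrete logarithm.
def logical_qubits(n: int) -> int:
    return 9 * n + 2 * math.ceil(math.log2(n)) + 10

print(logical_qubits(256))  # 2304 + 16 + 10 = 2330
```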

Converting that into an error-corrected machine that can run a deep circuit at low failure rates is where physical-qubit overhead and timing dominate.

Architecture choices then set a wide range of runtimes.

Litinski’s 2023 estimate puts a 256-bit elliptic-curve private-key computation at about 50 million Toffoli gates.

Under its assumptions, a modular approach could compute one key in about 10 minutes using about 6.9 million physical qubits.

In a Schneier on Security summary of related work, estimates cluster around 13 million physical qubits to break within one day.

The same line of estimates also cites about 317 million physical qubits to target a one-hour window, depending on timing and error-rate assumptions.

For Bitcoin operations, the nearer levers are behavioral and protocol-level.

Address reuse raises exposure, and wallet design can reduce it.

Project Eleven’s wallet analysis notes that once a public key is on-chain, future receipts back to that same address remain exposed.

If key recovery ever fit inside a block interval, an attacker would be racing spends from exposed outputs, not rewriting consensus history.

Hashing is often bundled into the narrative, but the quantum lever there is Grover’s algorithm.

Grover provides a square-root speedup for brute-force search rather than the discrete-log break Shor provides.

NIST research on the practical cost of Grover-style attacks stresses that overhead and error correction shape system-level cost.

In the idealized model, for SHA-256 preimages, the target remains on the order of 2^128 work after Grover.

That is not comparable to an ECC discrete-log break.
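The asymmetry between the two quantum attacks can be made concrete. Grover's speedup is only quadratic: searching a k-bit space takes on the order of 2^(k/2) quantum evaluations instead of 2^k classical ones, so in the idealized model the residual work stays astronomically large:

```python
# Grover gives a square-root speedup on brute-force search, so the
# effective security of a k-bit search space halves in bit terms.
# Shor, by contrast, reduces the ECC discrete log from exponential
# to polynomial time, which is a qualitatively different break.
def grover_work_bits(search_space_bits: int) -> int:
    """Effective security (in bits) remaining after a Grover-style search."""
    return search_space_bits // 2

assert grover_work_bits(256) == 128  # SHA-256 preimage: still infeasible
assert grover_work_bits(128) == 64   # e.g. a 128-bit symmetric key
```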

That leaves signature migration, where the constraints are bandwidth, storage, fees, and coordination.

Post-quantum signatures are often kilobytes rather than the tens of bytes users are accustomed to.

That changes transaction weight economics and wallet UX.
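The size gap can be made concrete with published signature sizes. The figures below are indicative values from the NIST post-quantum submissions and standards (assumptions for this comparison, not exact on-chain costs):

```python
# Approximate signature sizes in bytes, to illustrate why post-quantum
# schemes change transaction weight economics. Post-quantum values are
# indicative figures from the NIST PQC submissions/standards.
SIG_BYTES = {
    "Schnorr (BIP 340)":         64,
    "ECDSA (DER, typical)":      72,
    "Falcon-512 (FN-DSA)":      666,
    "ML-DSA-44 (Dilithium2)":  2420,
    "SLH-DSA-128s (SPHINCS+)": 7856,
}

base = SIG_BYTES["Schnorr (BIP 340)"]
for name, size in SIG_BYTES.items():
    print(f"{name:26s} {size:5d} B  ({size / base:5.1f}x Schnorr)")
```

Even the most compact lattice option is roughly an order of magnitude larger than a Schnorr signature, which is why fees, block weight, and wallet UX dominate the migration discussion.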

Why quantum risk is a migration challenge, not an immediate threat

Outside Bitcoin, NIST has standardized post-quantum primitives such as ML-KEM (FIPS 203) as part of broader migration planning.

Inside Bitcoin, BIP 360 proposes a “Pay to Quantum Resistant Hash” output type.

Meanwhile, qbip.org argues for a legacy-signature sunset to force migration incentives and reduce the long tail of exposed keys.

Recent corporate roadmaps add context for why the topic is framed as infrastructure rather than an emergency.

In a recent Reuters report, IBM discussed progress on error-correction components and reiterated a path toward a fault-tolerant system around 2029.

Reuters also covered IBM’s claim that a key quantum error-correction algorithm can run on conventional AMD chips, in a separate report.

In that framing, “quantum breaks Bitcoin encryption” fails on terminology and on mechanics.

The measurable items are how much of the UTXO set has exposed public keys, how wallet behavior changes in response to that exposure, and how quickly the network can adopt quantum-resistant spending paths while keeping validation and fee-market constraints intact.

The post Bitcoin encryption isn’t at risk from quantum computers for one simple reason: it doesn’t actually exist appeared first on CryptoSlate.

