
Post-Quantum Cryptography: What Web Developers Need to Know

March 24, 2026

In August 2024, NIST published three finalized post-quantum cryptography (PQC) standards; by then, Chrome and Firefox were already shipping hybrid post-quantum TLS by default. In early 2025, the US government set a federal mandate: all government systems must support post-quantum algorithms by 2027.

If you build web applications, this affects you — not in some distant future, but now. The certificates your servers use, the JWTs your APIs issue, and the encryption libraries your code depends on are all in the process of changing. Understanding what is happening and what you need to do is no longer optional security trivia. It is practical engineering work.

Why Quantum Computing Threatens Current Cryptography

Modern web security rests on three types of cryptographic algorithms:

Purpose              | Algorithm           | Used In
---------------------|---------------------|--------------------------------------
Key exchange         | ECDH, DH            | TLS handshake
Digital signatures   | RSA, ECDSA, Ed25519 | TLS certificates, JWTs, code signing
Symmetric encryption | AES-128/256         | Data encryption after handshake

Quantum computers threaten the first two categories — key exchange and digital signatures — through Shor's algorithm. Here is the simplified version:

  • RSA relies on the difficulty of factoring large numbers. A quantum computer running Shor's algorithm can factor them in polynomial time.
  • ECDH/ECDSA rely on the difficulty of the elliptic curve discrete logarithm problem. Shor's algorithm solves this too.
  • AES and other symmetric ciphers are only weakened, not broken. Grover's algorithm gives a quadratic speedup for key search, meaning AES-128 drops to 64-bit effective security. AES-256 drops to 128-bit — still secure.
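Grover's quadratic speedup is simple to state in code: searching an n-bit keyspace takes about 2^(n/2) steps, so effective post-quantum strength against brute-force key search is half the key length.

```javascript
// Grover's algorithm searches an n-bit keyspace in ~2^(n/2) quantum steps,
// so the post-quantum effective security of a symmetric key is n/2 bits.
const groverEffectiveBits = (keyBits) => keyBits / 2;

console.log(groverEffectiveBits(128)); // 64  -> AES-128 becomes weak
console.log(groverEffectiveBits(256)); // 128 -> AES-256 stays comfortable
```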

The practical consequence:

Quantum Threat Model

Algorithm   | Classical Security | Post-Quantum
------------|--------------------|---------------
RSA-2048    | 112 bits           | BROKEN
ECDH P-256  | 128 bits           | BROKEN
ECDSA P-256 | 128 bits           | BROKEN
Ed25519     | 128 bits           | BROKEN
AES-128     | 128 bits           | 64 bits (weak)
AES-256     | 256 bits           | 128 bits (ok)
SHA-256     | 256 bits           | 128 bits (ok)

The "Harvest Now, Decrypt Later" Problem

You might think: "Quantum computers cannot break RSA today, so why worry now?" The answer is harvest now, decrypt later (HNDL).

Nation-state adversaries are recording encrypted traffic today. They cannot decrypt it now, but when large-scale quantum computers arrive, they will retroactively decrypt everything they have stored. If your application handles medical records, financial data, government communications, or any data with a long sensitivity window, the traffic you send today could be readable in 10-15 years.

This is not hypothetical. Intelligence agencies have confirmed HNDL collection programs exist. This is why the urgency to deploy post-quantum key exchange (which protects data in transit) is higher than the urgency for post-quantum signatures (which protect authentication, a real-time operation).

The Three NIST Standards

NIST finalized three post-quantum algorithms in FIPS 203, FIPS 204, and FIPS 205. Let me break each one down in terms a web developer can understand.

ML-KEM (FIPS 203) — Key Encapsulation

Replaces: ECDH (key exchange in TLS)
Based on: Module-Lattice Key Encapsulation Mechanism (formerly CRYSTALS-Kyber)
Used for: Establishing shared secrets for encrypted communication

ML-KEM is a key encapsulation mechanism (KEM), not a key exchange. The difference matters:

Traditional Key Exchange (ECDH):
  Alice  sends public key →  Bob
  Bob    sends public key →  Alice
  Both derive the same shared secret independently

Key Encapsulation (ML-KEM):
  Alice  sends public key →  Bob
  Bob    encapsulates a random secret under Alice's key, sends ciphertext →  Alice
  Alice  decapsulates the ciphertext to recover the shared secret

The end result is the same — both parties share a secret key for symmetric encryption. The mechanism is different, but from an API perspective, it drops into the same place in a TLS handshake.

Parameter sets:

Variant     | Security Level         | Public Key Size | Ciphertext Size
------------|------------------------|-----------------|----------------
ML-KEM-512  | NIST Level 1 (128-bit) | 800 bytes       | 768 bytes
ML-KEM-768  | NIST Level 3 (192-bit) | 1,184 bytes     | 1,088 bytes
ML-KEM-1024 | NIST Level 5 (256-bit) | 1,568 bytes     | 1,568 bytes

Compare this to ECDH P-256, where the public key is 32 bytes. ML-KEM keys are 25-50x larger. This has real implications for TLS handshake size and performance, which we will cover later.

ML-DSA (FIPS 204) — Digital Signatures

Replaces: RSA, ECDSA, Ed25519 (signatures)
Based on: Module-Lattice Digital Signature Algorithm (formerly CRYSTALS-Dilithium)
Used for: TLS certificates, JWTs, code signing, document signing

ML-DSA is the general-purpose signature algorithm. If you sign anything — tokens, certificates, API requests — this is the replacement.

Parameter sets:

Variant   | Security Level | Public Key  | Signature Size
----------|----------------|-------------|---------------
ML-DSA-44 | NIST Level 2   | 1,312 bytes | 2,420 bytes
ML-DSA-65 | NIST Level 3   | 1,952 bytes | 3,309 bytes
ML-DSA-87 | NIST Level 5   | 2,592 bytes | 4,627 bytes

Compare to Ed25519: 32-byte public key, 64-byte signature. ML-DSA signatures are roughly 38-72x larger. This is the biggest practical impact for web developers — JWTs get much bigger, certificate chains get much bigger, and anything that embeds signatures needs to handle the size increase.

SLH-DSA (FIPS 205) — Stateless Hash-Based Signatures

Replaces: Same as ML-DSA (backup option)
Based on: Stateless Hash-Based Digital Signature Algorithm (formerly SPHINCS+)
Used for: Situations where you want conservative security assumptions

SLH-DSA is the "belt and suspenders" option. Its security relies only on hash function properties — no lattice math, no new assumptions. If someone discovers an attack on lattice-based cryptography that breaks ML-KEM and ML-DSA, SLH-DSA would still be secure.

The tradeoff is performance and size:

Variant      | Public Key | Signature Size | Sign Speed | Verify Speed
-------------|------------|----------------|------------|-------------
SLH-DSA-128s | 32 bytes   | 7,856 bytes    | Slow       | Fast
SLH-DSA-128f | 32 bytes   | 17,088 bytes   | Fast       | Fast
SLH-DSA-256s | 64 bytes   | 29,792 bytes   | Slow       | Fast

Signatures are enormous — up to 30 KB. This is impractical for JWTs or high-throughput signing. SLH-DSA is primarily interesting for root certificates, firmware signing, and other low-volume, high-security applications.

What Changes for TLS

The TLS handshake is where most web developers will encounter post-quantum cryptography first, because browsers and servers are already deploying it.

Hybrid Key Exchange

Chrome 124 (April 2024) and Firefox 128 (July 2024) enabled hybrid key exchange by default. "Hybrid" means the TLS handshake uses both a classical algorithm (X25519) and a post-quantum algorithm (ML-KEM-768) simultaneously:

Client Hello:
  Supported groups: X25519_ML-KEM-768, X25519, P-256
  Key shares: X25519_ML-KEM-768 (combined key)

Server Hello:
  Selected group: X25519_ML-KEM-768
  Key share: (combined key)

Shared secret = HKDF(X25519_secret || ML-KEM_secret)

The shared secret is derived from both algorithms. Even if ML-KEM is broken, X25519 still protects the session. Even if X25519 is broken by a quantum computer, ML-KEM protects it. You need to break both to compromise the session.

Impact on Handshake Size

The hybrid handshake is larger:

Component             | Classical (X25519) | Hybrid (X25519 + ML-KEM-768)
----------------------|--------------------|-----------------------------
Client key share      | 32 bytes           | 1,216 bytes
Server key share      | 32 bytes           | 1,120 bytes
Total key-share data  | 64 bytes           | 2,336 bytes
Handshake round trips | 1 RTT              | 1 RTT

The extra ~2.3 KB adds roughly 1-3ms of latency on typical connections. This is measurable but not significant for most applications. The round-trip count does not change — it is still a single RTT handshake.

What You Need to Do for TLS

If you use a managed platform (Vercel, AWS, Cloudflare): Nothing. These platforms have already enabled hybrid key exchange on their edge servers. Your users are already getting post-quantum TLS.

If you manage your own servers: Update OpenSSL to 3.5+ and configure your TLS stack:

# nginx.conf: enable hybrid PQ key exchange (requires OpenSSL 3.5+)
ssl_ecdh_curve X25519_ML-KEM-768:X25519:prime256v1;
ssl_protocols TLSv1.3;

# Verify from a client that PQ key exchange is negotiated:
openssl s_client -connect yourdomain.com:443 -groups X25519_ML-KEM-768

If you run an internal PKI: Your internal certificates do not need to change immediately. Post-quantum key exchange (ML-KEM) protects data in transit regardless of the certificate signature algorithm. Certificate migration to ML-DSA is lower urgency — see the timeline section below.

What Changes for JWTs

This is where post-quantum cryptography gets painful for web developers. JWTs are everywhere — API authentication, session tokens, webhook signatures — and they embed signatures directly in the token.

The Size Problem

A typical JWT signed with ES256 (ECDSA P-256):

eyJhbGciOiJFUzI1NiIsInR5cCI6IkpXVCJ9.
eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.
SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c

Total size: ~230 bytes

The same JWT signed with ML-DSA-65:

Total size: ~4,800 bytes (the signature alone is ~3,300 bytes)

That is a 21x increase in token size. If you pass JWTs in HTTP headers (which most APIs do), this is a real problem:

Authorization: Bearer eyJ...4800_bytes_of_token...

Some proxies, load balancers, and API gateways have header size limits (commonly 8 KB for all headers combined). A single ML-DSA JWT could consume half that budget.
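A back-of-the-envelope estimate makes the budget pressure concrete. The header and payload byte counts below are assumptions; the signature sizes come from the tables above (64 bytes for ES256, 3,309 bytes for ML-DSA-65).

```javascript
// Base64url inflates binary by a factor of 4/3 (rounded up per 3-byte group).
const b64urlLen = (bytes) => Math.ceil(bytes / 3) * 4;

// A JWT is header.payload.signature, each part base64url-encoded.
function estimateJwtSize({ headerBytes, payloadBytes, signatureBytes }) {
  return (
    b64urlLen(headerBytes) +
    b64urlLen(payloadBytes) +
    b64urlLen(signatureBytes) +
    2 // the two dot separators
  );
}

const base = { headerBytes: 40, payloadBytes: 120 }; // assumed JSON sizes
const es256 = estimateJwtSize({ ...base, signatureBytes: 64 });
const mlDsa65 = estimateJwtSize({ ...base, signatureBytes: 3309 });

console.log(es256);   // 306
console.log(mlDsa65); // 4630 -- well over half of a typical 8 KB header budget
```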

Migration Strategy for JWTs

Option 1: Switch to symmetric JWTs where possible

If both the issuer and verifier are systems you control, use HS256 (HMAC-SHA256) instead of asymmetric signatures. HMAC is not meaningfully affected by quantum computers: Grover's algorithm halves the effective key strength, but HS256 with a 256-bit key retains 128-bit post-quantum security, which is still fine.

// Before: Asymmetric (vulnerable to quantum)
const token = jwt.sign(payload, ecdsaPrivateKey, { algorithm: 'ES256' })

// After: Symmetric (quantum-safe)
const token = jwt.sign(payload, sharedSecret, { algorithm: 'HS256' })

This only works when you do not need third-party verification. If external services need to verify your tokens with a public key, you need asymmetric signatures.

Option 2: Use ML-DSA with shorter-lived tokens

Accept the larger token size but reduce risk by shortening token lifetimes:

import { SignJWT } from 'jose' // assumes a jose release with ML-DSA support

const token = await new SignJWT(payload)
  .setProtectedHeader({ alg: 'ML-DSA-65' })
  .setIssuedAt()
  .setExpirationTime('5m') // Short-lived
  .sign(mlDsaPrivateKey)

Short-lived tokens minimize the window for harvest-now-decrypt-later attacks and reduce the impact of token theft.

Option 3: Move signatures out of the token

Instead of embedding a large signature in every JWT, use a detached signature or a different token format:

// Issue a simple opaque token
const token = crypto.randomUUID()

// Store the claims server-side (Redis, database)
await redis.set(`session:${token}`, JSON.stringify(claims), 'EX', 300)

// Verify by lookup, not signature verification
const claims = JSON.parse(await redis.get(`session:${token}`))

This avoids the signature size problem entirely but requires server-side state. For many applications, this is the pragmatic choice.

JWT Algorithm Timeline

Algorithm       | Status (2026)      | Recommendation
----------------|--------------------|------------------------------------
RS256 (RSA)     | Quantum-vulnerable | Migrate away
ES256 (ECDSA)   | Quantum-vulnerable | Migrate away for long-lived tokens
EdDSA (Ed25519) | Quantum-vulnerable | Same as ES256
HS256 (HMAC)    | Quantum-safe       | Use where possible
ML-DSA-44       | NIST standardized  | Use for new systems
ML-DSA-65       | NIST standardized  | Use for higher security

What Changes for Certificates

X.509 certificates — the foundation of HTTPS — will eventually migrate to post-quantum signature algorithms. But the timeline is longer than for key exchange.

Why Certificates Are Less Urgent

Certificates are verified in real time. An attacker cannot "harvest" a certificate verification and decrypt it later — they would need a quantum computer at the time of the attack to forge a certificate. Since large-scale quantum computers do not exist yet, classical certificates remain secure for now.

The exception is long-lived certificates like root CAs, which have 20+ year lifetimes. If a quantum computer arrives within the validity period of a root CA certificate, that root could be forged. This is why certificate authorities are already planning migration.

The Certificate Chain Size Problem

A typical TLS certificate chain today:

Root CA cert:       ~1 KB (RSA-2048 signature)
Intermediate cert:  ~1 KB
Server cert:        ~1 KB
Total:              ~3 KB

A post-quantum certificate chain with ML-DSA-65:

Root CA cert:       ~5.5 KB (ML-DSA-65 signature + public key)
Intermediate cert:  ~5.5 KB
Server cert:        ~5.5 KB
Total:              ~16.5 KB

That is a 5.5x increase in the certificate data sent during every TLS handshake. Combined with the larger key exchange, a full post-quantum TLS handshake sends roughly 16 KB more data than a classical one (about 13.5 KB of extra certificate data plus about 2.3 KB of extra key shares). On slow mobile connections, this adds measurable latency.

Hybrid Certificates

The transition will likely use hybrid certificates that contain both a classical and a post-quantum signature:

Certificate:
  Subject: example.com
  Public Key: ECDSA P-256 (classical)
  Alt Public Key: ML-DSA-65 (post-quantum)
  Signature: ECDSA (from issuer, classical)
  Alt Signature: ML-DSA-65 (from issuer, post-quantum)

Clients that support PQ verify the ML-DSA signature. Legacy clients fall back to ECDSA. This is clunky but necessary for a smooth transition.

Practical Algorithm-Agile Design Patterns

The most important thing you can do right now is make your code algorithm-agile — able to switch cryptographic algorithms without major refactoring.

Pattern 1: Abstract the Algorithm Choice

// crypto-config.ts — Centralize algorithm selection
export const SIGNING_CONFIG = {
  algorithm: process.env.SIGNING_ALGORITHM || 'ES256',
  keyId: process.env.SIGNING_KEY_ID || 'key-2024-ecdsa',
} as const

export const ENCRYPTION_CONFIG = {
  algorithm: 'AES-256-GCM', // Already quantum-safe
} as const
// auth.ts — Uses config, not hardcoded algorithms
import { SIGNING_CONFIG } from './crypto-config'

export function signToken(payload: Record<string, unknown>) {
  const key = getKey(SIGNING_CONFIG.keyId)
  return jwt.sign(payload, key, {
    algorithm: SIGNING_CONFIG.algorithm,
    keyid: SIGNING_CONFIG.keyId,
  })
}

export function verifyToken(token: string) {
  const header = jwt.decode(token, { complete: true })?.header
  const key = getKey(header?.kid)
  return jwt.verify(token, key, {
    algorithms: getAllowedAlgorithms(), // Accept multiple algorithms during transition
  })
}

The key insight: verification should accept multiple algorithms during the transition period. You might issue new tokens with ML-DSA while still accepting ES256 tokens that have not expired yet.
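`getAllowedAlgorithms` (a hypothetical helper, like `getKey` above) can be as simple as a config-driven accept list that shrinks as old tokens expire:

```javascript
// Hypothetical helper: the accept list lives in config, not in code, so
// tightening it mid-migration is a config change, not a refactor.
function getAllowedAlgorithms() {
  return (process.env.ALLOWED_ALGORITHMS || 'ES256')
    .split(',')
    .map((a) => a.trim())
    .filter(Boolean);
}

// During the transition: ALLOWED_ALGORITHMS="ML-DSA-65,ES256"
// After the transition:  ALLOWED_ALGORITHMS="ML-DSA-65"
console.log(getAllowedAlgorithms());
```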

Pattern 2: Key Rotation Infrastructure

// key-store.ts
interface KeyEntry {
  id: string
  algorithm: string
  publicKey: string
  privateKey?: string
  createdAt: Date
  expiresAt: Date
  status: 'active' | 'rotating' | 'retired'
}

class KeyStore {
  private keys: Map<string, KeyEntry> = new Map()

  // Get the current signing key
  getSigningKey(): KeyEntry {
    return [...this.keys.values()]
      .filter((k) => k.status === 'active' && k.privateKey)
      .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime())[0]
  }

  // Get all keys valid for verification (includes rotating keys)
  getVerificationKeys(): KeyEntry[] {
    return [...this.keys.values()]
      .filter((k) => k.status !== 'retired' && k.expiresAt > new Date())
  }

  // JWKS endpoint serves all non-retired public keys
  toJWKS() {
    return {
      keys: this.getVerificationKeys().map((k) => ({
        kid: k.id,
        alg: k.algorithm,
        ...JSON.parse(k.publicKey),
      })),
    }
  }
}

If you already have key rotation infrastructure, migrating to post-quantum is just adding a new key with a new algorithm. If you do not have this, building it now pays dividends regardless of quantum computing.
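Rotation itself is then just a state change. A minimal sketch against an inline store with the same shape as the KeyStore above (key IDs and dates are made up for illustration):

```javascript
// Rotation event: add the ML-DSA key as 'active', demote the ECDSA key to
// 'rotating' so it still verifies old tokens but no longer signs new ones.
const keys = new Map();
const addKey = (k) => keys.set(k.id, k);

function signingKey() {
  return [...keys.values()]
    .filter((k) => k.status === 'active' && k.privateKey)
    .sort((a, b) => b.createdAt - a.createdAt)[0];
}

function verificationKeys() {
  return [...keys.values()].filter((k) => k.status !== 'retired');
}

// Existing classical key.
addKey({
  id: 'key-2024-ecdsa',
  algorithm: 'ES256',
  privateKey: 'pem-placeholder',
  createdAt: new Date('2024-01-01'),
  status: 'active',
});

// Rotation: introduce the post-quantum key, demote the classical one.
addKey({
  id: 'key-2026-mldsa',
  algorithm: 'ML-DSA-65',
  privateKey: 'pem-placeholder',
  createdAt: new Date('2026-03-01'),
  status: 'active',
});
keys.get('key-2024-ecdsa').status = 'rotating';

console.log(signingKey().algorithm);    // 'ML-DSA-65'
console.log(verificationKeys().length); // 2 -- old tokens still verify
```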

Pattern 3: JWKS-Based Verification

If your APIs verify JWTs from external issuers, use JWKS (JSON Web Key Set) discovery:

import { createRemoteJWKSet, jwtVerify } from 'jose'

// This automatically handles algorithm changes by the issuer
const JWKS = createRemoteJWKSet(
  new URL('https://auth.example.com/.well-known/jwks.json')
)

async function verifyExternalToken(token: string) {
  const { payload } = await jwtVerify(token, JWKS, {
    issuer: 'https://auth.example.com',
    audience: 'my-api',
  })
  return payload
}

When the issuer rotates to ML-DSA keys and publishes them in their JWKS endpoint, your verification code automatically picks up the new keys without any code changes. This is algorithm agility in practice.

The Hybrid Deployment Strategy

The industry consensus for migrating to post-quantum cryptography is a hybrid approach: use both classical and post-quantum algorithms simultaneously, then phase out the classical algorithms once confidence in the post-quantum standards matures.

Phase 1: Now (2026)

TLS Key Exchange:  X25519 + ML-KEM-768 (hybrid)  -- already deployed
TLS Certificates:  ECDSA/RSA (classical)         -- no change yet
JWTs:              ES256/RS256 (classical)       -- start planning
Symmetric crypto:  AES-256                       -- already quantum-safe

Action items:

  • Verify your TLS stack supports hybrid key exchange
  • Inventory all places your code does asymmetric cryptography
  • Build algorithm-agile abstractions around signing and verification
  • Upgrade to AES-256 if you are using AES-128

Phase 2: 2027

TLS Key Exchange:  X25519 + ML-KEM-768 (hybrid)
TLS Certificates:  Hybrid (ECDSA + ML-DSA)  -- early adopters
JWTs:              ML-DSA-44 for new tokens, accept ES256 for old
Symmetric crypto:  AES-256

Action items:

  • Deploy ML-DSA signing for new JWTs
  • Test post-quantum certificate chains with your infrastructure
  • Update libraries (jose, openssl, node:crypto) to versions with PQ support

Phase 3: 2029-2030

TLS Key Exchange:  ML-KEM-768 (post-quantum only)
TLS Certificates:  ML-DSA-65 (post-quantum only)
JWTs:              ML-DSA-44/65 (post-quantum only)
Symmetric crypto:  AES-256

Action items:

  • Remove classical algorithm support
  • Retire ECDSA/RSA keys
  • Update compliance documentation

Library Support Cheat Sheet

Where does post-quantum support stand in the libraries you actually use?

Library                   | ML-KEM | ML-DSA  | SLH-DSA | Status
--------------------------|--------|---------|---------|--------
OpenSSL 3.5+              | Yes    | Yes     | Yes     | Stable
BoringSSL (Chrome)        | Yes    | Yes     | No      | Stable
Node.js 23+ (node:crypto) | Yes    | Yes     | Partial | Stable
jose (JS JWT library)     | Yes    | Yes     | No      | Stable
Go crypto (1.24+)         | Yes    | Yes     | Yes     | Stable
Rust pqcrypto             | Yes    | Yes     | Yes     | Stable
Python pqcrypto           | Yes    | Yes     | Yes     | Stable
AWS KMS                   | Yes    | Yes     | No      | GA
GCP Cloud KMS             | Yes    | Yes     | No      | GA
Azure Key Vault           | Yes    | Partial | No      | Preview

If your language or platform is not listed, check for OQS (Open Quantum Safe) bindings — they provide PQ algorithm implementations for most major languages.

What You Should Do This Week

If you take away one thing from this post, it is this: the migration has already started, and you need to be aware of it even if you do not act on every piece immediately.

Here is a prioritized checklist:

  1. Verify your TLS — Run openssl s_client -connect yourdomain.com:443 and check if X25519_ML-KEM-768 appears in the key exchange. If you are on a managed platform, it probably does already.

  2. Inventory your cryptography — Search your codebase for RS256, ES256, RSA, ECDSA, Ed25519. Each occurrence is a future migration point.

  3. Check your JWT sizes — If you are passing JWTs in HTTP headers, measure the current size and calculate what ML-DSA would add. Identify if any proxies or gateways have header size limits.

  4. Upgrade symmetric crypto — If any code uses AES-128, upgrade to AES-256. This is the easiest quantum hardening step.

  5. Build algorithm agility — Centralize your algorithm choices in config. Use JWKS for verification. Support multiple algorithms during transition periods.

  6. Update your dependencies — Ensure your crypto libraries are recent enough to support PQ algorithms when you need them.

The quantum threat is not immediate — we likely have years before cryptographically relevant quantum computers exist. But the migration is a multi-year process, and "harvest now, decrypt later" means the key exchange upgrade is genuinely urgent for sensitive data. Start with the easy wins, build the infrastructure for agility, and you will be in good shape when the full transition happens.
