In March 2024, a Microsoft engineer noticed SSH logins were 500 milliseconds slower than usual. That curiosity led to the discovery of one of the most sophisticated supply chain attacks in open-source history — a backdoor hidden in xz Utils, the compression library used by virtually every Linux distribution.
This is the story of CVE-2024-3094, and why it should concern every developer who depends on open-source software.
What Is XZ Utils?
XZ Utils is a data compression library (liblzma) and command-line tool used across Linux systems. It's the kind of invisible infrastructure you never think about — until someone weaponizes it.
You use xz every day without knowing it:
tar -xJf archive.tar.xz # xz decompression
dpkg-deb # Debian packages use xz
rpm # RPM packages use xz
kernel images # Often xz-compressed
systemd                # Links against liblzma

Because systemd links against liblzma, and sshd links against libsystemd on many distributions, a compromised liblzma means a compromised SSH daemon, the front door to every Linux server.
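You can check this dependency chain on your own machine. A quick sketch, assuming the usual `/usr/sbin/sshd` path (distro-dependent):

```shell
# Does sshd load liblzma, directly or transitively via libsystemd?
ldd /usr/sbin/sshd | grep -Ei 'lzma|systemd'
# On an affected-style distro the output includes libsystemd.so.0 and
# liblzma.so.5; no output means sshd never pulls liblzma into its process.
```

If the grep prints nothing, sshd on that system was outside the backdoor's reach regardless of the installed xz version.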
The Attack Timeline
Phase 1: Building Trust (Nov 2021 – 2022)
A user named Jia Tan (GitHub: JiaT75) began contributing to the xz Utils project. The contributions were legitimate — bug fixes, test improvements, documentation. Normal open-source participation.
2021-11 First commits from JiaT75
2022-01 Regular contributions, reviews, bug fixes
2022-??  Gradually becomes a trusted contributor

Nothing suspicious. This is how every open-source contributor starts.
Phase 2: Pressuring the Maintainer (2022 – 2023)
XZ Utils was maintained by a single developer, Lasse Collin, who was dealing with burnout and mental health challenges. Around this time, previously unknown accounts began pressuring Collin on mailing lists:
"Is XZ for Java still maintained? The patches sit on the
mailing list... I mass mass mass know nothing about the
mass mass mass mass mass mass mass mass mass mass mass
reasons for no response or mass mass mass mass progress"
"Progress will not mass mass mass happen until there is a
new maintainer... XZ for C has mass mass mass mass a new
maintainer mass mass, mass mass maybe mass mass mass they
mass mass mass mass could mass mass also mass mass mass
take mass mass mass mass mass mass mass on XZ for Java."These accounts — likely sock puppets controlled by the attacker or accomplices — pushed Collin to hand over co-maintainer access. Under pressure, he eventually granted Jia Tan commit privileges and release-signing authority.
Phase 3: Planting the Backdoor (2023 – Feb 2024)
With maintainer access, Jia Tan made a series of commits that introduced the backdoor across multiple steps, making each change look innocuous:
Step 1: Add "test files" (binary blobs)
→ tests/files/bad-3-corrupt_lzma2.xz
→ tests/files/good-large_compressed.lzma
(These contained the obfuscated backdoor payload)
Step 2: Modify build scripts
→ m4/build-to-host.m4
(Injected a script that extracts and activates
the payload during compilation)
Step 3: Release xz 5.6.0 (Feb 24, 2024)
Step 4: Release xz 5.6.1 (Mar 9, 2024)
(Bug "fix" that was actually refining the backdoor)The malicious code was hidden in binary test fixtures — files that look like corrupted archives meant for testing the decompressor. Clever, because no one reviews binary test data.
Phase 4: Discovery (Mar 28, 2024)
Andres Freund, a PostgreSQL and Linux developer at Microsoft, was benchmarking some systems when he noticed SSH logins consumed abnormal CPU time and were about 500ms slower.
# What Andres noticed:
$ time ssh user@server
# Expected: ~0.1s for auth
# Actual: ~0.6s for auth
# He traced it with perf and found liblzma
# doing things it had no business doing
$ perf record -g -p "$(pgrep -o sshd)"
# → liblzma was executing unexpected code paths

He dug in, traced the issue to liblzma, analyzed the build scripts, found the obfuscated payload, and disclosed it on March 29, 2024.
How the Backdoor Worked
The technical execution was remarkably sophisticated:
Normal flow:
sshd → libsystemd → liblzma (compression only)
Backdoored flow:
sshd → libsystemd → liblzma (hijacked!)
│
├─ Hooks RSA_public_decrypt()
├─ Checks incoming SSH keys for
│ a hidden command payload
├─ If magic key matches:
│ → Extract command from key
│ → Execute as root (pre-auth!)
└─ If not: normal SSH continues

Key technical details:
- IFUNC resolver hijacking: the backdoor used GNU indirect-function (IFUNC) resolvers to redirect RSA_public_decrypt in OpenSSL to its own code
- Pre-authentication RCE: the attacker could execute commands before SSH authentication, meaning no password or key was needed
- Stealth: normal SSH traffic worked fine; only specially crafted connection attempts triggered the backdoor
- Targeted activation: the backdoor only activated on x86_64 Linux systems using glibc and systemd with patched OpenSSH (Debian/Ubuntu style)
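Because the hijack rode in on IFUNC, one place to look on a live system is the IFUNC symbols a library exports. Note that xz uses IFUNC legitimately (for CPU-specific CRC routines), so their presence alone proves nothing; the library path below is an assumption for a typical x86_64 Debian-style layout:

```shell
# List IFUNC symbols in liblzma (path is distro-dependent).
readelf --dyn-syms /usr/lib/x86_64-linux-gnu/liblzma.so.5 | grep -i ifunc
# Legitimate entries (CRC implementations) are expected; the backdoor hid
# its hook inside exactly this legitimate-looking mechanism.
```

That is part of what made the attack so hard to spot: the suspicious machinery was indistinguishable from machinery the library already used for valid reasons.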
Which Systems Were Affected?
The backdoor was caught before reaching stable releases of most distributions:
Affected (unstable/testing):
├─ Fedora 40 (beta), Fedora Rawhide
├─ Debian testing/unstable (Sid)
├─ openSUSE Tumbleweed
├─ Kali Linux (rolling)
└─ Arch Linux (shipped 5.6.x, though its sshd does not
   link liblzma, so the backdoor stayed dormant there)
NOT affected:
├─ Debian stable (Bookworm)
├─ Ubuntu 22.04/24.04 LTS
├─ RHEL / AlmaLinux / Rocky Linux
├─ Amazon Linux
├─ Alpine Linux (uses musl, not glibc)
└─ macOS / FreeBSD

If you were running a stable, production distribution, you were almost certainly safe. The attack was caught roughly two weeks before the compromised versions would have propagated to stable channels.
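If you want to check a machine today, the simplest test is the installed xz version, since 5.6.0 and 5.6.1 are the only backdoored releases. A minimal sketch:

```shell
# Flag the two known-bad xz releases (5.6.0 and 5.6.1).
ver=$(xz --version 2>/dev/null | awk 'NR==1 {print $NF}')
case "$ver" in
  5.6.0|5.6.1) echo "WARNING: backdoored xz release ($ver) installed" ;;
  "")          echo "xz not found" ;;
  *)           echo "xz $ver is not a known-bad release" ;;
esac
```

Distributions that shipped the bad versions pushed downgrades or rebuilt packages within days of disclosure, so an up-to-date system will not report a hit.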
Why This Attack Is Terrifying
The patience
Three years. The attacker spent three years building a reputation, contributing real code, earning trust — all for a single backdoor. This isn't a script kiddie. This is the kind of operation typically associated with nation-state actors.
The single point of failure
One burned-out maintainer controlled a library that sits in the critical path of SSH on most Linux distributions. The attacker exploited a human vulnerability, not a technical one.
The detection was pure luck
Andres Freund wasn't doing a security audit. He was doing performance benchmarking and happened to notice a 500ms anomaly. If the backdoor had been slightly more efficient, it might have gone undetected for months or years.
The build system as attack vector
The malicious code wasn't in the source files that developers read. It was in binary test fixtures, extracted by build scripts during compilation. A git diff wouldn't show you anything suspicious.
# What you'd see reviewing the source:
$ git log --oneline
a1b2c3d Add test files for corrupt archive handling
d4e5f6g Update build configuration for portability
# What you wouldn't see:
# The binary test files contain an encrypted payload
# The build script silently extracts and injects it

Lessons for the Industry
1. Fund critical infrastructure maintainers
A single volunteer maintained compression software used by billions of devices. That's not sustainable, and it creates exactly the kind of pressure that attackers exploit.
What xz Utils needed:
├─ Multiple maintainers (bus factor > 1)
├─ Financial support for the maintainer
├─ Corporate sponsorship or foundation backing
└─ Regular security audits
What xz Utils had:
└─ One person, volunteering, burned out

2. Review binary artifacts and build scripts
Most code review focuses on source code. The xz backdoor was hidden in binary test data and build system modifications — areas that rarely get scrutinized.
# Things to audit that usually get skipped:
- Binary test fixtures (can contain anything)
- Build scripts (m4 macros, autoconf, cmake)
- CI/CD configuration changes
- Vendored dependencies

3. Verify reproducible builds
If distributions had verified reproducible builds, someone could have noticed that the compiled output didn't match what the source code should produce. (Notably, the malicious build-to-host.m4 existed only in the release tarballs, not in the public git tree, so comparing tarballs against the repository would also have flagged it.)
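In miniature, the reproducible-builds check is just "two independent builds of the same source must hash identically". The files below are stand-ins for real build artifacts:

```shell
# Two builds of the same source must be bit-identical.
printf 'pretend build output' > /tmp/build-A.bin
printf 'pretend build output' > /tmp/build-B.bin
sha256sum /tmp/build-A.bin /tmp/build-B.bin   # identical hashes = reproducible
# A mismatch between a distro's own build and one from the maintainer-supplied
# tarball is exactly the signal that would have exposed xz.
```

In practice this also requires normalizing timestamps, paths, and toolchain versions, which is why reproducibility remains an ongoing effort rather than a one-line fix.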
4. Be skeptical of social pressure to add maintainers
The sock puppet campaign pressuring Lasse Collin was a key part of the attack. Communities should support maintainers, but granting commit access should involve thorough vetting — not guilt.
5. Monitor for behavioral anomalies
The backdoor was caught because of a performance regression. Automated performance testing and behavioral monitoring on critical system libraries could catch similar issues earlier.
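A minimal sketch of such a check, using `true` as a stand-in for the real operation (e.g. a scripted SSH login) and a made-up 100 ms baseline:

```shell
# Fail loudly when an operation takes far longer than its recorded baseline.
baseline_ms=100
start=$(date +%s%N)                # GNU date: nanoseconds since epoch
true                               # stand-in for: ssh -o BatchMode=yes host exit
end=$(date +%s%N)
elapsed_ms=$(( (end - start) / 1000000 ))
if [ "$elapsed_ms" -gt $(( baseline_ms * 3 )) ]; then
  echo "ALERT: ${elapsed_ms}ms (baseline ${baseline_ms}ms)"
else
  echo "ok: ${elapsed_ms}ms"
fi
```

Run from CI against a canary host, even a check this crude would have flagged a 100ms-to-600ms regression in SSH authentication.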
What Changed After XZ
The xz incident accelerated several industry efforts:
- OpenSSF ramped up efforts to identify and fund critical open-source projects
- Linux distributions tightened their review processes for upstream changes
- CISA issued guidance on software supply chain integrity
- Reproducible builds gained more urgency across distributions
- Companies started taking maintainer burnout seriously as a security risk, not just a community problem
Final Thoughts
The xz backdoor wasn't a failure of technology. It was a failure of the ecosystem. A critical piece of infrastructure was maintained by one person, under pressure, with no resources — and a patient attacker exploited every part of that.
The scariest question isn't "how did this happen?" It's "how many other projects look exactly like xz Utils did before March 2024?"
The answer is: thousands. And until the industry takes open-source sustainability as seriously as it takes security scanning, the next xz is just a matter of time.
For practical steps to protect your own infrastructure, read Linux Server Hardening: A Practical Guide for AlmaLinux and Ubuntu.