Google Project Zero published a full zero-click exploit chain for the Pixel 10 on May 13, 2026. The chain reaches arbitrary kernel read/write without user interaction — no tap, no open, no confirmation — and the core primitive that makes it possible took 5 lines of code to write. The authors, Jann Horn and Seth Jenkins, spent two hours auditing the driver before finding the flaw. The patch arrived 71 days after the report.
This is a write-up of how the chain works, what it means for Android's security architecture, and what the 71-day number actually tells you about vendor patch timelines.
The primary source is the Project Zero article itself. All technical claims below are drawn from that article or the Cyber Kendra coverage of it.
The two-stage chain
The exploit has two distinct stages. Neither stage alone achieves the goal. Together they get from "phone receives an audio message" to "attacker controls the kernel."
Stage 1: Dolby UDC — CVE-2025-54957
The entry point is a vulnerability in the Dolby Unified Decoder Component (UDC), tracked as CVE-2025-54957. The same flaw was used in Project Zero's earlier Pixel 9 exploit chain, published in January 2026. It remained exploitable on Pixel 10 because the underlying component had not been updated.
The mechanism is what makes this zero-click: Google Messages automatically transcribes incoming audio messages before the user opens them. That transcription runs through the Dolby UDC. An attacker sends a crafted audio message. The target device processes it in the background. No interaction is required. The attacker now has code execution in the Dolby UDC process.
That is a constrained starting position — not root, not kernel. Stage 2 handles the escalation.
Stage 2: VPU driver — the Tensor G5 mmap bug
The Pixel 10 runs on Google's Tensor G5 chip, which includes a Chips&Media Wave677DV video decoder. This decoder is exposed to userspace through a new /dev/vpu device interface introduced with the Tensor G5. The driver that manages this interface, the VPU driver, contains a memory mapping flaw.
The relevant handler is vpu_mmap. When userspace asks to map the VPU's hardware registers into its address space, the driver calls remap_pfn_range — a standard kernel function for mapping physical memory pages into a VMA (virtual memory area). The bug is that remap_pfn_range is called with the length of the VMA exactly as userspace requested it, with no bounds check against the actual size of the VPU register region.
Userspace can request a mapping that is much larger than the register region. The kernel will map it. What comes after the register region in physical memory? Everything else. Including the kernel image itself.
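The flaw reduces to a missing length comparison before the remap_pfn_range call. Here is a minimal user-space model of the check the driver needed; the function name and the register-window size are illustrative assumptions, not taken from the actual driver source (which is in the Project Zero article):

```c
#include <assert.h>

/* Illustrative size of the VPU register window; the real value is
 * hardware-specific and not stated here. */
#define VPU_REG_REGION_SIZE 0x10000UL

/* A correct vpu_mmap would reject any VMA longer than the register
 * region before calling remap_pfn_range(vma, vma->vm_start, pfn,
 * vma->vm_end - vma->vm_start, ...). The vulnerable driver passed the
 * userspace-chosen length through with no such comparison. */
int vpu_mmap_should_reject(unsigned long vma_start, unsigned long vma_end)
{
    unsigned long len = vma_end - vma_start;
    return len > VPU_REG_REGION_SIZE;
}
```

With the comparison absent, a request for a 1 GiB mapping at offset 0 succeeds, and the pages past the register window cover whatever physical memory comes next.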
On Pixel devices, the kernel is loaded at a fixed physical address. There is no physical KASLR. Once the attacker has a mapping that covers the kernel's physical location, they have arbitrary physical memory access — and arbitrary kernel read/write follows directly.
The entire exploit primitive, from opening the VPU device to having arbitrary kernel access, required 5 lines of code. Horn and Jenkins found the driver flaw in two hours of manual audit.
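The shape of that primitive can be sketched from user space: open the device node, ask mmap for far more than the register window, and index into the result. This is an illustrative reconstruction, not the published exploit code; the flags and length are guesses, and the function fails cleanly on hardware without the driver.

```c
#include <fcntl.h>
#include <stddef.h>
#include <stdint.h>
#include <sys/mman.h>
#include <unistd.h>

/* Sketch of the primitive's shape: map far more than the VPU register
 * window. On a vulnerable Tensor G5 kernel, the excess pages would cover
 * the physical memory after the register region, kernel image included.
 * Returns NULL when the device is absent or the kernel refuses. */
volatile uint8_t *map_vpu_window(size_t len)
{
    int fd = open("/dev/vpu", O_RDWR);   /* device path per the write-up */
    if (fd < 0)
        return NULL;                     /* no VPU device on this machine */
    void *p = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    close(fd);                           /* the mapping survives the close */
    return p == MAP_FAILED ? NULL : (volatile uint8_t *)p;
}
```

On a patched kernel the oversized request is rejected at the mmap call; on any machine without /dev/vpu the open fails and the function returns NULL.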
Why this driver exists in this form
The upstream Linux kernel integrates the same Chips&Media video IP through the V4L2 framework. The V4L2 integration does not expose the hardware MMIO register interface directly to userspace. The Pixel 10 driver takes a different approach: it exposes the raw hardware interface. That is what created the attack surface.
This matters because it is not primarily a bug in the Chips&Media hardware or in the upstream kernel. It is a bug in the Pixel-specific driver that Google wrote to integrate the hardware into Android. The attack surface was created by the integration layer, not the component being integrated.
This is the second consecutive Pixel generation where Project Zero has found a high-severity driver vulnerability in a newly introduced hardware component. The Pixel 9 had the BigWave driver, rated "Moderate" by Android VRP. The Pixel 10 VPU driver was rated "High." The pattern is: new hardware, new driver, new attack surface that did not exist to be audited before the hardware shipped.
The 71-day number
The VPU vulnerability was reported on November 24, 2025. The patch shipped in the February 2026 Android security bulletin. That is 71 days.
Project Zero's standard disclosure window is 90 days. Patching in 71 days means the vendor beat the deadline — something Project Zero notes is "the first time that an Android driver bug [it] reported was patched within 90 days." Read that again: the first time. Typical Android driver patch timelines have historically exceeded the 90-day disclosure window.
For comparison, the Pixel 9's Dolby UDC vulnerability (CVE-2025-54957, the same flaw reused in stage 1 of this chain) took 82 days to patch; the Pixel 10 VPU bug took 71. The Dolby UDC bug from the Pixel 9 chain remained exploitable long enough to be reused in the Pixel 10 chain — because the component had not been updated on the newer device.
This gives you two data points in the same article:
- A driver-level kernel bug: 71 days
- A component-level bug carried forward to a new device: still present at the time of exploitation
The marker that matters for your fleet is the device's security patch level (SPL). Any Pixel device running the December 2025 SPL or earlier is unpatched against both stages of this chain. The February 2026 SPL closes both.
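Because Android SPLs are ISO dates (the `ro.build.version.security_patch` system property, e.g. `2026-02-05`), a fleet compliance check reduces to a string comparison. A minimal sketch; the cutoff constant is the assumption here, chosen so that any February 2026 bulletin date passes:

```c
#include <assert.h>
#include <string.h>

/* ISO YYYY-MM-DD dates compare correctly as plain strings, so a device
 * is patched against both stages iff its SPL sorts at or after the
 * February 2026 bulletin. */
static const char PATCHED_SPL[] = "2026-02-01";

int spl_is_patched(const char *device_spl)
{
    return strcmp(device_spl, PATCHED_SPL) >= 0;
}
```

The device-side value comes from `adb shell getprop ro.build.version.security_patch`.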
What zero-click actually means for your threat model
Zero-click is used loosely. In this case it means exactly what it says: the attack executes when the message is received and processed by the system, before any user action. The trigger is the transcription pipeline, not the user.
For an enterprise fleet running Pixel devices, this threat model is not theoretical. The attacker needs to know a phone number or a Google account that can receive Messages. They send an audio message. They wait. The rest is automated.
The mitigating factor is that there is no evidence this chain has been weaponized in the wild — Project Zero published it as a research demonstration, not as a forensic analysis of active attacks. But the chain is now documented in a public article with enough specificity that the gap between "research" and "tooled attack" is not large.
The practical risk for a managed fleet before February 2026 SPL: real. After February 2026 SPL: both stages are patched.
What this says about Android's driver security posture
Android's security model has a structural tension that this chain illustrates cleanly. The kernel is the security boundary that matters most. The drivers that get closest to the kernel are also the ones most likely to be written by teams under hardware-launch pressure, on tight timelines, without the security review depth that a component like the Linux V4L2 subsystem has accumulated over years.
The VPU driver was audited manually by two researchers in two hours and yielded a high-severity flaw. That is not a statement about the competence of the driver authors — two hours of audit by Jann Horn is not a fair benchmark. It is a statement about the depth of review the driver received before it shipped on a consumer device.
The upstream V4L2 integration exists, handles the same hardware class, and does not expose raw MMIO to userspace. The choice to write a custom driver instead of using the upstream integration introduced an attack surface that the upstream approach does not have. Whether that trade-off was made consciously or by default is not something the Project Zero article addresses.
What I take from this: the security gap in Android driver land is not primarily about individual bugs. It is about the surface area created by custom integration layers for new hardware, and about how much audit that surface receives before it ships. Two hours is a good lower bound for what a driver like this needs. Whether it got that before the Pixel 10 launched is not documented.
What to do
If you manage Pixel devices:
- Check the security patch level: Settings → About phone → Android version → Android security update. You want February 2026 or later.
- If you are running MDM (Android Enterprise, Intune, Jamf), enforce the February 2026 SPL as a compliance requirement. Devices below it should not have access to sensitive resources until updated.
- The attack surface here is Google Messages' automatic transcription. There is no practical way to disable automatic message processing on a managed Android device without restricting Messages itself. The patch is the fix.
The update path is straightforward for devices still receiving security updates. The problem is the tail: Pixel devices that have aged out of the support window, corporate-issued devices on slow update approval cycles, and personal devices where the user has not tapped "install."
For any Pixel fleet, February 2026 SPL is the line. Devices below it are exposed to a publicly documented, technically complete exploit chain that requires zero user interaction to trigger.
Read the full Project Zero write-up for the complete exploit code, the responsible disclosure timeline, and the comparison to the Pixel 9 chain.