When Apple ships Siri via Gemini, that is not a threat. It is validation.

Andy Massey · 25 April 2026 · Hong Kong

Every couple of weeks, someone asks me what I will do when Apple announces the new Siri. The implied question is always: aren't you scared?

I am not. And I want to explain why, because I think the framing most people bring to this is exactly backwards.

What is about to happen

At WWDC in June, or in the weeks leading up to it, Apple is widely expected to announce a significant upgrade to Siri. What has leaked so far points to conversational interaction, deeper app integration, personal-context awareness, and – the interesting bit – a backend partnership with a frontier cloud model. Current reporting names Google Gemini as the partner for "queries that require web search."

When that announcement lands, two things will happen. The tech press will frame it as Apple finally catching up on AI. A subset of founders and investors in the personal-AI space will privately panic, because a company with two billion devices and a trillion-dollar balance sheet has now entered their market.

Both of those reactions are wrong.

What Apple is actually telling you

Apple has spent twenty years positioning itself as the privacy-first platform. "What happens on your iPhone stays on your iPhone." That is not a throwaway tagline. It is the centrepiece of how they differentiate against Google, and they have been willing to take billions of dollars of revenue hits to defend it.

If Apple could build a frontier personal assistant entirely on-device, they would. They have the hardware, the OS, the user base, and the capital. They would not need Google. They would prefer not to need Google. Every dollar they route to Gemini is a dollar of their own margin plus a strategic dependency on a direct competitor.

They are doing it anyway. That tells you something important: even Apple cannot build a genuinely useful personal AI entirely on-device with the tools available to them in 2026. The Apple Intelligence Foundation Models framework, limited to macOS Tahoe and a 4k-token context window, is not enough. Siri needs more. More reasoning. More context. More flexibility. So Apple is doing the thing they spent two decades promising they would not do, and routing your queries through someone else's servers.

The most privacy-obsessed consumer-tech company on the planet just conceded that they cannot build a personal AI locally. Read that twice. It tells you exactly how hard the problem is, and exactly how large the market is.

If the problem were easy, Apple would have solved it. If the market were small, Apple would not be bothering. The fact that they are compromising on the principle they are most famous for defending is the loudest possible signal that personal AI is the category of the decade.

The cloud-routed category vs the local-first category

Here is the frame I think is more useful than threat-versus-validation.

Personal AI is splitting into two categories: cloud-routed personal assistants and local-first personal assistants. They look similar from a mile away – both answer questions about your life, both read your messages, both know who your friends are. Under the hood they are entirely different products with entirely different trust models.

Cloud-routed personal AI is Apple with Gemini, Google with its own stack, Perplexity Personal Computer, Poke, Omi, most of the well-funded startups in the space. Their reasoning happens on someone else's hardware. Their memory lives on someone else's database. The value proposition is "the most capable model possible, fed your life." The trust proposition is "we promise to be careful."

Local-first personal AI is Ostler and a small but growing number of others. Reasoning on your machine. Memory on your machine. Network access only for queries you explicitly allow. The value proposition is "good enough intelligence, with an unbreakable trust guarantee." The trust proposition is architectural, not contractual. We cannot leak your data because we do not have it.
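The "network access only for queries you explicitly allow" guarantee is an architectural property, and it is small enough to sketch. This is not Ostler's actual implementation – the class and method names below are hypothetical – just a minimal illustration of the gating idea: the network-touching code path is unreachable unless the user has opted a query in, and every routing decision is auditable.

```python
# Hypothetical sketch of a local-first query gate (not Ostler's real code).
# Invariant: nothing leaves the machine unless the user explicitly allowed
# network use for that query; everything else is handled by the local model.

from dataclasses import dataclass, field

@dataclass
class LocalFirstAssistant:
    # Queries the user has explicitly opted in to sending over the network.
    network_allowlist: set = field(default_factory=set)
    # Record of every routing decision, so the user can verify the guarantee.
    audit_log: list = field(default_factory=list)

    def answer(self, query: str) -> str:
        if query in self.network_allowlist:
            self.audit_log.append(("network", query))
            return self._cloud_call(query)   # the only path that touches the network
        self.audit_log.append(("local", query))
        return self._local_model(query)      # default: stays on-device

    def _local_model(self, query: str) -> str:
        # Stand-in for on-device inference.
        return f"[local] {query}"

    def _cloud_call(self, query: str) -> str:
        # Stand-in for an explicitly permitted cloud query.
        return f"[cloud] {query}"

assistant = LocalFirstAssistant(network_allowlist={"latest exchange rate"})
print(assistant.answer("draft a reply to Sam"))   # routed to the local model
print(assistant.answer("latest exchange rate"))   # explicitly allowed, goes out
```

The point of structuring it this way is that the trust claim becomes checkable: a reader of the code (or an auditor of the log) can confirm that the default path never opens a socket, rather than relying on a policy promise.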

Both categories will exist. Both will have real customers. The interesting question is what ratio of the market ends up in each, and how quickly.

Why I think the split goes our way over time

Three reasons.

First, local models keep getting better at a compounding rate. The cost of reaching a given quality level with open-weight models has fallen 280x in two years. A 9-billion-parameter model running on a Mac Mini today does what required a cloud call two years ago. Next year it does more. At some point the capability gap between cloud and local shrinks below the threshold where users care, and all that is left is the trust gap, which points the other way.

Second, the first serious privacy breach in the cloud-routed category will reshape the conversation permanently. It has not happened yet. It will. Personal AI products sit on top of the most intimate data any platform has ever aggregated – your messages, your calendar, your medical appointments, your relationships, your half-formed thoughts. The first time a breach or a subpoena or a terms-of-service change exposes that at scale, every cloud-routed personal AI product will have to answer for the architecture choice. The ones that made the other choice will not.

Third, and this is the subtlest one: the things personal AI is good at are increasingly things you do not want a frontier model doing. "Draft a reply to that thread we were on last week" does not benefit from a trillion parameters of internet knowledge. It benefits from knowing how you write and what you were last discussing. That knowledge lives on your device. Cloud routing is actively hostile to it, because every round trip ships that intimate context off the machine.

The confident posture

I do not think the right response to Apple's announcement is to compete on their turf. They have the integration, the distribution, the default. We will not out-Apple Apple on general-purpose cloud-assisted queries, and we should not try.

The right response is to lean harder into the category Apple is implicitly confirming exists but has chosen not to serve. The category where the user says: I want this, I want it good, and I want it to stay on my machine. That is the bet. Apple's announcement, whenever it lands, makes the bet more obvious, not less.

So when the WWDC keynote happens, and Siri does something impressive via a cloud partner, and the tech press talks about how Apple is finally back in the AI race – we will probably write another blog post. The headline is going to be some variant of "thank you for validating the category." Because that is what it will be.

The contrarian bet the market is now demanding

Ostler is the local-first personal AI. Runs on your Mac. Stays on your Mac.


Agree, disagree, think I have read the leaks wrong – hello@ostler.ai.