Blog

Notes from our founder on local-first personal AI, architecture, and the industry moves that keep confirming the bet. Written in Hong Kong.

The diplomat, the researcher, and the founder: three independent verdicts on local-first personal AI

Three completely different vantage points have converged on the same architecture for personal AI over the past six months. Singapore's Foreign Minister using it daily on a Raspberry Pi. Andrej Karpathy describing it on stage. A founder shipping it to customers. They had no reason to agree. They did anyway.

When Apple ships Siri via Gemini, that is not a threat. It is validation.

Apple is about to concede the category. The fact that the most privacy-obsessed consumer-tech company on the planet cannot build a personal AI locally tells you exactly how large the market is, and why local-first is now the contrarian trade worth making.

OpenAI shipped an open-weight PII model. We are wiring it in.

On 21 April 2026, OpenAI released Privacy Filter as open weights under Apache 2.0. It runs locally, detects eight categories of PII, and slots directly into Ostler's ingest, diagnostic, and pre-flight pipeline. Here is why, and what the release signals.

Karpathy described the architecture. We already built it.

On Dwarkesh Patel's podcast on 17 October 2025, Andrej Karpathy argued that a small reasoner with external memory beats a 1.8-trillion-parameter monolith. That is the architecture Ostler has been running since late 2025. Here is what it means for local-first personal AI.

Why I built a personal AI that never touches the cloud

After twenty years of giving my data to tech companies, I built a personal knowledge graph that runs entirely on a Mac Mini. This is the story of how that system came to be, and why it matters that nothing leaves the house.