Last verified on 26 April 2026. Pricing, features, and policies change – check the source if anything looks off.
“Please don’t enter confidential information that you wouldn’t want a reviewer to see.”
– Google’s own privacy documentation for Gemini

Ostler is the only AI assistant that actually knows your life – every face, every message, every meeting, every email loaded in, all kept on your Mac. In April 2026, Google and OpenAI both launched AI assistants for Mac that do something very different: they read your screen and ship it to corporate servers. Gemini reads it when you share a window; Codex’s Chronicle captures it automatically, every few seconds, for $100+ a month. Codex is blocked outright in privacy-conscious jurisdictions like the EU. Ostler knows more about you than either – and nothing ever leaves your Mac.
| | Ostler | Gemini for Mac | OpenAI Codex |
|---|---|---|---|
| Where your data lives | Your Mac. Never leaves. | Google’s servers | OpenAI’s servers |
| Screen capture | Never reads your screen | Reads when you share a window | Auto-captures periodically |
| Who reads your data | Nobody. Ever. | Human reviewers at Google | OpenAI staff & contractors |
| Used to train AI | No | Yes (opt-out, not opt-in) | Yes (opt-out for ChatGPT) |
| Available in EU/UK | Yes | Yes | No (blocked in EU/UK/CH) |
| Works offline | Yes | No | No |
| Computer control | Never moves your cursor | No | Operates apps with own cursor |
| Price | $49.99 + $24.99/mo | Free (you are the product) | $100+/mo (Pro tier) |
| Knows your relationships | Knowledge graph of everyone you know | No relationship awareness | No relationship awareness |
| Personal knowledge | 20 platforms imported, searchable | No personal knowledge base | Memory file from screen captures |
| Messaging channels | iMessage, WhatsApp, email | Desktop chat only | Desktop chat only |
| Conversation history | Local, encrypted, yours forever | On Google’s servers | On OpenAI’s servers |
Google’s own support documentation states that human reviewers read a subset of your Gemini conversations. These reviewed conversations are retained for up to three years, even after you delete them from your account. Even if you turn off “Gemini Apps Activity,” your data is still retained for 72 hours “for safety.”
The default setting uses your conversations to train Google’s AI models. You can opt out, but the setting is on by default. Most people will never find it.
And this is what Google themselves recommend: “Don’t enter confidential information that you wouldn’t want a reviewer to see.”
Think about what that means for a personal AI assistant. The whole point of a personal assistant is to help with personal things – your relationships, your calendar, your emails, your life. Google is telling you not to use their personal assistant for anything personal.
According to Google’s documentation, Gemini for Mac collects:
- **Your prompts and conversations.** Every question you ask, every piece of context you share.
- **Your screen content.** When you share a window, Gemini reads what is on your screen and sends it to Google’s servers.
- **Your contacts and call logs.** Connected through your Google account.
- **Your location.** From your IP address, your Home/Work addresses, or precise GPS.
- **Your device information.** Installed apps, browser settings, device type.
- **Your search and YouTube history.** Connected through your Google account.
All of this is processed on Google’s servers, reviewed by humans, and retained for years.
One day before Google launched Gemini for Mac, OpenAI shipped a Codex update that includes a feature called Chronicle. Chronicle periodically captures your screen automatically – not just when you ask it to. It processes those screenshots into text summaries that build long-term memory about what you have been working on.
The screenshots go to OpenAI’s servers for processing. Reports note that the data is sent without end-to-end encryption. The feature requires a $100+/month Pro subscription and is blocked entirely in the EU, UK, and Switzerland – jurisdictions where regulators take privacy seriously.
Codex also includes a “Computer Use” feature where the AI operates your Mac apps with its own cursor – clicking, typing, navigating between windows. The boundaries of what it can do, and what it sees while doing it, are still being worked out by security researchers.
This is the direction Big Tech is taking personal AI: more access, more capture, more cloud processing, less privacy. Ostler is the opposite direction.
Everything Ostler processes stays on your Mac. The AI runs locally via Ollama. Your databases are encrypted with a passphrase only you know. There is no server. There is no reviewer. There is no retention policy because nobody else has your data.
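To make “runs locally via Ollama” concrete, here is a minimal Python sketch that sends a prompt through Ollama’s standard local HTTP API. The helper function and the model name are illustrative assumptions – Ostler’s actual integration is not public – but the key point is visible in the endpoint itself: `localhost`, so the prompt never crosses the network.

```python
import json
import urllib.request

# Ollama's default local API endpoint. Everything sent here stays on the
# machine -- there is no remote server in the loop.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local(prompt: str, model: str = "llama3.1:8b") -> str:
    """Send a prompt to a locally running Ollama model and return its reply.

    The model name is a placeholder; any model pulled into Ollama works.
    """
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Unplug the network cable and this call still succeeds, because both ends of the HTTP request live on the same Mac.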
Ostler does not read your screen. The optional Remote Conversations feature uses your microphone to transcribe meetings – but only when you explicitly start a recording, and the audio never leaves your Mac. Ostler reads only the data sources you explicitly grant it – your contacts, your messages, your calendar – and it never sends any of it anywhere.
To be transparent: Ostler does connect to the internet – but only to pull public information in. It fetches data from Wikidata to enrich person and organisation pages, downloads AI models, and can perform web searches on your behalf. The critical difference is the direction: public data comes in, personal data never goes out. Your contacts, messages, relationships, and knowledge graph never leave your machine. You can unplug your network cable and Ostler still works – you just lose web search and Wikidata enrichment.
Gemini for Mac is free. This is not generosity. Google makes $224 billion per year from advertising, powered by personal data. When a product is free, you are the product. Your conversations, your screen content, your location – all of it feeds the machine that sells ads against your attention.
Ostler costs $24.99/month because it does not monetise your data. The price is the business model. There is no secondary revenue stream from your personal information, no investor pressure to “find ways to monetise the user base.” You pay for the software. That is the entire transaction.
Let us be honest: Gemini is a more capable AI model than what runs locally on your Mac. Google’s cloud infrastructure can run models with hundreds of billions of parameters. Ostler runs models with 5 to 35 billion parameters locally.
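To put those parameter counts in perspective, here is a back-of-the-envelope calculation, assuming 4-bit quantization (0.5 bytes per weight) – a common way to run local models on Apple Silicon:

```python
# Rough memory footprint of local model weights, assuming 4-bit
# quantization (0.5 bytes per parameter). Activations and KV cache
# add overhead on top of this, so treat these as lower bounds.
BYTES_PER_PARAM_4BIT = 0.5

def weights_gb(params_billion: float) -> float:
    """Approximate weight size in GB for a model of the given parameter count."""
    return params_billion * 1e9 * BYTES_PER_PARAM_4BIT / 1e9

print(weights_gb(5))   # 2.5  -> fits easily in 16 GB of unified memory
print(weights_gb(35))  # 17.5 -> wants a higher-memory Apple Silicon Mac
```

A hundreds-of-billions-parameter cloud model, by the same arithmetic, would need well over 50 GB for weights alone – which is why it lives in a data centre, and why your data has to travel there.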
But Gemini does not know who your friends are. It cannot tell you when you last spoke to someone. It does not track relationship warmth, reciprocity, or conversation history across platforms. It cannot generate a personal wiki of everyone you know. It is a very intelligent stranger. Ostler is a less intelligent friend who actually knows your life.
The gap in model quality is closing fast. Local models today match where cloud models stood 18 months ago; in a year, they will match today’s cloud models. The privacy difference, however, is permanent. Google will never stop collecting your data. That is their business.
Gemini for Mac is a cloud AI with a local interface. It looks like it runs on your Mac, but your data travels to Google, gets read by reviewers, trains their models, and is retained for years.
Ostler is a local AI with local data. Nothing leaves your machine. Your life stays yours. The AI is smaller, but it actually knows who you are – because you trusted it with your data, and it kept it safe.
The question is not which AI is smarter. The question is: who do you trust with the most intimate details of your life?