The Training Data Trap: Opt Out and Lose Your Chats
Google insists it doesn’t train its foundation AI models on your Workspace content. That’s technically true, but only if you ignore the giant loophole: when Gemini in Gmail or Drive summarizes your emails or files, that generated output is fair game for training. Google says it ‘filters and reduces’ personal info, but there’s no way to verify that claim. To stop this, you must disable the ‘Gemini Apps Activity’ setting, which also deletes your entire chat history. This isn’t a privacy choice; it’s a hostage negotiation. You either forfeit your conversation memory or let Google mine your data. The setting is buried three clicks deep in an obscure menu labeled ‘Activity,’ and it’s completely absent from the main privacy dashboard. That’s by design.
The Dark Pattern Playbook: Breaking Gmail to Keep Gemini
Want to turn off Gemini in Gmail? Good luck. Google hides the toggle behind ‘Smart Features,’ a checkbox that also kills your inbox tabs, Smart Compose, and package tracking. Flip it off, and your neatly organized inbox turns into a firehose of 500 unread emails. Then a pop-up nags you to turn everything back on, Gemini included. There’s a second toggle a layer deeper, but even that doesn’t remove the Gemini UI buttons; clicking them just prompts you to re-enable the AI. This is textbook dark pattern design: ‘obstruction’ and ‘forced action.’ Marie Potel of Fair Patterns calls it unacceptable. Google is turning its own useful features into leverage, cornering users into AI adoption. The company’s $185 billion AI bet in 2026 depends on you not finding the exit.
The Defaults That Bind You
Google knows the power of default settings better than anyone, having paid billions to secure them on iPhones. Now it’s weaponizing that power for AI. The default is data sharing. The default is AI summaries. You can change this, but as Dr. Harry Brignull notes, hiding the option three to four clicks deep ensures almost no one does. This isn’t an oversight; it’s an anticompetitive strategy dressed up as privacy controls. Google gives you the illusion of choice while building a maze that makes choosing privacy painful. The real lesson? In the AI era, your data is Google’s product, and the only shortcut to protecting it is to stop using Google’s tools altogether.
Source: Ars Technica
