
This morning, I tried to power down my Samsung S23 smartphone.
I long-pressed the side key expecting the usual “Power off / Restart” menu. Instead, a small Gemini prompt window appeared towards the bottom of my screen.
That small moment stopped me. Instead of powering down, I had activated Google’s AI assistant. That wasn’t at all what I intended. But it’s what Android and Samsung decided I needed.
What sprung to my mind was one Edward Snowden and the summer of 2013. Snowden ignited public outrage by exposing the vast scale of government surveillance. Back then, the concern was interception — governments compelling access to data flows, often with quiet cooperation from tech giants.
What’s happening now is harder to see. Surveillance hasn’t vanished — it has mutated. It no longer hides in cables or server racks. It lives in interfaces. It rides in your hand.
Remapped without asking
That side key used to mean power and reset — a hardware-level command the user controlled. Now, without notice, it’s been remapped to summon an AI system that listens, interprets, and transmits.
Samsung and Android decided that the S23 side key best serves users by directing us to Gemini’s voice prompting window – and thus Google’s encroaching architecture of engagement in the super?heated AI race.
This is mission creep by design. A utility redefined as a capture point. Gemini doesn’t just respond; it mediates. It reframes. It nudges. Once your phone behaves this way, you’re not issuing commands — you’re interacting on terms you didn’t set.
Which brings us to a deeper concern: manipulation capacity.
Once an AI layer inserts itself between your prompt and the system’s response, the potential shifts. The assistant becomes a gatekeeper. Some facts rise. Others recede. The framing changes. Over time, that framing becomes habit — a pattern shaped less by your inquiry than by the platform’s incentives.
Colonizing the user interface
This isn’t just about surveillance. It’s about control of intent. A device that interprets your voice can begin to shape your thoughts. What you meant becomes what it heard. What you hear back becomes the new baseline.
That’s not wiretapping. It’s interface colonization.
Snowden understood this instinctively. What gets hard-coded into platforms becomes the new default. And once defaults shift, they rarely shift back.
Apple’s 2016 standoff with the FBI over a locked iPhone made the stakes clear: build a backdoor once, and the expectation never leaves. Apple resisted — citing long-term consequences for every user.
But now we’re not being compelled by courts. We’re being conditioned by convenience. Side-key remapping isn’t a one-off feature. It’s a signal. A quiet narrowing of user control, delivered as an upgrade.
Manipulating users
The architecture is already changing. Tech Crunch recently reported on how Gemini stores conversations for up to 72 hours, even with history disabled. Some may be flagged for human review. Google is adding memory features that retain personal context automatically. And recent Android updates let Gemini access apps like Messages and Phone, even when certain privacy toggles are off.
Users report Gemini activating unprompted. Security researchers have flagged input vulnerabilities — including silent command injection via hidden characters. Google has downplayed these risks, framing them as “social engineering.”
None of these developments on their own are catastrophic. But together, they describe a structural drift — a gradual rewriting of what basic user interactions mean.
This matters because interface is the new locus of power. What you can ask. How you ask it. Who decides what you hear back. These are no longer UX decisions — they’re systems of influence.
Outrage isn’t a thing anymore
When the gatekeeper is a platform with commercial and geopolitical entanglements, the potential for manipulation scales. Queries may be reframed differently by region, topic, or policy sensitivity — without the user ever knowing. That’s not just a privacy risk. That’s a trust risk.
Outrage isn’t a thing anymore. Big Tech blew past the creepy line years ago — now it’s crossing the trust line. What’s left is the need for clarity: transparent defaults, assertive control, and the unambiguous right to turn your own device off.
But where will that clarity come from? Not from regulators, who are too slow. Not from platforms, who are too conflicted. It’ll have to come from somewhere else — from a new kind of pressure. Quiet, broad, and hard to ignore.
I’ll keep watching, and keep reporting.

Acohido
Pulitzer Prize-winning business journalist Byron V. Acohido is dedicated to fostering public awareness about how to make the Internet as private and secure as it ought to be.
(Editor’s note: I used ChatGPT-4o to accelerate and refine research, assist in distilling complex observations, and serve as a tightly controlled drafting instrument, applied iteratively under my direction. The analysis, conclusions, and the final wordsmithing of the published text are entirely my own.)
October 28th, 2025 | My Take | Top Stories