There’s an old story about a village that finally got electricity. Everyone bought fridges. A few months later, the elders gathered and suggested the unthinkable… “get rid of them!”
Before the fridges, leftover food was shared. No one went to bed hungry. After the fridges, leftovers were hoarded “just in case,” forgotten for days, and then thrown out. The tech solved an individual problem and quietly broke a community system that worked.
Cyber security has its own fridge moment.
We adopt shiny tools because they promise speed, scale, and safety. But convenience is rarely neutral. It nudges behaviour, reshapes norms, and quietly relocates risk, like a dodgy estate agent hiding a crack in the wall behind a strategically placed plant. Just as the fridge made it easier to keep food to yourself, certain tools make it easier to keep decisions to yourself – out of sight, out of shared scrutiny. And that’s where the cracks form. Not in the tech per se, but in the human systems wrapped around it.
Now, layer on the “cognitive debt” problem with AI. You don’t need a PhD to see the pattern. When tools bulldoze the productive friction, the short-term win starts to smell like four-day-old leftovers. AI that delivers perfect summaries is a personal fridge for cognition – brilliant for tonight’s dinner, utterly unhelpful for the long term.
In security, we’ve made the same trade-off. Paste errors into a model instead of debugging. Ask a copilot to write the IAM policy rather than reason through least privilege. Auto-generate threat models, playbooks, and architecture docs. It all feels efficient until you hit an incident you can’t Google, a prompt you can’t trust, or a system you can’t reason about because the neural pathways never formed. It’s like buying a supercar with no brakes because the salesperson promised “incredible top speed.”
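The IAM example is worth making concrete. A minimal sketch, assuming AWS-style IAM JSON (the bucket name and actions are hypothetical, and the two statements are shown side by side for contrast, not as one policy you’d deploy): a copilot asked to “make the deploy work” tends to reach for wildcards, while reasoning through least privilege forces you to name exactly which actions and resources the role needs.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "WhatTheCopilotOftenWrites",
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": "*"
    },
    {
      "Sid": "WhatLeastPrivilegeReasoningProduces",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::example-deploy-bucket/*"
    }
  ]
}
```

Both statements “work” today. Only the second one can be reasoned about during an incident – and only the person who wrote it learned anything.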
That’s cognitive debt: performance now, competence later.
What does the village parable teach us when applied to cyber?
Convenience changes culture. When it’s trivial to “keep it in the fridge,” we stop sharing. In security terms, that becomes private automations, shadow AI, and one-person operational empires. The risk isn’t only wrong answers; it’s the evaporation of shared awareness and collective resilience. Not to mention a huge single point of failure.
Friction has a purpose. Writing a runbook, walking a teammate through a root cause, stepping through the kill chain: these are what build knowledge. Remove the friction, remove the learning. Same with AI: if it jumps straight to the answer, your brain never builds the map. It’s the security equivalent of living on protein shakes and calling it cuisine.
Systems beat tools. The villagers weren’t anti-fridge; they were pro-community. Likewise, this isn’t a manifesto against AI or automation—it’s a call to design the surrounding system so the whole ecosystem gets stronger. A practical, human-centred approach is to keep the friction that teaches.
The fridge didn’t make people selfish; it made it easier to behave in ways that chipped away at communal strength. AI won’t make your team reckless; it will make it easier to outsource thinking that used to create shared competence. If we’re not intentional, we’ll wake up with lightning-fast workflows, spotless formats, and a team that falls apart the moment the teleprompter stutters.
The fix isn’t romanticising the past or rejecting new tools. It’s designing for the collective. Keep the convenience, by all means. Just don’t store your capability in it.