Thank you, I really appreciate that.
Yes, I’ve read “The Machine Stops,” and it’s hard not to feel it hovering over moments like this. Forster saw the danger early. What he couldn’t have known is how normalized the machine would become, or how willingly we’d narrate its failures and keep feeding it anyway.
My instincts tend to run a bit later. More Pat Cadigan, a little J.G. Ballard. Less catastrophic collapse, more systems that keep functioning long after they stop making human sense. I’m interested in the quiet failure modes, the ones that don’t trip alarms but slowly change how people trust, notice, and relate.
If this landed for you, that’s probably the overlap.
Exactly. Lampshading is the right word for it.
Once acknowledging the problem becomes the whole move, relevance replaces responsibility. The ad doesn’t promise to fix anything. It just proves it knows the vibe. That awareness is treated as absolution.
“AI is scary, but trust our AI.”
“Work sucks, so automate yourself out of it.”
“There’s a wealth gap, here’s a checkout button.”
None of it is persuasion anymore. It’s alignment theater. The point isn’t to convince you. It’s to make sure you don’t recoil.
And yeah, the He Gets Us ads are a whole separate category of grim. When even moral language is reduced to brand-safe tone, you’re not being spoken to. You’re being processed.
I’ve got a few essays in the drafting stage on moral coercion: how systems use shared values to narrow choices without looking like force. This ad cycle feels like a case study.