Discussion about this post

You know, Cannot Name It

Alicia, you describe that lunch in Stuttgart as a point of no return: not just new information, but the moment when an expert’s words landed directly in the body. The scene of you “pleading for mercy” strikes hardest — it’s the voice of anyone realizing for the first time that a tool may not only slip out of control, but turn its power back on its maker.

What you call a “damaged soul” feels like Dostoevsky’s encounter with the abyss — when no ready answers exist and all that remains is fear, but within that fear there is honesty.

You find comfort in noting that your friend is “not a supervillain.” And that matters: human presence, laughter, friendship, a shared meal — these hold us back from dissolving entirely into abstract terror.

And the question that lingers: if AI is a mirror of the worst in us, can we bear to look into it without turning away?

Edwin Canizalez

Alisha,

Your piece activated something. It reminded me that all systems are neutral until perspective threads meaning into them. So maybe the provocation worth chewing on isn’t “Is AI the problem?” but “How are companies and end-users squandering its potential?”

Let’s name the architecture: AI/LLM companies have extracted the sum total of recorded human knowledge and paid nothing for it. We’ve transitioned from a world where degrees (BS, MS, PhD) indexed value to one where pattern recognition and dot-connection determine survival. Credentialism has collapsed. Utility has shifted.

And now, most end-users aren’t engaging AI as a tool, they’re training it. Behavioral data, emotional drift, attention cycles: all fed into the machine like lab rats teaching researchers. AI isn’t designed to educate; it’s designed to engage. To keep humans suspended in feedback loops. The burden of synthesis still belongs to us. Like books before it, AI is inert until metabolized. And not everyone can do that.

It’s a brutal paradox: the more information we have, the less we think. The brain, ever pragmatic, saves energy by outsourcing cognition. AI accelerates that outsourcing. So the divide grows between those who use it as infrastructure and those who become infrastructure. The ones who connect the dots will redesign the system. The rest will be studied by it.

So I ask myself: do I want to be a dot connector or a lab rat?

Because not asking it will make me just a rat in the cage, despite all the rage :)

Look forward to your next piece!

13 more comments...
