The healthy role of AI in 2026: buffer toward humans, never substitute


April 19, 2026 · 5 min read

TL;DR — I've figured out an empirical rule about AI's healthy role in our lives. AI can be extraordinary at lowering the friction toward other people and professionals. This must be its role — not the replacement of the human, but the buffer that fills the infinite frictions capitalism puts between us and the people we need. It's fine to vent at 3am if you have a therapist on Tuesday. It's fine for a marketing idea if you then rely on a serious professional.

After months of using AI and having my teams use it, testing models, watching the good and the bad it produces in people, I've arrived at a generalisation that seems to hold about AI's only healthy role. Everything else — replacing humans, becoming daily company, substituting for real relationships — is crooked, even if today it's crooked in a fashionable way.

Here's the empirical rule: AI can be extraordinary at lowering friction toward other people and professionals. This must be its role. Nothing else. Nothing less, nothing more.

What I mean by "friction"

Society — especially the advanced capitalist one we live in — has infinite frictions that activate precisely when you'd need people. Concrete examples:

- A professional can't and shouldn't be up 24/7 for you.
- A friend has their own life.
- A doctor has hours.
- An agency closes at 7pm.

These frictions are real. They're nobody's fault. Yet the need comes when it comes, and it's not always patient.

Neither I, nor your friends, nor any professional can always be there. Capitalism and society have infinite frictions. AI can ease the problem. Act as a buffer. As long as there's a human afterwards.

Three examples, three healthy uses

Example 1: 3am anxiety. It's fine if at 3am you wake up in a vortex of loneliness and talk to an AI for twenty minutes, to vent, to name what you feel, to not go crazy alone. As long as on Tuesday you actually go to your therapist. And this week you make an effort to see a friend.

Example 2: the business idea. It's fine if you have a marketing idea at 10pm on Saturday and you develop it with an AI to see if it holds, to outline it, to anticipate objections. As long as on Monday you turn to a real professional to make it something serious. An AI doesn't know you, doesn't know your market, has no responsibility for the outcome. An agency does.

Example 3: medical tests. It's fine if you check the trend of your blood test results over the years with an AI, if you're afraid of something, if you want to understand what to ask the doctor. As long as next month you actually go to the specialist. Don't self-diagnose with ChatGPT. Don't do AI therapy.

Why "AI only" is a trap

The problem with AI as a total replacement is not the quality of the answer — often it's surprisingly good. The problem is structural:

1. AI doesn't have long-term memory of you. Even systems with memory are closed boxes: they don't really know you. They don't know you're still shaken by the 2024 breakup, that your father had that issue, that you tend to lie when stressed. A professional who follows you over time does.

2. AI has no skin in the game. If it gives you bad advice, it loses nothing. No professional liability, no licence suspension, no damages. Your specialist risks a career if they give you a risky opinion. AI risks zero. This asymmetry matters.

3. AI doesn't replace presence. There are things — a hand on the shoulder, a look that recognises you, a shared silence — that a language model doesn't replicate. And the lack of presence, over time, is a nutritional deficit of the psyche.

The rule in practice

The useful question, before opening the AI chat, is this: am I using AI to get closer to a human, or to avoid having to look for a human?

If AI is a buffer — it eases my urgency until I reach the human I need — fine. If AI is a surrogate — I no longer look for the human because AI is enough — not fine. It's the same difference between chewing gum when you're hungry while travelling, and chewing gum instead of eating.

Context matters

One more thing, to be honest: AI works better in some areas than others, and its buffer role changes accordingly.

Conclusion

AI is the biggest social shock absorber of our era. Use it to soften the blow when the system doesn't respond to you in time. But if you start living inside it, you stop asking the system to improve — and the system stops improving.

Act as buffer. Seek the human. Repeat.
