By Kjartan Rist, Contributor
AI is smart, but we’re missing the human layer.
AI is everywhere. It is composing symphonies, coding apps, and answering your customer support tickets with unnerving politeness. But let me ask you this: if AI is so smart, why can’t it understand your sarcasm when you say “great job” after spilling coffee on your laptop?
We are living in the golden age of intelligence, but street-smart? Still loading…
Nevertheless, the hype is justified. Generative AI is a force that could reshape everything from healthcare to corporate strategy. But while these systems can mimic logic and language, they are still emotionally tone-deaf. Think Bill Gates doing stand-up comedy – impressive, but not quite right.
The AI Value Chain: From Raw Data to Digital Soulmates
If you want to understand the chaos under the AI hood, look at the AI value chain: from data and infrastructure to models, agents, and applications. It’s a bit like making a soufflé with a dozen chefs, each convinced their technique is best.
Start with data: oceans of it, structured and unstructured. Feed it to infrastructure providers like NVIDIA (who, by the way, are selling pickaxes in the modern-day AI gold rush). Then enter the model zone: OpenAI, Anthropic, Mistral. Agents sit layered on top – think GitHub Copilot or Salesforce’s Agentforce – automating workflows and rewriting the productivity playbook.
According to a Goldman Sachs report published in July 2025, supporting AI’s growth will require some $5 trillion in infrastructure spending, with hyperscalers like Microsoft and Amazon collectively investing a trillion dollars in GPU-powered data centres by 2027 (p. 4).
In Texas, the Stargate project (OpenAI + SoftBank + Oracle) plans to build a megastructure powered by ambition and cheap electricity.
We’re no longer in the age of apps. We’re building AI factories.
Early Wins: AI That Works (Mostly)
Let’s not be cynical – AI has real, tangible wins:
Healthcare: Lark Health’s AI nurse exchanged 400 million messages in one year, the equivalent workload of 15,000 human nurses. Empathetic? Sort of. Scalable? Definitely.
Customer Support: Uniphore and Hume AI are pioneering tone-aware interactions that sound (almost) like they care. Why does this matter? Because in the customer service industry, a robotic tone means a cancelled subscription.
Enterprise Workflows: GitHub Copilot, Atlassian Rovo, and ServiceNow’s Now Assist are now doing your admin work better and faster than your intern. And with fewer coffee breaks.
Yet, in true tech form, we still haven’t cracked the most human layer: emotion.
Emotional Intelligence: Still AI’s Missing Layer
AI may ace your SAT, but it still doesn’t know whether you are angry or just British.
Despite generative models scoring 81% on emotional recognition tests, outperforming humans at 56% (Schlegel, Sommer, & Mortillaro, 2025), they often fail miserably in the real world. Why? Because real-world emotion isn’t just a box-ticking exercise. It’s context, nuance, culture, and a healthy dose of sarcasm.
Startups like Hume AI are working on empathic LLMs that adjust tone and timing based on voice and facial expression. Ellipsis Health uses voice biomarkers to assess mental health. And Soul Machines? They’re building digital avatars that can smile and frown. Progress, yes. But still uncanny valley territory.
A McKinsey analysis released in January 2025 suggests that companies that prioritise emotion outperform peers by 85% in sales growth (p. 7). According to a study published in JAMA Internal Medicine, ChatGPT responses to patient questions on Reddit’s r/AskDocs were rated empathetic or very empathetic nearly ten times more often than responses from actual doctors (45.1% vs. 4.6%). (No offence, Dr. Bob.)
Victory for the Humans? (Sort of)
Let’s call it: the first set goes to the humans.
Despite all the marvels, Agentic AI is still in nappies. Morgan Stanley suggests we are only at Level 3 of 5 on the autonomy spectrum. Yes, Salesforce’s Agentforce can reconcile a 100-page invoice, but when things get messy, humans still take over.
The challenges? Models are probabilistic, not deterministic. They don’t fail predictably. Observability is now the next big thing. MVP suggests that there is a $64bn opportunity to monitor and debug LLMs like we once did microservices (July 22, 2025, p. 3). Think Snowflake meets therapy.
Even OpenAI admits many models train on synthetic data – a bit like learning to drive by watching Fast & Furious. So, while AI does more, we humans still do the context, the ethics, and the subtle art of not offending clients.
The Future: When AI Gets Street Smart
So, when will AI become street-smart?
Probably not until it can:
Tell the difference between passive aggression and politeness.
Read the room before sending a follow-up email.
Understand that “per my last email” is corporate-speak for war.
But we are getting there. Multimodal AI (combining text, voice and facial data) is closing the gap. CDO reports that Hume AI has trained on data from 30+ countries, making its models more culturally sensitive. Heart rate, tone, and even skin conductivity are all being used to increase empathy.
We are seeing emotionally aware agents in healthcare, education, and even recruitment. But here’s the rub: empathy is not just about detection. It’s about consequence. Can the machine choose the right response? That’s where AI still leans heavily on its human overlords.
Final Thought: Augmented, Not Replaced
AI isn’t replacing humans. It’s augmenting them. But it needs humans – not just for supervision, but for making sense. As investors, don’t just chase the foundational model race. That’s Red Bull and burn rates. Look at the middleware, the evaluators, the emotional layers. Look at companies building bridges between emotion and intelligence. Between reasoning and rapport.
The layer that helps AI go from intelligent to smart. That might just be the trillion-dollar opportunity no one saw coming.
Because until AI understands why you cried at the end of a Pixar movie, I’m still calling it a glorified calculator…