Artificial intelligence (AI) is increasingly being used to preserve the voices and stories of the dead. From text-based chatbots that mimic loved ones to voice avatars that let you “speak” with the deceased, a growing digital afterlife industry promises to make memory interactive and, in some cases, eternal.

In our research, recently published in Memory, Mind & Media, we explored what happens when remembering the dead is left to an algorithm. We even tried talking to digital versions of ourselves to find out.

“Deathbots” are AI systems designed to simulate the voices, speech patterns and personalities of the deceased. They draw on a person’s digital traces – voice recordings, text messages, emails and social media posts – to create interactive avatars that appear to “speak” from beyond the grave. As the media theorist Simone Natale has said, these “technologies of illusion” have deep roots in spiritualist traditions. But AI makes them far more convincing, and commercially viable.

Our work is part of a project called Synthetic Pasts, which explores the impact technology has on the preservation of personal and collective memory. For our study, we looked at services that claim to preserve or recreate a person’s voice, memories or digital presence using AI.

To understand how they work, we became our own test subjects. We uploaded our own videos, messages and voice notes, creating “digital doubles” of ourselves. In some cases, we played the role of users preparing our own synthetic afterlives. In others, we acted as the bereaved trying to talk to a digital version of someone who had died. What we found was both fascinating and unsettling.

Some systems focus on preserving memory. They help users record and store personal stories, organised by theme, such as childhood, family or advice for loved ones. AI then indexes the content and guides people through it, like a searchable archive.

Others use generative AI to create ongoing conversations. You upload data about a deceased person – messages, posts, even voice samples – and the system builds a chatbot that can respond in their tone and style. It uses machine learning, a subset of AI in which systems improve as they process more data, to make its avatars evolve over time. Some present themselves as playful (“host a séance with AI”), but the experience can feel eerily intimate.
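None of these platforms publish their internals, but the archival pattern – stories tagged by theme, then indexed for browsing and search – is simple enough to sketch. Here is a minimal, hypothetical version in Python; the class names, themes and naive keyword search are our own illustrations, not any service’s actual code:

```python
from dataclasses import dataclass, field

@dataclass
class Story:
    theme: str  # e.g. "childhood", "family", "advice"
    text: str

@dataclass
class MemoryArchive:
    stories: list[Story] = field(default_factory=list)

    def add(self, theme: str, text: str) -> None:
        self.stories.append(Story(theme, text))

    def browse(self, theme: str) -> list[str]:
        # Guided browsing: return every story filed under a theme.
        return [s.text for s in self.stories if s.theme == theme]

    def search(self, keyword: str) -> list[str]:
        # Naive keyword search; a real service would more plausibly use
        # semantic (embedding-based) retrieval than substring matching.
        return [s.text for s in self.stories if keyword.lower() in s.text.lower()]

archive = MemoryArchive()
archive.add("childhood", "We spent every August at the coast...")
archive.add("advice", "Never lend money you can't afford to lose.")
print(archive.browse("advice"))
```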
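The generative services work differently: instead of retrieving stored stories, they condition a language model on the person’s own messages so that new replies come back in their voice. A rough sketch of that conditioning step, again entirely hypothetical – here chat() is a placeholder for whatever hosted model a platform actually calls:

```python
def chat(system: str, user: str) -> str:
    # Placeholder for a real LLM API call; returns a canned reply
    # so the sketch stays self-contained and runnable.
    return "I'm right here for you, always."

def build_persona_prompt(name: str, samples: list[str]) -> str:
    # Fold the person's own words into the system prompt so the
    # model imitates their tone, vocabulary and style.
    quoted = "\n".join(f"- {s}" for s in samples)
    return (
        f"You are speaking as {name}. Imitate the tone and style of "
        f"these messages they wrote:\n{quoted}\n"
        "Always reply in the first person, as they would."
    )

def reply_as(name: str, samples: list[str], user_message: str) -> str:
    return chat(system=build_persona_prompt(name, samples), user=user_message)

print(reply_as("Nan", ["See you at the coast in August x"], "I miss you."))
```

Feeding each new exchange back into the prompt is one plausible way an avatar “evolves over time” – and it would also explain why our own phrasing came back to us nearly verbatim, as we describe below.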
All the platforms claim to offer an “authentic” emotional connection. Yet the more personalisation we attempted, the more artificial it felt. When prompted by these systems, we provided more information about ourselves, only to have the bot repeat our exact phrasing back in stiff, scripted replies. At times the tone was incongruous, as when cheerful emojis or upbeat phrasing appeared in discussions of death – a clear reminder that algorithms are poor at handling the emotional weight of loss:

“Oh hun… 😔 it (the death) is not something I’d wish for anyone to dwell on. It’s all a bit foggy now, to be honest. 🌫️ Let’s chat about something a bit cheerier, yeah?”

The more archival-based tools we tested offered a calmer experience, but they also imposed rigid categories and limited nuance. As the digital media scholar Andrew Hoskins has observed, memory in the age of AI becomes “conversational” – shaped by interactions between human and machine. But in our experiments, those conversations often felt flat, exposing the limits of synthetic intimacy:

Human: You were always so encouraging and supportive. I miss you.

Deathbot: I’m right here for you, always ready to offer encouragement and support whenever you need it. And I miss you too… Let’s take on today together, with positivity and strength.

Behind these experiences lies a business model. These are not memorial charities; they are tech start-ups. Subscription fees, “freemium” tiers and partnerships with insurers or care providers reveal how remembrance is being turned into a product.

As the philosophers Carl Öhman and Luciano Floridi have argued, the digital afterlife industry operates within a “political economy of death”, where data continues to generate value long after a person’s life ends. Platforms encourage users to “capture their story forever”, but they also harvest emotional and biometric data to keep engagement high. Memory becomes a service – an interaction to be designed, measured and monetised. This, as the professor of technology and society Andrew McStay has shown, is part of a wider “emotional AI” economy.

Digital resurrection?

The promise of these systems is a kind of resurrection – the reanimation of the dead through data. They offer to return voices, gestures and personalities, not as memories recalled but as presences simulated in real time. This kind of “algorithmic empathy” can be persuasive, even moving. Yet it exists within the limits of code, and it quietly alters the experience of remembering, smoothing away ambiguity and contradiction.

These platforms reveal a tension between archival and generative forms of memory. All of them, though, normalise certain ways of remembering, privileging continuity, coherence and emotional responsiveness, while also producing new, data-driven forms of personhood.

As the media theorist Wendy Chun has observed, digital technologies often conflate “storage” with “memory”, promising perfect recall while erasing the role of forgetting – the absence that makes both mourning and remembering possible. In this sense, digital resurrection risks misunderstanding death itself: replacing the finality of loss with the endless availability of simulation, where the dead are always present, interactive and updated.

AI can help preserve stories and voices, but it cannot replicate the living complexity of a person or a relationship. The “synthetic afterlives” we encountered are compelling precisely because they fail. They remind us that memory is relational, contextual and not programmable.

Our study suggests that while you can talk to the dead with AI, what you hear back reveals more about the technologies and platforms that profit from memory – and about ourselves – than about the ghosts they claim we can talk to.