Given enough data, it can feel possible to keep dead loved ones alive. With ChatGPT and other powerful large language models, it is now feasible to create a more convincing chatbot of a dead person. But doing so, especially in the face of scarce resources and inevitable decay, ignores the massive amounts of labor that go into keeping the dead alive online.
Someone always has to do the hard work of maintaining automated systems, as demonstrated by the overworked and underpaid annotators and content moderators behind generative AI, and this is also true where replicas of the dead are concerned. From managing a digital estate after gathering passwords and account information, to navigating a slowly decaying inherited smart home, digital death care practices require significant upkeep. Keeping digital heirlooms alive across generations depends on the backend labor of caregivers and a network of human and nonhuman entities, from specific operating systems and devices to server farms. Updating formats and keeping those electronic records searchable, usable, and accessible requires labor, energy, and time. This is a problem for archivists and institutions, but also for individuals who might want to preserve the digital belongings of their dead kin.
And even with all of this effort, devices, formats, and websites also die, just as we frail humans do. Despite the fantasy of an automated home that can run itself in perpetuity or a website that can survive for centuries, planned obsolescence means these systems will most certainly decay. As people tasked with maintaining the digital belongings of dead loved ones can attest, there is a stark difference between what people think they want, or what they expect others to do, and the reality of what it means to help technologies persist over time. The mortality of both people and technology means that these systems will ultimately stop working.
Early attempts to create AI-backed replicas of dead humans certainly bear this out. Intellitar’s Virtual Eternity, based in Scottsdale, Arizona, launched in 2008 and used images and speech patterns to simulate a person’s personality, perhaps filling in for someone at a business meeting or chatting with grieving loved ones after that person’s death. Writing for CNET, a reviewer dubbed Intellitar the product “most likely to make children cry.” Soon after the company went under in 2012, its website disappeared. LifeNaut, a project backed by the transhumanist organization Terasem—which is also known for creating BINA48, a robotic version of Bina Aspen, the wife of Terasem’s founder—purports that it will one day combine genetic and biometric information with personal datastreams to simulate a full-fledged human being, once technology makes it possible to do so. But the project’s site itself relies on outmoded Flash software, indicating that the true promise of digital immortality is likely far off and will require updates along the way.
With generative AI, there is speculation that we might be able to create even more convincing facsimiles of humans, including dead ones. But this requires vast resources, including raw materials, water, and energy, pointing to the folly of maintaining chatbots of the dead in the face of catastrophic climate change. It also has astronomical financial costs: ChatGPT reportedly costs $700,000 a day to run, expenses that some analysts have speculated could bankrupt OpenAI by the end of 2024. This is not a sustainable model for immortality.
There is also the question of who should have the authority to create these replicas in the first place: a close family member, an employer, a company? Not everyone would want to be reincarnated as a chatbot. In a 2021 piece for the San Francisco Chronicle, the journalist Jason Fagone recounts the story of a man named Joshua Barbeau, who built a chatbot version of his long-dead fiancée, Jessica, using OpenAI’s GPT-3. It was a way for him to cope with death and grief, but it also kept him invested in a close romantic relationship with a person who was no longer alive. This was not how Jessica’s other loved ones wanted to remember her; family members opted not to interact with the chatbot.