We examine the emergence of AI grief technologies, specifically systems designed to simulate the personality and interaction style of deceased individuals. We introduce the concept of Posthumous Continuity Capture, which describes how generative AI creates a “felt continuity” of a relationship by using a person’s digital remains to maintain an ongoing, live interaction.
While these tools can offer comfort through structured memorials, they also carry significant psychological and ethical risks, including emotional dependency and the commercial exploitation of mourning. We organize these dangers along a risk gradient, noting that open-ended simulations can cross the line from healthy preservation into spectral labour, in which the dead are effectively put back to work as data-driven engagement tools.
To mitigate these harms, we propose a minimum safeguard stack that includes strict consent requirements and the prohibition of advertising within memorial spaces.