
From Disney Channel to Digital Resurrection

Photo by Becky Fantham on Unsplash

Background

The newest entrant into the generative AI space is 2wai (pronounced “two-way”), a social networking app that lets users create lifelike, interactive digital avatars, called “HoloAvatars,” of themselves or other people. 2wai co-founders Calum Worthy, a former Disney Channel star, and Hollywood producer Russell Geyser announced the launch of their platform on X this past week. On its website, 2wai bills itself as “the first social network for human avatars” and markets HoloAvatar as a digital twin that “looks and talks like you, and even shares the same memories,” created with your phone’s camera and guided steps in the app. HoloAvatars run on 2wai’s proprietary FedBrain technology, which the company says processes interactions on-device to protect privacy and limits responses to user-approved data, reducing AI hallucinations.
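2wai has not published FedBrain’s internals, so the mechanics are unknown. As a rough illustration of the guardrail the company describes (answering only from user-approved material and refusing otherwise), here is a minimal sketch assuming a retrieval-gated design. Everything in it, including the names ApprovedFact, build_index, and reply, is hypothetical and is not 2wai’s actual implementation.

```python
# Hypothetical sketch of an on-device, allowlist-gated reply pipeline.
# Illustrative only; 2wai has not disclosed FedBrain's actual design.
from dataclasses import dataclass

@dataclass
class ApprovedFact:
    text: str           # a statement the user explicitly approved
    keywords: set[str]  # simple index for on-device matching

def build_index(approved_texts: list[str]) -> list[ApprovedFact]:
    """Index only user-approved material; nothing else is consulted."""
    return [ApprovedFact(t, set(t.lower().split())) for t in approved_texts]

def reply(question: str, index: list[ApprovedFact]) -> str:
    """Answer only from approved facts; decline rather than guess."""
    q_words = set(question.lower().split())
    # Rank approved facts by keyword overlap with the question.
    best = max(index, key=lambda f: len(f.keywords & q_words), default=None)
    if best is None or not (best.keywords & q_words):
        # No approved material covers this topic, so refuse instead of
        # hallucinating -- the behavior 2wai attributes to FedBrain.
        return "I don't have an approved memory about that."
    return best.text

facts = build_index([
    "We spent every summer at the lake house in Michigan.",
    "My favorite song is Moon River.",
])
print(reply("What was your favorite song?", facts))
print(reply("What do you think about politics?", facts))
```

The design choice the sketch highlights is the refusal branch: constraining generation to an approved corpus trades conversational breadth for verifiability, which is the trade-off 2wai’s marketing emphasizes.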


The founders highlight numerous uses for the platform, emphasizing its potential for creators and celebrities to serve fans and customers at any time and in any language. Worthy, himself a former child star, said he started the platform because he experienced “how challenging it is to create a meaningful relationship with fans around the world,” and that the app allows creators to “engage fans 24-7 without needing to be online all the time.”


Though these purposes may have been their intention in starting the app, the official launch video showcases a more forward-looking and arguably dystopian use case: creating avatars of deceased loved ones from just a few minutes of recorded video, voice notes, and other data. While the company promotes this as a way to preserve memories and build a “living archive of humanity,” critics have been quick to compare it to the Netflix series Black Mirror. The comparison has sparked widespread debate about the ethics of using AI to simulate interactions with the deceased, raising questions about consent, the grieving process, and the commercialization of loss.


The underlying ethical conundrum is clear: people want digital memorialization, but 2wai offers interactive conversations with the deceased at the click of a download button. This trend of “digital resurrection” is as controversial as it is fast-moving, raising the question of who, legally or ethically, can consent on behalf of the deceased.


But 2wai isn’t entering an empty field. HereAfter AI builds “Life Story Avatars” from pre-death interviews, with an emphasis on pre-mortem consent. StoryFile offered interactive videos built from recorded sessions, though the company filed for Chapter 11 bankruptcy in 2024. Replika lets users mimic the deceased via text or calls, but it faced backlash after a 2023 update “killed” personalized chatbots, and a Belgian man’s suicide that year was linked to his chats with the app.


And while no federal rules govern digital resurrection, these companies face an uphill battle through a legal gray area shaped by privacy and intellectual property law.


Legal Concerns

While digital resurrection technology has existed for years, the legal landscape surrounding it remains almost entirely unmapped. The development raises three key concerns: consent, publicity and likeness rights, and data privacy.


Consent

The underlying question behind digital resurrection is who has the right to speak for the dead. While this question may be simple, the answer is far from it.


For the living, consent is governed by privacy statutes and contract principles. For the deceased, none of these frameworks cleanly applies. This creates a legal vacuum in which companies such as 2wai rely on user-uploaded materials without verifying whether the deceased ever agreed to posthumous simulation.


On top of this, state-by-state statutory inconsistencies make the field harder to navigate. Some states, such as California, allow the holder of postmortem likeness rights (such as an executor or next of kin) to consent to digital replicas of a deceased personality’s voice or likeness. California’s AB 1836 specifically bans AI replicas of deceased performers’ voices or likenesses in audiovisual works without estate consent, with statutory damages of $10,000 or actual damages, whichever is greater. Other states treat these rights as extinguished at death. And even with consent from the holder of likeness rights, that permission may still conflict with the decedent’s pre-mortem interests: creating a digital replica of a deceased person raises the question of whether silence during life should be read as consent, or as refusal, after death.


Likeness

If consent establishes whether an avatar should exist, intellectual property and right of publicity laws govern who can profit from it. And this is where 2wai steps into an extremely fragmented legal landscape.


The right of publicity, which protects a person’s name, likeness, voice, and other aspects of their persona, is recognized in over half of US states. California and Tennessee, for example, provide strong protections over an individual’s right to control the commercial use of their name, photograph, and likeness, including in AI recreations. If a person dies domiciled in Idaho, however, commercial use of their likeness is fair game. This creates forum-shopping incentives and exposes companies like 2wai to unpredictable multi-state risk.


And these concerns barely scratch the surface of the biggest likeness issue: what happens when a user uploads content of a well-known celebrity and asks 2wai to generate that celebrity’s HoloAvatar. Because courts police commercial uses of celebrity personas especially closely, 2wai could face liability for allowing users to create even synthesized likenesses.


Data Privacy

Digital resurrection requires raw data. Whether photos, videos, or voice recordings, 2wai’s FedBrain technology needs some amount of material to convincingly reconstruct a person. But the path from data to avatar is almost entirely opaque.


While 2wai claims that its model processes interactions on-device to ensure privacy, the company offers little transparency about how the model is built or what sources it draws on. This matters for several reasons. First, privacy rights do not simply terminate at death; an individual’s health information, for example, is protected for 50 years after death under HIPAA. Second, extracting biometric identifiers (such as voiceprints and facial geometry) from uploaded materials subjects 2wai to biometric privacy statutes such as Illinois’s Biometric Information Privacy Act (BIPA). Finally, uploaded materials will inevitably contain third-party data: a video of a deceased father may also capture the voices and faces of siblings, spouses, or minors, each of whom has independent privacy rights. The person uploading that data cannot unilaterally waive those rights for others, living or deceased.


Without a clear federal privacy framework governing posthumous data use, 2wai must navigate a patchwork of state laws and common-law duties. And because training data often cannot be disentangled once incorporated into a model, a single improper upload can contaminate the entire system.


Business 

Photo apps, AI companions, and memorialization websites have already proven that grief is a powerful market driver. But monetizing grief is always fraught. Whether 2wai intends its user base to be celebrities engaging with fans or grieving family members memorializing loved ones, it risks appearing exploitative. Public reaction since launch has been volatile, with people calling the app “demonic” and “nightmare fuel.”


That reaction hasn’t appeared to stop 2wai, however. The company currently offers the app as a free beta on the App Store, with plans to shift to a subscription model in the future.


In a field defined by ethical gray zones, 2wai could make clarity its competitive differentiator. By implementing strict consent protocols, transparent data-use disclosures, and opt-in guardrails that go beyond current legal requirements, 2wai may yet set the standard for digital memorialization rather than get caught in the gray zone itself.


*The views expressed in this article do not represent the views of Santa Clara University.
