Chatbots: A way to immortalize the dead?
28 February 2022
When Microsoft secured a patent for conversational chatbots to mine data from a person’s social media history, observers wondered how it might affect the person’s intellectual property rights.
Season two of the sci-fi series Black Mirror, which premiered on Netflix almost eight years ago, had an unsettling episode about grief. The show introduced audiences to Martha, a young woman grieving the death of her boyfriend, Ash, in a car accident. At his funeral, Martha learns about a digital service that would enable her to connect with a chatbot version of her late partner. She agrees to try it, albeit reluctantly.
Interestingly, the digital service depicted on the show can now apparently exist in real life. In January 2022, Microsoft secured a patent for an app that could effectively reincarnate people as chatbots. The tech behemoth patented “conversational” chatbots modelled on a single human, living or dead.
Like the app on Black Mirror, the technology will collect data from a person’s social media posts and text messages.
Creepy, isn’t it?
Andrew Cobden, counsel at Hogan Lovells in Hong Kong, explains that the invention is directed at creating a conversational chatbot that interacts with a human user while emulating the personality of a specific person. It takes information about that person from social data (images, voice data, social media posts, electronic messages, written letters) in order to formulate responses to the human user’s input.
“The specific person being emulated could be a friend or a relative of the human user, a celebrity, a fictional character or a historical figure,” he says. “The social data is used to create a personalized chat index that contains personality information about the person. The index is used to train the chatbot to interact conversationally using the personality information of the specific person. This is used to enable the chatbot to converse in the theme or character of that specific person’s personality. For example, conversation can include verbal conversation drawing from audio samples of the specific person’s voice in addition to text communication.”
He adds that a 2D or 3D model of the specific person may also be created using images and/or video data associated with that person. The patent also covers the situation where the data in the personality index relating to that person is insufficient to enable the chatbot to respond to a certain human user’s input.
“With this, the chatbot can formulate and ask questions of the human user to deal with the missing data,” he says. “The chatbot could also be programmed to draw from other data related to other people deemed to be ‘similar’ to the specific person, or more generic databases which may contain data from a broader class of people.”
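The lookup-and-fallback behaviour Cobden describes can be sketched in a few lines. This is purely illustrative: the class, function and variable names below are hypothetical and do not reflect Microsoft’s actual implementation or any published API.

```python
# Illustrative sketch of the patent's described behaviour: answer from a
# person-specific "personality index", fall back to generic data from a
# broader class of people, and otherwise ask the user to supply the
# missing data. All names here are hypothetical.

from dataclasses import dataclass, field


@dataclass
class PersonalityIndex:
    """Hypothetical store of one specific person's conversational data."""
    person: str
    responses: dict = field(default_factory=dict)  # topic -> learned reply

    def lookup(self, topic: str):
        return self.responses.get(topic)


# Generic fallback data, standing in for the "more generic databases"
# the patent mentions for people deemed similar to the specific person.
GENERIC_RESPONSES = {"weather": "I never minded the rain."}


def respond(index: PersonalityIndex, topic: str) -> str:
    """Answer from the personal index, then generic data, else ask back."""
    reply = index.lookup(topic)
    if reply is not None:
        return reply
    if topic in GENERIC_RESPONSES:
        return GENERIC_RESPONSES[topic]
    # Last resort described in the patent: question the human user
    # to fill in the missing data.
    return f"Tell me, what did I usually say about {topic}?"


index = PersonalityIndex("Ash", {"music": "Always loved The Smiths."})
print(respond(index, "music"))    # person-specific reply
print(respond(index, "weather"))  # generic fallback
print(respond(index, "books"))    # chatbot asks a question back
```

The three calls show the three tiers in order: person-specific data first, crowd-sourced or generic data second, and a clarifying question to the user only when both are empty.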
What are its implications?
Legal implications will mainly depend on the national law of the place where the chatbot is being used, says Cobden. “The chatbot operator will need to check if the chatbot is using copyright material. For example, there could be copyright in the social data being accessed and processed by the chatbot. Any text or recorded media created by or about the specific person may attract copyright. Copyright usually subsists for a period of time – perhaps 50 or 70 years – after the death of the author, so social data of deceased people may also be covered. Subject to any fair use or fair dealing exceptions or licences provided by the terms of the social media platform, downloading and processing the social data without the permission of the author may infringe copyright in the data.”
He adds: “In addition, there could be copyright in any photos, videos and sound recordings of the specified person which the chatbot uses to formulate conversational responses.”
Cobden notes that in some countries, personality or image rights are protected so, for example, a celebrity may be able to object to their personality being used by a chatbot.
In Hong Kong and other countries with laws based on English law, a celebrity may be able to rely on the law of passing-off, which protects goodwill, to argue that the chatbot operator is misrepresenting that the celebrity has endorsed the chatbot. If the specified person is not a celebrity, this action may not be available to them.
“After death, passing-off is still available to the estate of the dead person,” says Cobden. “However, a passing-off action may fail if the celebrity has been dead for some time, on the basis that his or her goodwill has dissipated over time. In some cases, celebrities have registered trademarks for their names and, in those cases, the chatbot operator may need a licence from the trademark owner to use his or her name in particular situations.”
In other jurisdictions, such as the United States, there are specific personality rights that protect a person’s image and likeness from being commercially exploited without permission. Some states in the U.S. also recognize a post-mortem personality right. In China, a person’s image and name are protected by the Chinese Civil Code and it may also be illegal to use a deceased person’s name for commercial purposes without permission.
The challenges with this new technology
Nanki Arneja, a senior associate at Chadha & Chadha in New Delhi, says that while the patent in its broadest sense may appear legally tenable under the patent regimes of some countries, the possible use of “reincarnation chatbots” presents a variety of challenges, both legal and moral.
“The most important question of consent for use of personal data remains a matter of conjecture once a person has been put to rest,” she says. “Further, with the introduction of this technology, several issues pertaining to who shall possess the rights to reboot the digital persona of the deceased also need to be addressed. Moreover, the post-mortem rights of a person as recognized in several countries may also pose integral challenges to the implementation of the said technology. Another issue raised by legal experts pertains to the perspective of seeking profits from the dead. In case the subject technology monetarily charges a person to digitally revive their deceased loved one, it poses moral issues pertaining to commercializing the agony of individuals.”
She adds that another open issue concerns the scenario in which an emulating chatbot does not have enough data to answer on a specific topic: crowd-sourced conversational data stores may then be drawn into conversations, which may contradict the actual personality of the deceased.
“Thus, the inherent nature of the said technology may not provide individuals with authentic conversations with the departed,” she says.
Ankita Sabharwal, an associate at Chadha & Chadha, says that with several data-specific laws, such as the General Data Protection Regulation (GDPR), recognizing the “right to opt out” as an integral right of data protection, it is unclear whether the loved ones of the departed will be able to prevent them from being converted into chatbots.
“Several experts have also raised questions pertaining to the psychological effect this technology may have on grieving individuals,” she says. “Another significant issue pertains to whether these AI-powered chatbots can be equipped with elements of emotional intelligence. Since these bots are aimed at emulating individuals, both alive and dead, it is integral that they be sensitive to changes in their surrounding environment. Thus, it is still unclear whether specific changes in the environment will trigger programmed reactions of advanced artificial emotional intelligence in these chatbots, which remains integral to their functioning.”
Ranjan Narula, managing partner at RNA, Technology and IP Attorneys in Gurugram, adds that when “borrowing” a person’s personality, the patent infringement claim will assess whether a third party has, without authorization, copied the technology, and not the underlying data that would help the technology to deliver the final result.
“The use of data and other personality traits of a person without permission would raise privacy issues and personality rights claims,” he says.
Moreover, he adds that one of the major challenges would revolve around ethical issues, as the technology controls the virtual existence of a person and somewhat goes against the law of nature by creating a virtual personality of a dead person.
“Collection of data by mining the online social media platforms and other news reports and writings raises privacy issues,” he says. “Building a prototype based on the technology is anticipated to face regulatory hurdles and, once launched in the market, there is also a risk of illegal activities and data privacy issues. To deal with privacy issues, a user’s permission for the creation of a chatbot for their profile would be necessary.”
The future of chatbots
“In the case of that company’s plans, the patent does indicate that the possibilities for artificial intelligence have advanced from robots to creating virtual and interactive models of real people,” says Arneja. “If these chatbots are eventually used to immortalize the dead, it shall be path-breaking in the domain of AI development.”
For Narula, this remains a disruptive field. “This is disruptive technology which would require a number of legal and ethical hurdles to be crossed before this can be rolled out,” he says. “The news reports suggest that [Microsoft] does not have plans to build products based on the technology.”
Excel V. Dyquiangco