The challenges with this new technology
Nanki Arneja, a senior associate at Chadha & Chadha in New Delhi, says that while the subject patent, in its broadest sense, may appear legally tenable under the patent regimes of some countries, the possible use of “reincarnation chatbots” presents a variety of challenges, both legal and moral.
“The most important question of consent for use of personal data remains a matter of conjecture once a person has been put to rest,” she says. “Further, with the introduction of this technology, several issues pertaining to who shall possess the rights to reboot the digital persona of the deceased also need to be addressed. Moreover, the post-mortem rights of a person as recognized in several countries may also pose integral challenges to the implementation of the said technology. Another issue raised by legal experts pertains to the prospect of seeking profits from the dead. If the subject technology monetarily charges a person to digitally revive their deceased loved one, it poses moral issues pertaining to commercializing the agony of individuals.”
She adds that another issue concerns the scenario in which an emulating chatbot does not have enough data to provide an answer on a specific topic: crowd-sourced conversational data stores may then be used in conversations, which may contradict the actual personality of the deceased.
“Thus, the inherent nature of the said technology may not provide individuals with authentic conversations with the departed,” she says.
Ankita Sabharwal, an associate at Chadha & Chadha, says that with several data-specific laws, such as the General Data Protection Regulation (GDPR), recognizing the “right to opt out” as an integral data protection right, it is unclear whether the loved ones of the departed will be able to prevent them from being converted into chatbots.
“Several experts have also raised questions pertaining to the psychological effect this technology may have on grieving individuals,” she says. “Another significant issue pertains to whether these AI-powered chatbots can be equipped with elements of emotional intelligence. Since these bots are aimed at emulating individuals, both alive and dead, it is integral that they be sensitive to specific changes in their surrounding environment. Thus, it is still unclear whether such changes can trigger programmed reactions of advanced artificial emotional intelligence in these chatbots, which remains integral to their functioning.”
Ranjan Narula, managing partner at RNA, Technology and IP Attorneys in Gurugram, adds that when a person’s personality is “borrowed”, a patent infringement claim will assess whether a third party has, without authorization, copied the technology, not the underlying data that helps the technology deliver the final result.
“The use of data and other personality traits of a person without permission would raise privacy issues and personality rights claims,” he says.
He adds that one of the major challenges would revolve around ethical issues, as the technology controls the virtual existence of a person and, by creating a virtual personality of a dead person, somewhat goes against the laws of nature.
“Collection of data by mining online social media platforms, news reports and other writings raises privacy issues,” he says. “Building a prototype based on the technology is anticipated to face regulatory hurdles and, once launched in the market, there is also a risk of illegal activities and data privacy issues. To deal with privacy issues, a user’s permission for the creation of a chatbot for their profile would be necessary.”