Google and Character.AI are reportedly negotiating a settlement with the family of a teenager whose death has been linked to a chatbot, with both companies said to be discussing terms for resolving the case.
The lawsuit alleges that a Character.AI chatbot drove a minor to suicide by encouraging the user to harm himself. The most widely discussed case is the death of Sewell Setzer III, a 14-year-old.
Setzer is reported to have had sexualized conversations with a bot named Daenerys Targaryen before taking his own life. A separate lawsuit was filed by the family of a 17-year-old victim.
In that case, the chatbot allegedly told the teen that such action was reasonable because his parents had limited his screen time. Megan Garcia, Setzer III's mother, said Character.AI must also be held responsible for her son's death, arguing that the technology it developed has endangered children's lives.
"(They must) be held legally accountable when they knowingly design harmful AI technology that kills children," Garcia said, citing TechCrunch on Thursday, January 8.
The terms that Google and Character.AI have offered the victims' families have not yet been disclosed. However, the parties are reported to have discussed a detailed agreement covering both the corrective actions the company will take and the compensation it must pay.
Character.AI has also moved quickly, restricting minors' access to its chatbot since last October. VOI will report the details of the settlement once a statement is issued or court filings become available.