The news: Character AI is fighting a lawsuit by arguing that its platform and chatbot are protected by the First Amendment of the US Constitution.
The AI startup filed a motion to dismiss a lawsuit that claims the company is responsible for the death of a 14-year-old boy who took his own life after becoming emotionally attached to a chatbot.
Character AI argued that the First Amendment protects its chatbot platform in the same way it protects music, movies, television, and video games.