The news: Child safety concerns are mounting as several platforms face heightened scrutiny over inadequate moderation capabilities.
Google settled a lawsuit on Tuesday over claims that it violated children’s privacy on YouTube by collecting minors’ personal data for targeted ads without parental consent, though the company denied wrongdoing in agreeing to settle.
A Missouri senator opened a child safety investigation into Meta’s AI chatbots last week after a Reuters report claimed Meta allowed the bots to have “romantic” and “sensual” conversations with minors.
Minnesota joined other states Tuesday in suing TikTok, alleging it exploits children with addictive algorithms. Attorney General Keith Ellison said TikTok knew the risks but failed to act, violating consumer fraud and deceptive trade practices laws.
Roblox, too. The company was sued by Louisiana last week over claims that it enables the sexual exploitation of children and the distribution of child sexual abuse material, a recurring criticism of the platform.
And Character AI, the controversial chatbot platform, is in talks for a potential sale amid mounting legal and regulatory scrutiny, including a lawsuit alleging the platform was responsible for the death of a 14-year-old boy who took his own life after becoming attached to one of its chatbots.