A new child safety group was founded by Discord, Google, OpenAI, and Roblox in a combined effort to combat online child exploitation amid the rapid rise of AI technology.
The non-profit organization, known as Robust Open Online Safety Tools (ROOST), aims to make core safety technologies more accessible to companies. It will also provide free, open-source AI tools that can identify, review, and report child sexual abuse material (CSAM) found online.
Motivation for the Latest Effort
The companies were motivated to launch the initiative in part by the ways generative AI advancements have changed online environments. The initiative seeks to address the urgent need to accelerate innovation in online child safety.
Limited information is available about the CSAM detection tools, other than that they will use large language models and "unify" existing options for handling the content they target.
Founding ROOST partner and former Google CEO Eric Schmidt noted that ROOST's collaborative, open-source approach will foster innovation and make essential infrastructure more transparent, accessible, and inclusive. The initiative's main goal is to create a safer online environment for every person, according to The Verge.
The announcement of the child safety group comes amid an ongoing regulatory battle related to child safety on social media and online platforms. Many companies are seeking to appease lawmakers through self-regulation.
Increase in Child Exploitation Incidents
The announcement also comes as the National Center for Missing and Exploited Children (NCMEC) reports that suspected child exploitation rose 12% between 2022 and 2023. More than half of U.S. children were on Roblox as of 2022, and the company has faced repeated criticism for failing to address child sexual exploitation and exposure to inappropriate content on its platform.
Child online safety has been a prominent legislative issue since the Children and Teens' Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA) began making their way through Congress, even though both bills failed to pass in the House, Engadget reported.
ROOST has also raised $27 million in funding for its first four years of operation, with support from the Patrick J. McGovern Foundation, the Knight Foundation, and the AI Collaborative.
The initiative will primarily operate out of the Institute of Global Politics at Columbia University. It will bring together experts in various fields, including AI technology, cybersecurity, and child safety, as per Maginative.