AI-generated child sexual abuse material has reached a "tipping point" as the images become easier to create, a watchdog has warned.
The Internet Watch Foundation (IWF), a UK-based watchdog that works to stop child sexual abuse online, said there had been a significant increase in the AI-generated illicit content it had seen on the open web over the past six months. This means the imagery can now be accessed by the general public without the use of specialized browsers.
"Recent months show that this problem is not going away and is in fact getting worse," IWF's interim chief executive Derek Ray-Hill said, as quoted by The Guardian. He also said the situation is now at its "tipping point" where authorities are unsure whether the materials are only generated by AI or are actual photos of children needing help.
It is important to note that the AI models behind these images are trained on existing sexual abuse imagery, which is what allows them to generate such realistic output.
How Are Child Sex Abuse Images Getting on the Open Web
The IWF said eight out of 10 reports of AI-generated imagery came from members of the general public who found the illicit content on publicly accessible sites, including forums and AI galleries.
Additionally, the watchdog said more than half of these images were hosted on servers in Russia and the United States. A large amount of AI-generated child sex abuse imagery was also found on servers in Japan and the Netherlands.
What Is the Situation in the US
In the United States, federal prosecutors are stepping up their pursuit of suspects who use AI to generate illicit material, including images depicting child sex abuse. So far this year, the Department of Justice has brought two criminal cases against individuals accused of creating child sex abuse material using generative AI. The department further noted that there are "more [cases] to come," per Reuters.
The Justice Department's statement comes after authorities in Texas arrested 55-year-old Daril Martin Gonzalez, who worked as a janitor for Anson ISD, according to KATU 2. Officials said Gonzalez, who also moonlighted as a school sports and cheerleading photographer, used photos of middle and high school students to generate sexually explicit videos.