March 29, 2024

Google has recently launched a new AI tool to help companies identify child sexual abuse material (CSAM). Using the company’s expertise in machine vision, it assists moderators by sorting flagged images and videos and “prioritizing the most likely CSAM content for review.” This should allow for a much quicker reviewing process. In one trial, says Google, the AI tool helped a moderator “take action on 700 percent more CSAM content over the same time period.”
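To make that triage idea concrete, here is a minimal, hypothetical sketch of how classifier scores can reorder a review queue so moderators see the highest-risk items first. The FlaggedItem type, the score values, and the prioritize function are illustrative assumptions, not Google’s actual tool or API.

```python
# A minimal sketch of classifier-assisted triage, not Google's real system.
# Assumes each flagged item already carries a hypothetical model score:
# the classifier's estimated probability that the item is CSAM.
from dataclasses import dataclass

@dataclass
class FlaggedItem:
    item_id: str
    score: float  # hypothetical model confidence, 0.0 to 1.0

def prioritize(items: list[FlaggedItem]) -> list[FlaggedItem]:
    """Order the review queue so the most likely matches come first."""
    return sorted(items, key=lambda item: item.score, reverse=True)

queue = prioritize([
    FlaggedItem("img-001", 0.12),
    FlaggedItem("img-002", 0.97),
    FlaggedItem("img-003", 0.54),
])
for item in queue:
    print(item.item_id, item.score)  # img-002 surfaces first for review
```

The design point is modest: the model never replaces the moderator here; it only changes the order in which a human sees flagged content, which is where the reported 700 percent throughput gain would come from.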

Fred Langford, deputy CEO of the Internet Watch Foundation (IWF), said the software would “help teams like our own deploy our limited resources much more effectively.” “At the moment we just use purely humans to go through content and say, ‘yes’ or ‘no,’” says Langford. “This will help with triaging.”

The IWF is one of the largest organizations dedicated to stopping the spread of CSAM online. It’s based in the UK but funded by contributions from big international tech companies, including Google. It employs teams of human moderators to identify abuse imagery, and operates tip lines in more than a dozen countries for internet users to report suspect material. It also carries out its own investigative operations, identifying sites where CSAM is shared and working with law enforcement to shut them down.

Langford says that because of the “fantastical claims made about AI,” the IWF will test Google’s new tool thoroughly to see how it performs and how it fits into moderators’ workflow. He added that tools like this are a step toward fully automated systems that can identify previously unseen material with no human interaction at all. “That sort of classifier is a bit like the Holy Grail in our arena.”

But, he added, such tools should only be trusted with “clear cut” cases to avoid letting abusive material slip through the net. “A few years ago I would have said that sort of classifier was five, six years away,” says Langford. “But now I think we’re only one or two years away from creating something that is fully automated in some cases.”
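Langford’s “clear cut” caveat maps naturally onto a confidence threshold: automate a decision only when the model is nearly certain, and route everything else to a person. The sketch below is a hedged illustration of how such routing might look; the threshold value and route function are made-up assumptions, not any organization’s actual policy.

```python
# An illustrative sketch of "clear cut cases only" automation.
# Anything below a very high confidence bar goes to a human moderator,
# so ambiguous material cannot slip through on a model's say-so alone.
AUTO_ACTION_THRESHOLD = 0.99  # hypothetical; would be tuned in practice

def route(score: float) -> str:
    """Decide whether a flagged item is handled automatically or by a person."""
    if score >= AUTO_ACTION_THRESHOLD:
        return "automated action"  # clear-cut case
    return "human review"          # ambiguous: a moderator decides

for score in (0.995, 0.80, 0.30):
    print(score, "->", route(score))
```

The trade-off is the one Langford names: raising the threshold keeps abusive material from being missed by an overconfident model, at the cost of leaving more work for human reviewers.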
