In an effort to better combat child sexual abuse material, Google has unveiled its new Content Safety API. Designed as a developer's toolkit, this AI-based service uses deep neural networks to process images, helping service providers remove such material faster and limiting how much of it human moderators must view. Like other tools of its kind, the Content Safety API uses image matching technology, but this new toolkit does so more accurately; Google says it can help reviewers identify 700 percent more child abuse content.
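One way a classifier like this reduces reviewer exposure is by triaging a moderation queue: images the model scores as most likely to be abuse material are surfaced to human experts first. The sketch below illustrates that idea only; the function name, the score values, and the queue format are all hypothetical and do not reflect Google's actual API.

```python
# Hypothetical sketch of score-based triage for a human-review queue.
# A real deployment would obtain `score` from a hosted classifier such as
# the Content Safety API; here the scores are invented for illustration.

def triage_queue(items):
    """Sort flagged items so the highest-confidence matches are reviewed first.

    `items` is a list of (item_id, score) pairs, where `score` is a
    hypothetical model confidence in [0, 1].
    """
    return sorted(items, key=lambda pair: pair[1], reverse=True)

# Illustrative queue: reviewers would see "img-002" first.
queue = [("img-001", 0.12), ("img-002", 0.97), ("img-003", 0.55)]
prioritized = triage_queue(queue)
```

Prioritizing by confidence means reviewers spend their limited time on the items most likely to require action, which is how a more accurate classifier can translate into more content identified per reviewer-hour.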
The Internet Watch Foundation, an organization focused on minimizing the availability of child sexual abuse images online, has already applauded the toolkit and says it will make the internet safer. According to Susie Hargreaves, CEO of the UK-based organization, "We, and in particular our expert analysts, are excited about the development of an artificial intelligence tool which could help our human experts."
Image Credit: Shutterstock