Visual content moderation software company Image Analyzer has been selected to receive a share of the UK Government’s Safety Tech Challenge Fund to develop new ways of detecting Child Sexual Abuse Material (CSAM).
November 19, 2021 —
AI-powered visual content moderation software company Image Analyzer has been selected for the UK Government’s Safety Tech Challenge Fund, receiving a share of the funding to develop new ways of detecting Child Sexual Abuse Material sent via encrypted channels without compromising citizens’ privacy. In partnership with Galaxkey, a content-encryption technology provider, and Yoti, an age verification technology company, Image Analyzer will develop AI-powered visual content analysis technology designed to work within existing messaging services that use end-to-end encryption.
A first-of-a-kind technology pilot
The Image Analyzer team has welcomed the collaboration. CEO Cris Pikes stated: “We are delighted to be collaborating with Galaxkey and Yoti to deliver this exciting, first-of-a-kind technology pilot that recognises the importance of protecting users’ data and privacy whilst addressing the inherent risks to children associated with end-to-end encryption.”
He went on to add that “as a ground-breaking technology collaboration, the Galaxkey, Yoti and Image Analyzer solution will enable users to access all of the benefits related to encryption whilst enabling clean data streams and offering reassurance within specific use case scenarios such as educational sharing.”
Solving the E2EE problem
The NSPCC has been a vocal commentator on the dangers that end-to-end encryption (E2EE) poses in relation to CSAM on various platforms. Messaging apps such as WhatsApp already have E2EE in place, and Facebook has announced plans to introduce it to both Messenger and Instagram. While tech companies see E2EE as a way of protecting users’ data, it makes it harder for law enforcement agencies to find evidence in child grooming cases. Because images and other data are protected by E2EE, it can be difficult to secure convictions against those sharing illegal images.
The UK Government introduced the Safety Tech Challenge Fund to address this problem, awarding five companies up to £85,000 to prototype new ways of detecting and addressing CSAM within E2EE environments. Image Analyzer is one of the recipients and has until March 2022 to deliver proofs of concept.
About Image Analyzer
Image Analyzer provides artificial intelligence-based content moderation technology for image, video and streaming media, including live-streamed footage uploaded by users. Its technology helps organizations minimize their corporate legal risk exposure caused by employees or users abusing their digital platform access to share harmful visual material. The company focuses on detecting visual threats and is a member of the Online Safety Tech Industry Association. Its technology can detect visual risks in milliseconds, including illegal content and images and videos deemed harmful to users, particularly children and vulnerable adults.
For more information, visit https://www.image-analyzer.com/.
Press Contact Details:
Jack Manley
Freestyle Digital, Oculis House, South Hampshire Industrial Park, Totton, Southampton, SO40 3SA
02380 000 212
office@freestyle.digital