Experts unite for innovative online safety tool

Published: 23 February 2023 at 08:20

Machine learning will help detect and block child sexual abuse images in real-time

Academics at the Policing Institute for the Eastern Region (PIER), part of Anglia Ruskin University (ARU), have joined a project team which is set to launch ground-breaking technology designed to find and block images of child sexual abuse on personal devices.

Launching in March, the two-year Protech project will research, design and create an app that can be installed on the devices of individuals at risk of accessing child sexual abuse material.

What makes the app unique is its user-centred design, which employs highly accurate machine learning models to provide an effective intervention for individuals who fear they might offend against children. It will work in real time to detect and halt the viewing of criminal content before it is seen by the user.

It could prove to be a vital tool for the sustainable, long-term prevention of access to child sexual abuse content, alongside existing efforts to tackle the imagery, such as criminal investigations and the removal and hashing of images.
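
Hashing in this context means computing a digital fingerprint of a known abuse image so that copies can be recognised and removed wherever they reappear. As a rough illustration only (the hash list, file paths and function names below are hypothetical and not part of the Protech project), a simple exact-match check might look like this:

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints of known, already-identified images.
# In practice such lists are curated by hotlines like the IWF, and robust
# systems use perceptual hashes that survive resizing or re-encoding;
# a plain SHA-256 digest only catches byte-for-byte copies.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_image(path: Path) -> bool:
    """True if the file matches a fingerprint on the known-image list."""
    return sha256_of_file(path) in KNOWN_HASHES
```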

The team at PIER will work alongside researchers at the Department of Developmental Psychology at Tilburg University in the Netherlands to help design the new safety tech tool, investigating why and how offenders begin viewing sexual images of children and what could help them to stop.

The new tool will use machine learning in real time to detect child sexual abuse images and videos, and is being developed by a collaboration of EU and UK experts. The app, named Salus after the Roman goddess of safety and wellbeing, is to be created by UK technology company SafeToNet, which specialises in cyber safety, using its innovative real-time monitoring technology.

The Internet Watch Foundation (IWF), Europe’s largest hotline dedicated to finding and removing images and videos of child sexual abuse from the internet, will provide a secure environment to train and test the app’s machine-learning software to correctly detect child sexual abuse material.
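
"Training and testing" here means fitting the model on labelled examples and then measuring, on material it has never seen, how often it flags abuse imagery correctly versus how often it mislabels ordinary content. A minimal sketch of that kind of held-out evaluation, using hypothetical labels and predictions rather than any real data or the project's actual evaluation pipeline, might be:

```python
def precision_recall(labels: list[int], predictions: list[int]) -> tuple[float, float]:
    """Compute precision and recall for binary detections (1 = abuse imagery).

    Precision: of everything the model flagged, how much was genuinely illegal.
    Recall: of all genuinely illegal items, how much the model caught.
    """
    true_pos = sum(1 for y, p in zip(labels, predictions) if y == 1 and p == 1)
    flagged = sum(predictions)
    actual = sum(labels)
    precision = true_pos / flagged if flagged else 0.0
    recall = true_pos / actual if actual else 0.0
    return precision, recall

# Hypothetical held-out test set: ground-truth labels vs model predictions.
labels      = [1, 0, 1, 1, 0, 0, 1, 0]
predictions = [1, 0, 1, 0, 0, 1, 1, 0]
print(precision_recall(labels, predictions))  # (0.75, 0.75)
```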

The app will be deployed voluntarily, and users will have full knowledge of its purpose and its effect on their device.

The safety app will monitor both network traffic and images viewed on the user's screen in real time. Once installed, the app will run silently and will not require user interaction unless sexual images of children are detected and blocked.
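
The article does not describe Salus's internals, but the behaviour it outlines (classify what is about to be displayed, block it if the model flags it, otherwise stay out of the way) can be sketched as a simple screening loop. Everything below, including the classifier interface, threshold and frame source, is a hypothetical illustration rather than SafeToNet's implementation:

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Iterator

@dataclass
class ScreeningResult:
    blocked: bool
    score: float  # model's estimated probability that the frame is criminal content

def screen_frames(
    frames: Iterable[bytes],
    classifier: Callable[[bytes], float],
    threshold: float = 0.98,
) -> Iterator[ScreeningResult]:
    """Run each captured frame through the classifier before it is shown.

    `classifier` stands in for a trained machine learning model returning a
    probability; `threshold` trades off missed detections against false
    blocks of legitimate content.
    """
    for frame in frames:
        score = classifier(frame)
        if score >= threshold:
            # Block rendering and notify the user; the details are app-specific.
            yield ScreeningResult(blocked=True, score=score)
        else:
            # Below threshold: the app stays silent and the frame is displayed.
            yield ScreeningResult(blocked=False, score=score)
```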

Collaborators behind the €2m (£1.8m) project, which is funded by the European Commission, believe the tool could help stem the growing demand for child sexual abuse material online.

It will also help prevent the revictimisation of child sexual abuse survivors, who continue to suffer in the knowledge that others may still be able to view images and videos of them online.

Director of the Policing Institute for the Eastern Region, Prof Sam Lundrigan, said:

“The online abuse of children is a global challenge that needs innovative thinking in our combined efforts to respond.

“We know that academic findings through research such as ours can provide the data needed to support ground-breaking projects like this, with informed insight and evidence. This is an exciting project that we’re delighted to support, and one we hope will have a real impact, both on those at risk of offending and those who have already suffered abuse.”

The project is led by one of the largest university hospitals in Europe, Charité – Universitätsmedizin Berlin (CUB), in partnership with experts from diverse and wide-ranging fields including criminology, public health, developmental, clinical and forensic psychology, software engineering, child protection and internet safety.

Interviews will be conducted with individuals at risk of viewing sexual images of children, as well as with professionals working in prevention support.

Participants in the research study, led by PIER and colleagues, will be volunteers recruited by the project team partners who provide critical community prevention services – CUB; the UK’s Lucy Faithfull Foundation; Stop it Now Netherlands, part of the Centre for Expertise on Online Child Sexual Abuse; and the University Forensic Centre within the University Hospital Antwerp in Belgium.

Once designed, the safety intervention will be rolled out in a pilot stage in five countries – Germany, the Netherlands, Belgium, the Republic of Ireland and the UK – involving more than 50 professionals and at least 180 users over an 11-month period.

SafeToNet will gather feedback from users and professionals while the pilot is ongoing and use it to further improve and adapt the app’s software.

Part of the project will involve assessing the potential reach and impact of the intervention in Europe, taking on board recommendations from experts on how it could be effectively implemented as part of public health prevention programmes.

Director of the Institute of Sexology and Sexual Medicine at CUB, Prof Dr Klaus M Beier, said:

“The increasing consumption and distribution of child sexual exploitation material is a problem of international significance and necessitates research into user behaviour, particularly in cases not known to the legal authorities, which far outnumber those under juridical inquiry or after conviction. This has been largely neglected in the past, despite being where the potential for prevention is greatest.

“Thus, with the development of Salus, Protech also targets self-motivated and cooperative, potential or real users of child sexual abuse images who want to avoid starting or continuing consumption.”

Given the sheer scale of sexual images of children available online and the growing demand for the content, the project team believe the app and the intervention programme behind it will also help reduce the workload of law enforcement pursuing the criminals responsible for creating, distributing and, in some cases, profiting from the sale of the content.