Charity finds more than 500,000 child abuse victims

Image caption: A floor of the IWF offices in Histon is restricted to staff members who deal with imagery

An analyst who removes child sexual abuse content from the internet says she is always trying to stay "one step ahead" of the "bad guys".

Mabel, who uses a pseudonym to protect her identity, works for the Internet Watch Foundation (IWF), a charity based in Histon, Cambridgeshire.

The IWF has a team of frontline staff who identify and remove online child sexual abuse imagery from across the world.

It has identified more than half a million victims this year, but says advances in technology are making the charity's work even more challenging.

"The bad guys always seem to be one step ahead," said Mabel, a mother and grandmother who became an analyst four years ago.

"You're always trying to uncover what they're doing next to get ahead of them."

'I can protect'

Mabel removes material either in response to reports from members of the public or after actively searching for content herself.

She said the way criminals were hiding material was "evolving all the time".

Mabel said: "You might need to watch a video for instance to find a password to unlock another video somewhere else."

The IWF has an on-site technology team that is constantly looking at new software to help crack codes on the web.

Mabel, a former police officer, said the work was difficult but that it also gave her a sense of purpose.

"The thought that I can protect my younger grandchildren from seeing this stuff, or maybe even being lured down that road, it's a huge sense of pride."

Image caption: Technology officer Dan Sexton said advances in AI were only going to make their work harder

Staff at the IWF who look at imagery go to compulsory counselling sessions once a month, and more if requested, and are given regular breaks and downtime while on shift.

No one is allowed access to emails unless they are in the office, which is closely guarded – one floor of the building is inaccessible except for authorised members of staff.

Dan Sexton, the IWF's chief technology officer, said advances in artificial intelligence (AI) were "only going to make our work harder".

He said new technologies were constantly emerging and being abused by criminals.

"With generative AI there's the capability for people creating effectively an infinite amount of new child sexual abuse (CSA) content," said Mr Sexton.

The organisation has recorded 563,590 child victims in images alone this year, most of them girls aged between seven and 10.

Mr Sexton said those figures might seem shocking to some, but did not come as a surprise to him.

"Just this year we have more than 2.7m unique images of CSA that we've come across and that is only growing," he said.

He said every image being removed was thanks to "really incredibly hard-working staff" like Mabel and the other frontline workers.

If you have been affected by these issues, the NHS has information on where to find support. Anyone with concerns about a child is advised to contact the NSPCC helpline.
