Online grooming: 'My modelling job turned out to be a gang rape trap'
"I thought I was going to a modelling job, but instead I was lured into a gang rape trap."
Kelly, not her real name, was 17 when she was approached on Instagram by a woman who posed as a modelling scout.
She was invited to meet for a video shoot in central London, but when she arrived, a man she had never met was waiting for her.
"He told me he was the manager and took me to an upstairs apartment where he forced himself on me," she said.
"The modelling scout I had been talking to online then came in half an hour after with condoms. It was clear that it all had been planned out between them."
Kelly's story comes after 1,200 online grooming crimes were recorded by police across the UK during the April to June lockdown period.
About 100 of those were recorded in London while Instagram, owned by Facebook, was noted as the platform of choice for grooming, continuing a trend highlighted by the NSPCC in 2019.
'Locked myself in the toilet'
"I heard the lady on the phone to other men saying 'she's here'. It was obvious that they had planned for other men to come to gang rape me," Kelly said.
Kelly, thinking on her feet, told her attacker she was on her period and needed to visit the pharmacy to get tampons.
"I managed to convince him to let me go to the bathroom in a nearby cafe where I explained to a staff member what was happening. She told me to lock myself in the toilet while she called the police."
The Internet Watch Foundation (IWF) has said it and its partners blocked at least 8.8 million attempts by UK internet users to access videos and images of children suffering sexual abuse during lockdown.
The National Society for the Prevention of Cruelty to Children (NSPCC) warned the pandemic had created a perfect storm for online offenders and believed these figures could mark the start of a surge in online grooming crimes.
A Facebook spokesperson said: "We don't allow grooming on our platforms and we work closely with child protection experts and law enforcement to keep people safe.
"Using industry-leading technology, 96% of the child exploitation content we remove is taken down before it's reported to us."
But Mared Parry, another victim of grooming, said she thought more should be done to protect children online.
Mared, from Wales, but now living in London, was 14 when she was groomed online by men who manipulated her into sending sexual images of herself.
"When I was 14, I was just living my life in school as you do and I started to talk to a few older guys on Facebook.
"One thing led to another and it got a lot more sinister without me noticing. They would ask for pictures or for me to talk dirty with them.
"They would tell me that's what mature girls their age would do and I just wanted to match up to that."
A decade later, she has been campaigning for better tools to help report sexual harassment on Instagram and other social media platforms.
"There's no box for sexual harassment in the complaint section on Instagram. There are boxes for nudity, hate speech, and bullying etc.
"But there is not a box to click if a person is sexually harassing you or doing dirty underage sex stuff. There's never a box for that."
A Facebook spokesperson said: "We have a content and security team of over 35,000 people investigating reports from our community and working to keep our platforms safe.
"Our teams also work closely with child protection experts and law enforcement, reporting content directly to specialists."
The NSPCC has called for the government to push ahead with the proposed Online Harms Bill within 18 months.
It could see internet companies fined if they failed to tackle the spread of harmful content and behaviour on their sites.
NSPCC chief executive Peter Wanless said: "Child abuse is an inconvenient truth for tech bosses who have failed to make their sites safe and enabled offenders to use them as a playground in which to groom our kids.
"Now is the time to get regulation done and create a watchdog with the teeth to hold tech directors criminally accountable if their platforms allow children to come to serious but avoidable harm."