Police and EE join forces to advise on deepfakes

Image caption: In the video, a young actor talks through the benefits, but also the dangers, of artificial intelligence (Image source: Essex Police)

A police force has said it is the first to work with phone companies to educate young people and their families about the dangers of artificial intelligence (AI).

Essex Police has launched an awareness video with EE and its parent company BT, and AI safety advice will be offered at EE stores in the county.

AI has been used to manipulate videos, images and audio so that they still appear real - creations known as deepfakes - for the purposes of online sexual abuse and spreading disinformation.

Det Insp Emma Portfleet said while AI apps could be used for "positive reasons, they also have the potential for immeasurable harm".

In the video, a young actor talks through the advantages of AI, describing it as "like magic".

The boy is later seen filming an older man using a walking frame on a high street, and he uses the technology to adapt the footage, making it appear as if the man is pirouetting like a ballet dancer.

He tells the audience that AI can be used "to spread lies and invade your privacy".

Essex Police says it deals with the consequences of manipulated AI videos on a daily basis.

Image caption: A judge said Brandon Tyler showed the "worst kind of toxic masculinity" (Image source: Essex Police)

In April, 26-year-old Brandon Tyler, a barman from Braintree, was jailed for five years for creating sexually explicit images depicting real women.

He took photos from their social media accounts and used AI to manipulate them before sharing them on websites, some of which glorified rape.

The sharing of sexually explicit deepfake images is a criminal offence under the Online Safety Act 2023, and this year the government announced the law would be toughened.

Image caption: MP George Freeman, who appeared in a fake political video, has called for action to tackle what he says is identity theft (Image source: Facebook)

The problems caused by deepfakes are varied and widespread.

The Conservative MP for Mid Norfolk, George Freeman, reported a deepfake to police in October.

The bogus video, in which a figure looked and sounded like the MP, appeared to show him announcing he was defecting to Reform UK.

Mr Freeman is calling for changes in the law to protect victims.

Det Insp Portfleet, who leads the Essex Police online investigation team, said the online and real worlds "are merging so fast and it's so hard for people to know what's fake and what's real".

"The campaign is just one way to get ahead of the problem; we will always investigate crime, but we would far rather stop it happening in the first place," she said.

Image caption: The educational video warns against "fake news" and of AI being used to invade privacy (Image source: Essex Police)

From February, EE mobile phone stores in Essex will offer dedicated appointments to families who want to learn more about AI safety.

"We understand that growing up in an online world can be difficult," said EE retail director Asif Aziz.

"We hope to help young people and their parents better navigate the online world with confidence and positivity."

The Cambridge-based Internet Watch Foundation (IWF) is responsible for finding, removing and blocking online images of child sexual abuse, including those generated with AI.

Chief executive Kerry Smith welcomed the campaign.

"The harm is real, and children feel the same shame and guilt as if it was a real photo," she said.

She said children could use the IWF and Childline's Report Remove tool to confidentially report sexual imagery of themselves that is online.

"We can take steps to get it removed as swiftly as possible."
