Woman's deepfake betrayal by close friend: 'Every moment turned into porn'
![A blonde woman with a nose ring is staring at the camera. She is wearing a black and white striped t-shirt and is inside a non-descript building](https://ichef.bbci.co.uk/ace/standard/3840/cpsprodpb/3af5/live/daafdab0-e5f9-11ef-a819-277e390a7a08.jpg)
Australian woman Hannah Grundy was shocked to discover a website containing deepfake images of her
Warning: Contains offensive language and descriptions of sexual violence
It was a warm February night when an ominous message popped into Hannah Grundy's inbox in Sydney.
"I will just keep emailing because I think this is worthy of your attention," the anonymous sender wrote.
Inside was a link, and a warning in bold: "[This] contains disturbing material."
She hesitated for a moment, fearing it was a scam.
The reality was so much worse. The link contained pages and pages of fake pornography featuring Hannah, alongside detailed rape fantasies and violent threats.
"You're tied up in them," she recalls. "You look afraid. You've got tears in your eyes. You're in a cage."
Written in kitschy word art on some images was Hannah's full name. Her Instagram handle was posted, as was the suburb she lived in. She would later learn her phone number had also been given out.
That email kicked off a saga Hannah likens to a movie. She was left to become her own detective, uncovering a sickening betrayal by someone close to her, and building a case which changed her life - and Australian legal standards.
'Pure shock'
The web page was called "The Destruction of Hannah", and at the top of it was a poll where hundreds of people had voted on the vicious ways they wanted to abuse her.
Below was a thread of more than 600 vile photos, with Hannah's face stitched on to them. Buried in between them were chilling threats.
"I'm closing in on this slut," the main poster said.
"I want to hide in her house and wait until she is alone, grab her from behind and... feel her struggle."
It's been three years now, but the 35-year-old school teacher has no trouble recalling the "pure shock" that coursed through her when she and partner Kris Ventura, 33, opened the page.
"You immediately feel unsafe," Hannah tells me, eyes wide as she grips a mug of peppermint tea in her living room.
Clicking through the website Kris had also found photos of their close friends, along with images depicting at least 60 other women, many also from Sydney.
The couple quickly realised the pictures used to create the deepfakes were from the women's private social media accounts. And the penny dropped: this was someone they all knew.
Desperate to find out who, Hannah and Kris spent hours at the kitchen table, identifying the women, searching their social media friends lists for a common link, and methodically building a dossier of evidence.
Within four hours, they had a list of three potential suspects.
On it, but immediately discounted, was their close friend from university Andrew Hayler. The trio had met while working at a campus bar, and the staff there quickly formed deep friendships.
And Andy, as they called him - the supervisor - was the glue of the group.
![Four people - a woman, a man, another woman and another man - stand in a row outside a building. The second woman's face has been blurred out. They are smiling and have their arms round each other](https://ichef.bbci.co.uk/ace/standard/976/cpsprodpb/d6b1/live/1dff9ad0-e5fa-11ef-a319-fb4e7360c4ec.jpg)
Hannah, Kris and Andy (right) met while working at Sydney University's campus bar
He was considerate and affable, Hannah says - the kind of guy who looked out for women in the bar and made sure his female friends got home safely after a night out.
They all hung out regularly, went on holidays together, loved and trusted each other.
"I thought of him as a very close friend," Hannah says.
"We were just so sure that he was a good person."
But soon they'd whittled down the list to just one name: his.
Fear and delays
When Hannah woke the next morning and went to the police station, mingling with her shock and horror was a "naive" sliver of optimism.
"We thought they'd go grab him that afternoon," Kris says with a wry smile.
Instead, Hannah says she was met with disdain.
She recalls one New South Wales Police officer asking what she'd done to Andy. At one stage they suggested Hannah simply ask him to stop. Later, they pointed to a picture of her in a skimpy outfit and said "you look cute in this one", she says.
New South Wales Police declined to comment to the BBC on the specifics of Hannah's case.
But she says the way her complaint was handled made her feel like she was making "a big deal out of nothing".
"And for me, it felt quite life-changing," says Hannah.
Any faith she still held that police would help quickly dwindled.
Amid delays, she turned to Australia's eSafety Commissioner, but under its powers as a regulatory body it could only offer help in taking the content down.
![A digitally edited photo of Hannah Grundy, which has been blurred, shows her sitting on a couch naked, with chains around her wrists and neck, being pulled by a hand](https://ichef.bbci.co.uk/ace/standard/641/cpsprodpb/3a8a/live/583ef380-e5fa-11ef-a319-fb4e7360c4ec.jpg)
Andy had digitally edited and posted hundreds of photos of Hannah
Desperate, the couple hired a lawyer and commissioned a digital forensics analyst to move things along.
In the meantime, to avoid tipping Andy off and to keep themselves safe, they retreated inwards.
"The world for you just gets smaller. You don't speak to people. You don't really go out," Hannah says.
Intense fear and loneliness filled the void instead.
"We'd already had to suspend complete belief to understand that he'd done these things, so [the idea of] him actually coming to try and rape you or hurt you isn't that much of a bigger stretch."
The couple installed cameras all around their house and set up location tracking on Hannah's devices. She began wearing a health watch 24/7, so someone would know if her heartbeat rose - or ceased.
"I stopped having the windows open because I was scared... maybe someone would come in," Hannah explains.
"We slept with a knife in both of our bedside tables because we just thought: 'What if?'"
Still feeling abandoned by police, Kris had taken on the burden of monitoring the site for the slightest sign of escalation towards Hannah and any of their friends - who, to protect the investigation, still did not know anything.
Guilt ate at the pair: "We had a constant battle about whether it was right to not tell them," Hannah says.
At one point told the investigation had been suspended, Hannah and Kris forked out even more money for a detailed forensic report, and threatened to make a formal complaint to the police watchdog. All up, they spent over A$20,000 (£10,200; $12,400) trying to protect themselves and stop Andy.
Finally a new detective was assigned and within two weeks police were raiding Andy's house. He admitted everything.
Filled with relief, then dread, Hannah began calling her friends to break the news.
"My stomach just dropped," Jessica Stuart says, recalling the moment she learned what Andy had done to her photos.
![A woman in brightly patterned trousers sits on a blue sofa and stares at the camera. She isn't smiling. She is surrounded by patterned, colourful cushions and next to her are several plants](https://ichef.bbci.co.uk/ace/standard/3840/cpsprodpb/013d/live/6f98e310-e5fa-11ef-a819-277e390a7a08.jpg)
Jess says Andy had appeared "thoughtful" and "unthreatening"
"I felt really violated but… I don't think I fully comprehended."
For her, again, the sucker punch was that a friend who she loved like "family" was behind the crime. Andy had always appeared "so unassuming" and "really thoughtful" - someone she'd called for help through a difficult time.
"It's been really hard to reconcile that those two people are actually the same person."
A landmark case
The case was uncharted territory for Australia.
For at least a decade, experts have warned advances in technology would lead to a wave of AI crimes. But authorities have been caught on the back foot, leaving deepfake victims - overwhelmingly women - vulnerable.
At the time Andy was arrested in 2022, there was no offence for creating or sharing deepfake pornography in NSW, or anywhere else in Australia, and the country had never seen a case of this magnitude before.
The 39-year-old was charged with using a carriage service to menace, harass or cause offence - a low-level catch-all offence for many internet crimes - and Hannah was warned to keep her expectations low.
"We were prepared to go to court and for him to get a slap on the wrist," she says.
But she and the 25 other women who decided to be part of the case were determined Andy be held accountable. One after the other, several gave crushing statements at his sentencing hearing last year.
"You didn't just betray my friendship, but you shattered the sense of safety I used to take for granted," Jess told the court. "The world feels unfamiliar and dangerous, I am constantly anxious, I have nightmares when I am able to sleep.
"Forming new friendships feels impossible, burdened by the constant question: 'Could this person be like you?'"
When it came time for Andy to apologise to the women he'd targeted, Jess and Hannah couldn't stomach being in the room. They walked out.
"There is nothing that he can say to me that makes it better, and I wanted him to know that," Hannah says.
Andy told the court that creating the images had felt "empowering" as "an outlet" for a "dark" part of his psyche, but that he didn't think they would cause real harm.
"I have really done a terrible thing and I am so very sorry," he said.
Judge Jane Culver was not convinced of his remorse, saying while there was "some contrition", he didn't seem to understand the clearly "profound and ongoing" suffering that his "prolific" and "disturbing" offending had caused.
She sentenced Andy to nine years in jail - in what has been called a landmark decision.
"The gasp that went through the court... it was such a relief," Jess says.
"It was the first time I felt like we had actually been listened to."
Andy will be eligible for parole in December 2029, but has told the court he intends to challenge his sentence.
![Two women are hugging. They are stood outside a building](https://ichef.bbci.co.uk/ace/standard/2048/cpsprodpb/b1f4/live/9e131490-e5fa-11ef-bd1b-d536627785f2.jpg)
Hannah and Jess hug last year, after a jail sentence was handed down to Andy
Nicole Shackleton, a law expert who researches technology and gender, told the BBC the "unprecedented" case set a surprising, and significant, legal standard for future cases.
The judge had recognised "this wasn't merely something that happened online" and that such behaviour was "tied to offline violence against women", said Dr Shackleton, from Melbourne's RMIT University.
But Australia and other countries remain poor at regulating the use of AI and proactively investigating its misuse, experts like her argue.
Australia has recently criminalised the creation and sharing of deepfake pornography at a national level. But many other countries have legislation accused of containing loopholes, or do not criminalise deepfake pornography at all. In the UK, sharing it is an offence, but creating or soliciting it is not - though this is about to change.
And in the face of under-trained and under-resourced police forces, many victims like Hannah or private investigators - like the one who tipped her off - are left to be de facto detectives and regulators.
In a statement, NSW Police said investigations into AI crimes are a challenging, "resource and time intensive process", and training has recently been beefed up "with the goal that every officer... can respond to these types of crimes effectively".
The force also works with the eSafety Commissioner and tech companies to take down deepfake abuse, the statement added.
eSafety Commissioner Julie Inman Grant said removal of the distressing material is the top priority for most victim-survivors, and eSafety had "an extremely high success rate in achieving it".
But eSafety does not have the punitive powers to pursue criminal investigations and penalties, she added in a statement to the BBC.
![A man with short dark hair, wearing a pale blue plaid shirt, and a woman with blonde hair in a black and white striped t-shirt, sit on a sofa outside a building. They are staring at the camera and are not smiling](https://ichef.bbci.co.uk/ace/standard/3840/cpsprodpb/80a8/live/bc8b1170-e5fa-11ef-a819-277e390a7a08.jpg)
Hannah and Kris say they want police approaches to change
"You can have whatever laws you like, [but] if you have a police force that are incompetent..." Kris says, trailing off.
"We're obviously angry at Andy. But it is also disgusting that the only way you get justice with something like this is if you're two people in your 30s that can afford to bully the police."
They're determined for things to be different for future victims. In the past six months alone, two schoolboys in separate cases in NSW and Victoria have been reported to police for allegedly creating mass deepfake nudes of their classmates.
After several years of hell, Hannah is also trying to move on.
But Andy's looming appeal threatens the hard work she's done to rebuild her life and mental health.
Knees at her chest and feet tucked under her on the couch, she says Andy got the sentence he deserved.
"Because for me, and for the other girls, it is forever… they will always be on the internet," she says.
She still pays for a service which scours the web for the pictures, and she worries about future friends, employers, students - her own children - finding them.
One of her biggest fears is that her best memories will never be reclaimed.
"You post things on Facebook and Instagram because they're the happiest moments of your life. You get a dog, you buy a house, you get engaged and you post a photo.
"He had turned every single one of those moments for us into porn. And so when you see that photo… well, now I see myself getting raped."
If you've been affected by the issues in this story, help and support is available via the BBC Action Line.