Hannah once saw Andy as a trusted friend. Last year, she helped send him to prison.

This story discusses domestic violence and sexual harassment.

In June 2024, Hannah Grundy sat in the front row of a Sydney courtroom and locked eyes with the confessed criminal sitting just metres away.

This man, Andrew Thomas Hayler, had used images of 26 women to create disturbing 'deepfake' pornography that he then shared online.

In many cases, he had taken ordinary photos from the women's social media accounts and used editing software or artificial intelligence to superimpose their faces onto explicit content. He then uploaded these altered images to a pornographic website.

Some of Hayler's posts also featured graphic descriptions of violent sexual fantasies. Many even contained identifying details including the women's full names, occupations, the suburbs where they lived, and links to their social media profiles.

Hannah, a high-school science teacher, was amongst those 26 women. She was integral to the case against Hayler, a man whom, only two years earlier, she had believed to be a trusted friend.

Speaking to Mamamia's No Filter podcast, Hannah, 35, described being struck by how the man who betrayed her so deeply seemed comfortable meeting her gaze in court.

"If you looked at him, he didn't put his head down, he would look right back at you," she said. "There was no shame there."

What is 'deepfake' pornography?

'Deepfakes' are photos or videos that have been altered using artificial intelligence to create the impression a person did or said something they didn't do or say. They are extremely realistic, and it's often hard for the average person to tell they aren't real.

Deepfakes can be circulated online for political or ideological causes, but the most common use is for the creation of pornography.

In 2023, researchers from RMIT surveyed more than 16,000 people around the world and found that 3.7 per cent of Australian respondents had been a victim of deepfake porn as an adult — the highest proportion amongst the 10 countries they surveyed.

How Hannah helped unmask a criminal.

Hannah Grundy only discovered she too had been victimised because of an anonymous tip-off.

In late 2021 and early 2022, she received two mysterious messages in her inbox warning her that explicit photos of her had appeared online. She dismissed the emails as a scam. Then, three months later, came another.

"It had a link in it, and it said, 'The above link contains disturbing material. Beware,'" Hannah said.

Nervous about what she might find, Hannah asked her partner, Kris Ventura, to open the link. "He just completely went white. I've never seen him look so shocked," she said.

Hannah Grundy with her partner, Kris Ventura. Image: Supplied.

The anonymous email (which Hannah later learnt had been sent by a private investigator based in New Zealand) led to a forum-based pornographic website where dozens of deepfake images made it seem as if Hannah had participated in "hardcore, degrading, violent" porn.

Someone had taken snapshots of her life, moments worth documenting and celebrating, and manipulated them for their own sexual gratification.

"It was immediately obvious that someone had spent hours and hours and hours doing it," she said. "Some of them were really pathetic efforts at Photoshop, and then some of them were convincing AI."

It went beyond images, though. There was a thread called "The Destruction of Hannah".

"At the top there was a poll, and it said, 'How would you rape Hannah?' And then it had seven or six options," Hannah said. "They were very graphic. And then hundreds and hundreds of men had voted on it."

Hannah and Kris continued scrolling through the site in horror, and Hannah found other faces she recognised — friends and colleagues from a bar at Sydney University, where she had worked a decade prior. Cross-checking the posts against social media accounts, Kris and Hannah realised who was responsible: Andrew Hayler. They'd remained close friends with "Andy" since working at the bar; he'd been to their house, and they'd taken holidays together.

Hannah approached the police, but there were months without meaningful progress in the case. In the meantime, Hannah and Kris lived in a state of fear and vigilance.

"I was terrified all the time," she said. "We bought cameras for the house. I wore an Apple Watch that would tell Kris when I arrived to school or when I got home. It was summer, and I wouldn't keep the windows open, because the things that Andy had written about me were so graphic, and they seemed so real."

She said she and Kris also slept with a knife next to their bed — just in case.

Amid it all, Hannah and Kris had to maintain the appearance that everything was normal. They remained friends with Hayler on social media, and they even bumped into him at a local football event. Hannah said he was "super friendly". She said he hugged them, and he asked about work and the house the couple had recently bought together. 

This was the same man who continued to post threatening messages about her online, saying he knew where she lived and worked and that he was "closing in on the slut".

"The weird thing is that it was still just my friend Andy. You're so angry," she said. "But you look him in the eyes, and it's still the friend you've been friends with for 10 years. It did not make any sense that they were still the same person."

Months passed and, with little progress in the investigation, Hannah and Kris engaged the services of a lawyer and forensic investigator to bolster their case. Within weeks of the investigator submitting his report to police, Andrew Hayler's house was raided and his devices seized. 

On June 13, Hayler pleaded guilty to 28 counts of using a carriage service to cause offence — a broad charge that covers a variety of internet-based crimes.

Doorstopped outside of court, Hayler told an ABC reporter that he was "really sorry" for what he did to his 26 victims. "A dark part of my psychology that came out, manifested," he said.

After hearing the victim impact statements of seven women read out in the NSW District Court, Hayler took to the stand and formally apologised for his crimes. Hannah and some of the other victims chose to leave the courtroom when he spoke — she didn't want to hear his justifications or apologies.

Hayler had another opportunity to express remorse a week later as he prepared to be sentenced. While Hannah, Kris, other victims and their supporters were waiting in the foyer at the District Court, Hayler was brought right past them.

"He saw us all sitting there. I was kind of close to him. He looked at my partner, Kris, and he told Kris he was sorry," Hannah said. "I just thought it was very interesting that you pick the only male partner there to say sorry to instead of me. I was his friend. Kris wasn't even his good friend."

Andrew Hayler was sentenced on June 21, 2024, to nine years in prison, with a non-parole period of five and a half years. It was one of only a small number of deepfake pornography cases to have been heard in Australia.

Legal experts described the ruling as unprecedented.

For Hannah, it was stunning. She had been under the impression that Hayler wouldn't receive a prison term.

"When [Judge Jane Culver] said that, we gasped. And everyone cried. We had [other victims] over the other aisle, and we ran into the middle and hugged," she said. "I'll never forget it."

Hannah did take a moment to glance at Hayler, too.

"He did nothing," she said. "He had no reaction."

In August 2024, two months after Andrew Hayler's sentencing, the Australian Parliament amended the criminal code to include an offence that specifically references deepfake pornography. Those convicted of creating or sharing such content now face up to seven years behind bars.

Speaking about the new laws at the time, Attorney-General Mark Dreyfus said, "Digitally created and altered sexually explicit material that is shared without consent is a damaging and deeply distressing form of abuse.

"This insidious behaviour can be a method of degrading, humiliating and dehumanising victims. Such acts are overwhelmingly targeted towards women and girls, perpetuating harmful gender stereotypes and contributing to gender-based violence."

Hannah described the change as "so important".

"I hope it means that women or men who have this happen to them don't have to absolutely battle through to get any kind of justice," she said.

All told, Hannah spent roughly $20,000 pursuing the case. And the ordeal is not over. Kris and Hannah pay for a subscription service that searches the internet for her face, and she said the images and her name continue to pop up on other porn websites where they've been copied and shared by other users and bots.

"Police have been able to take most of them down, but they're only taking the ones down that we've found. And so it's like playing Whac-A-Mole," she said.

"It's going to be something I think I contend with for my whole life. If I have children or if I get a new job, it's always going to be something that I have to explain to people, because they will be there forever."

If this has raised any issues for you, or if you just feel like you need to speak to someone, please call 1800 RESPECT (1800 737 732) – the national sexual assault, domestic and family violence counselling service.

Mamamia is a charity partner of RizeUp Australia, a national organisation that helps women, children and families move on after the devastation of domestic and family violence. Their mission is to deliver life-changing and practical support to these families when they need it most. If you would like to support their mission, you can donate here.

Feature image: Supplied.
