When Abby was in high school, instead of the everyday worries a 16-year-old should have, like passing a quiz or finishing an assignment, she was receiving threatening texts from her 18-year-old boyfriend.
Before phones and social media existed, school might have been an escape from the controlling relationship she was in.
“He would text me at all hours of the day, being like ‘What are you doing? Who are you with?’ while I was at school,” Abby said. “He was very domineering; he was quite controlling. I wasn’t allowed to talk to certain people. I wasn’t allowed to do certain things.”
Years after she ended the relationship, her ex-boyfriend showed up at the house she had since moved into in Brisbane.
He had found her through Snapchat’s location-sharing feature. Feeling obligated and unsafe, Abby went to dinner with him and realised he hadn’t changed; she hasn’t spoken to him since.
Despite the threats and stalking, Abby didn’t believe her experiences ‘fit’ with technology-facilitated abuse (TFA), because the threatening texts were things her then-boyfriend would say to her in person anyway.
“I left because one day he grabbed me by the wrist when we were talking, and he nearly broke my wrist,” she said.
According to Griffith University PhD candidate María Atiénzar-Prieto, “there are many ways in which technology can be an enabler for intimate partner violence and coercive control. We see in research and practice how abuse has evolved and taken on new forms alongside technological advancements.”

She believes technology-facilitated gender-based violence (TFGBV) is a huge issue that needs “urgent attention, given its prevalence and its impacts on victim-survivors”.
According to Safe Steps, TFGBV can take many forms but essentially refers to the use of technology to abuse, harass, coerce or stalk someone.
“People might think that TFA usually involves technology such as hidden wireless cameras or that perpetrators have advanced technological knowledge,” Ms Atiénzar-Prieto said.
“The reality is that the devices that are most commonly weaponised are those that most people use in their everyday lives, including mobile phones, social media apps, location-sharing apps or AirTags, smart home technology, AI image generators or online banking.”
In April last year, Australia saw a spike in deaths related to gender-based violence, which Prime Minister Anthony Albanese declared a ‘national crisis’.
Thousands of people across the nation took to the streets to stand up to domestic violence.
In May, the Albanese Government announced $925 million over five years to assist victims in leaving violent relationships, including payments of up to $5000.
The New South Wales Government has also introduced reforms imposing harsher penalties on anyone who breaches an Apprehended Domestic Violence Order, and has broadened the definition of stalking to include the surveillance of a person through technology such as GPS.
Coercive control has been explicitly criminalised in NSW and is expected to be criminalised in Queensland in May.
Other states are at various stages of passing and introducing legislation to make coercive control a crime.
While it is clear action is being taken to combat domestic violence in Australia, technology has become an enabler and escalator of the problem, and Ms Atiénzar-Prieto believes the government could be doing more to address the role technology plays in gender-based violence.
“Technology is not something ‘uncontrollable’, something that just evolves without us being able to do anything about the risks and harms that derive from it. People design the technology we use, and perpetrators misuse it.
“I would like governments to call to account big tech companies for their role in technology-facilitated abuse, including gender-based violence,” she said.
Legislation came into effect in September last year making it illegal to share sexually explicit deepfakes without consent.
According to Australia’s eSafety Commissioner Julie Inman Grant, “explicit deepfakes have increased on the internet as much as 550% year on year since 2019.”
The Scams Prevention Framework Bill 2025, passed on February 11, obliges providers such as digital platforms and telecommunications companies to remove scam content, including scams that use deepfake technology.
With the problem growing so rapidly, it raises the question: why was this legislation not introduced sooner?
Legislation such as the Criminal Code Amendment (Deepfake Sexual Material) Act 2024 plays a pivotal role in protecting everyone, especially groups disproportionately impacted by gender-based violence, in particular First Nations women.
According to ANROWS, First Nations women are 32 times more likely to be hospitalised due to family violence than non-Indigenous women.
Some tech companies have taken action to combat TFA; Meta, for example, is a founding member of Take It Down, a service that helps victims stop sexually explicit material from spreading online.
However, Ms Atiénzar-Prieto pointed out that “the business model and the financial interests of these tech companies are often in conflict with safety investment, hindering efforts to combat technology-facilitated abuse”.
Meta has since announced a contentious plan to end its third-party fact-checking program on Facebook, Instagram and Threads. The program has been a critical component in combating violent extremist and misogynistic content, which, without fact-checking, is likely to be pushed more frequently by algorithms.
As part of her research, Ms Atiénzar-Prieto conducted studies on how young people aged 16 to 25 perceive technology-facilitated coercive control in intimate relationships.
A total of 1012 young people were asked to rate how harmful and acceptable they believed a range of digital behaviours to be within certain contexts.
Some of these included “asking for nudes or checking a partner’s phone without their consent, depending on the frequency in which these occurred and the impact it had on the victim”.
There were some positive results indicating that most participants perceived behaviours associated with TFA to be “harmful and unacceptable”.
Results also indicated that boys and young men perceived these behaviours as more acceptable and less harmful than girls, young women and gender-diverse people did.
This type of research plays a key role in navigating and understanding how to prevent TFGBV.
Hopefully with collaboration from researchers, victim-survivors and the government, stories such as Abby’s won’t be as common in the future.
Holding big tech companies accountable is a vital step in making this change happen.
As Ms Atiénzar-Prieto noted, “safety by design principles should be central to ensure that products and services are safe from the start,” ensuring that the tools people use every day for convenience aren’t weaponised against them.