Investigation into tackling deepfakes must go further and remove ‘nudify’ apps

Creating and sharing fake sexual images could soon become much harder as an online watchdog moves against problematic apps that predominantly target women.
The generation of “deepfakes” is growing rapidly thanks to artificial intelligence (AI).
Deepfakes involve digitally editing images of a person or their body: using AI, a nude image can be generated from an ordinary photo, or a person’s face can be added to pornographic material.
Several experts outlined the potential dangers during a parliamentary hearing on Tuesday.

Nicole Lambert of the National Association of Services Against Sexual Violence says young people have taken their own lives after being targeted with deepfake material.

Call for tougher penalties for tech companies

Rachel Burgin of Rape and Sexual Assault Research and Advocacy outlined a survey of perpetrators, many of whom said criminal sanctions would be the greatest deterrent to committing abuse.

“What we do in Australia in terms of prevention is not working. That’s why more than 50 women have been murdered by men this year,” she told the committee.

Deepfakes were often accompanied by doxxing, where personal information was shared online.
People feared for their safety as a result, as sexual violence was a precursor to murder, she said.
Abuse survivor Noelle Martin criticized tech companies and social media platforms for failing to remove such material from their sites and search results, noting that billions of people had visited the top 40 websites hosting non-consensual nude images.

She called for heavy fines and possible criminal liability.

Urging removal of ‘nudify’ apps

The eSafety Commissioner said she would welcome the power to remove apps whose primary purpose is to “nudify” images of women or to create synthetic child sexual abuse material.

“Some may question why apps like this are allowed to exist at all, given that their primary purpose is to sexualize, degrade, demoralize and denigrate women, and to create child sexual abuse material,” Julie Inman Grant told the hearing.

“These apps make it easy and costless for the perpetrator, while the price for the target is sustained and incalculable destruction.”
The use of deepfakes to control women in abusive relationships was also investigated.

“People create and recreate images of their partners as a way of exerting control over them in a context of domestic violence,” Burgin said.

Women often targeted

The study found that more than 96 percent of deepfakes targeted women.
According to law professor Rebecca Delfino, in one 2020 incident more than 680,000 women had nude images of themselves generated and shared by an AI bot.

A cybersecurity firm that monitored deepfake videos found that 90 to 95 percent of the videos were non-consensual porn, Delfino said.

'It was all just abuse': How pornography fuels violence

American singer-songwriter Taylor Swift became the most high-profile celebrity yet to be targeted, when sexually explicit deepfakes of her circulated widely online in January 2024.
In the same month, Victorian Animal Justice Party MP Georgie Purcell had her image digitally altered in a news broadcast, which the network claims was “accidentally modified by Photoshop.”

Proposed criminal sanctions

The Albanese government wants to criminalize the transfer of sexual material involving adults without their consent.
The violations concern both unaltered material and content produced using ‘deepfake’ technology.

The committee heard that the changes to the law should cover the production of images and threats to produce such material.

Attorney General Mark Dreyfus argued the Commonwealth had legal limits on what it could tackle, but Michael Bradley, managing partner at Marque Lawyers, believed the government had the authority to expand the bill.
“In the area of terrorism, there are some pretty broad crimes that criminalize accessing and creating content, so I don’t think it’s that big of a challenge,” Bradley said.
The committee will report by 8 August at the latest.
If you or someone you know is experiencing sexual abuse, call 1800RESPECT on 1800 737 732. In case of emergency, call 000.
Readers seeking crisis support can contact Lifeline on 13 11 14, the Suicide Call Back Service on 1300 659 467 and Kids Helpline on 1800 55 1800 (for young people under 25). Further information and support relating to mental health is available from Beyond Blue on 1300 22 4636.

