On 5 June 2024, Attorney General Mark Dreyfus introduced the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 (the bill).
The bill amends the Criminal Code Act 1995 (Criminal Code) and focuses on the creation and “non-consensual sharing of sexually explicit material”. This includes content that has been produced or distorted by technology such as deepfakes.
As stated in the Explanatory Memorandum, “As technology advances AI and machine learning even further, the sophistication of deepfake techniques increases, making it almost impossible to detect deepfake material.
“The use of technology … to create fake sexual material poses significant risks to the Australian community, and the non-consensual sharing of this material can have long-lasting harmful impacts on victims.”
In the second reading speech, Dreyfus emphasised that “[d]igitally created and altered sexually explicit material that is shared without consent is a damaging and deeply distressing form of abuse.”
Dreyfus also highlighted that “This insidious behaviour is degrading, humiliating and dehumanising for victims. Such acts are overwhelmingly targeted at women and girls and perpetuate harmful gender stereotypes and gender based violence.”
The bill introduces new offences and penalties. It repeals section 474.17A of the Criminal Code and replaces it with a new section dealing with the use of a carriage service to transmit sexual material without consent.
Dr Carolyn McKay, Senior Research Fellow and Co-Director of the Sydney Institute of Criminology at the University of Sydney Law School says, “the bill is repealing some existing offences in the Commonwealth Criminal Code and is moving to a model based on consent and the lack of consent.”
McKay says, “one thing is interesting, [the bill] doesn’t define consent for the purposes of these new provisions… it relies on that sort of ordinary meaning of the term which seems to [come back] … to the notion of freely and voluntarily consenting to the sharing of [the sexual material].”
Under the proposed new section 474.17A, a person commits an offence if they use a carriage service to transmit material of another person that depicts, or appears to depict, that person in a sexual pose or activity, and they know the other person does not consent to the transmission of the material, or are reckless as to whether the other person consents.
An important aspect of the bill is that it specifically applies to adults and not children. Where the conduct involves children, “the existing child sexual exploitation… offences still apply,” says McKay.
Despite its name, the bill does not specifically refer to artificial intelligence or deepfakes. “… [I]t’s actually been probably sensibly drafted in a fairly broad way and just talks about technology… that would include the current technology that we have today, which would [include] … Photoshop or Photoshop-type technologies, as well as the increasing use of apps, AI apps and deepfake apps,” says McKay.
Multiple studies have demonstrated the harm and distress that can be caused by the non-consensual sharing of sexual images. “… AI is … getting to a point where many people can’t tell the difference between an AI image, a deepfake and reality,” says McKay.
“… [T]his newer legislation, perhaps is more closely aligned and more clearly focused on these emergent technologies,” she says.
In Victoria, police are investigating the creation and circulation of explicit fake images of female students at a Melbourne school.
Dr Asher Flynn, Associate Professor of Criminology at Monash University and a Chief Investigator at the Centre for the Elimination of Violence Against Women (CEVAW) says there needs to be a multifaceted response to the issue.
“[I]t is reflective of cultural and social attitudes towards women and young girls in particular. This is the objectification of them.
“It’s sending a message that, your body essentially, your image is there for me to use in any way that I want,” she says.
According to Flynn, the real issue is that there appears to be a “normalisation” of sexualised content of people without their consent, particularly women and young girls.
Another significant problem is the accessibility of the applications or tools used to generate deepfake images. “This is one of the really terrifying elements,” says Flynn. A few years ago, people needed a specific skill set to create these images, access to specific types of technology, or to pay someone else to do it. Now, anyone can download an application that allows them to create such images.
“So, I think that’s one of the scary things that it is more readily accessible, which hopefully this law will go some ways towards preventing,” says Flynn.
“[I]t sends a really clear message to the community that we take deepfake sexualised abuse seriously and there’s going to be implications for people who are using these technologies in really harmful ways.”
Flynn says that, so far, Victoria is the only Australian jurisdiction to criminalise the creation of deepfake images.
“[T]he focus of this (Commonwealth) law is on the distribution of the imagery,” she says. “[W]hat I found disappointing was that the offence of creating a sexualised deepfake wasn’t made an offence in and of itself.”
So, what causes people to create deepfake images of another person?
In 2022, Flynn and colleagues from universities across Australia and the UK conducted research into deepfakes and the prevalence of using digitally altered images as a form of sexual abuse (published in The British Journal of Criminology (2022) 62, 1341-1358). Flynn and her colleagues found that more than 14 per cent of respondents had experienced forms of deepfake and “digitally altered imagery abuse”.
“What our research found was that this kind of abuse varies widely. So, often it can be part of … directly wanting to harm or humiliate the victim or to get back at them for something. But sometimes these images are also being done in a humorous manner. For example… males who have done it to their male friend who is about to get married,” she says.
Notwithstanding the various reasons why people engage in such behaviour, under the proposed bill, there will be penalties for those who are prosecuted and found guilty.
The maximum penalties range from six years’ imprisonment for using a carriage service to transmit sexual material without consent, to seven years for the aggravated offence. However, as McKay points out, these are only “maximum” penalties, not mandatory sentences. “So, there’s still a level of judicial discretion in relation to sentencing people,” she says.
Flynn says that’s “considered quite a significant penalty in the context of other forms of offending that we have looked at.
“[I]t will be interesting to see but I think… it is sending that message that we are treating this as a serious form of sexual violence.”