With thousands of teenagers due to be removed from social media this week, questions remain around how the legislation is being implemented and what the impact will be on young people.
From 10 December, social media platforms will be required to take reasonable steps to ensure people under the age of 16 do not hold an account, with fines of up to $49.5 million for systemic breaches.
While the measures mean people under 16 will be prevented from holding an account, they will still be able to view content available in a logged-out state.
The new rules were introduced through an amendment to the Online Safety Act 2021, known as the Social Media Minimum Age (SMMA) bill.
The bill has been described by the federal government as “a landmark measure that will deliver greater protections for young Australians during critical stages of their development”.
Professor of Law at Western Sydney University, Elizabeth Handsley, who is also President of Children and Media Australia, says young people face a range of dangers online.
These dangers include exposure to violent, extremist, or pornographic content, but they also relate to interactions with other users.
These can range from harassment, bullying and trolling by peers to blackmail, scams, radicalisation and grooming by older users.
Perhaps the biggest danger, she says, is the persuasive design of social media platforms, leading to overuse.
Without commenting on how effective the SMMA legislation is likely to be in directly protecting young people from these dangers, Handsley says the law has an important role in education.
“The amendment represents a very strong statement, to parents and young people as well as the companies who are bound by it, that social media is not a good place for children of this age,” Handsley says.
“It may take some years before we see the impacts of this statement, but we know that parents are glad that they will be able to point to the existence of the legislation when setting boundaries within their own families.”
Despite this, there has been some resistance to the legislation.
High Court challenge to the SMMA
On 27 November, a challenge was filed in the High Court of Australia against the legislation.
The challenge is being led by the Digital Freedom Project (DFP) – a group that has a stated objective to promote and protect the participation of young Australians in public affairs and communication.
Alongside the DFP as part of the challenge are Macy Neyland and Noah Jones, both 15 at the time the submission was made.
Last week, the court agreed to hear a special case between the group and the Commonwealth as early as February 2026.
Neyland, Jones and the DFP argue that the SMMA legislation is unconstitutional because it impinges on the freedom of political communication for young people “who exercise their freedom to engage in communication on political and governmental matters by registering accounts on social media services”.
“The exercise of this freedom of political communication by these citizens is necessary for their education in political and governmental matters and in preparation for their exercise of voting rights in choosing political representatives upon them becoming entitled to vote,” their submission to the High Court reads.
“Logged-out viewing does not provide a meaningful substitute for the interactive functions.”
They are pushing to have the legislation found to be invalid or, alternatively, to have the legislation read down so it doesn’t apply to political communication.
More than a means of communication, Neyland says social media platforms are “social spaces, creative outlets and sources of information in ways that offline options often cannot match”.
“This law suggests that democracy begins at 16, which is entirely condescending and incorrect,” Neyland says.
“You cannot empower young people for democracy by removing them from participating in it.
“Imagine silencing a future Greta Thunberg.”
Handsley says this case will largely be fought on the grounds of what the constitutional freedom of political communication is designed to achieve – to ensure that people can make an informed choice when they vote in elections.
“Considering most of the people affected won’t, by definition, be voting in the next election, I think it’s going to be an uphill battle,” Handsley says.
“I also don’t buy any suggestion that social media is the only possible way that young people can access political information.
“While it might be true that young people have only ever accessed news through social media, there is every reason to believe they can and will find other sources if that’s no longer available.
“It’s our responsibility as a society to facilitate that kind of discovery.”
Confusion around what platforms are impacted
Handsley has reservations about the “bizarre” way that the government has been developing and implementing the legislation.
Only websites and apps that can be defined as “age-restricted social media platforms” – conditions for which are outlined in the bill – will be subject to the SMMA legislation.
While there is no definitive list of platforms that will be affected, the government’s eSafety website has published a list of platforms that, in its view, will be.
This list includes major players like Facebook, Instagram, Snapchat, and TikTok.
However, the website also states that eSafety has no formal role in declaring which services are age-restricted.
“In the absence of any rules made by the Minister of Communications specifying a service is either an age-restricted social media platform or not an age-restricted social media platform, any determination that a service is or is not an age-restricted social media platform is a matter for the court,” the eSafety website reads.
This ambiguity has coincided with a surge in downloads of similar social media platforms, such as Lemon8 and Yope, which have not been named on the eSafety website.
“My understanding is that the government wanted to give a clear indication to some platforms that they were definitely included,” Handsley says.
“This seems to have backfired and led some people (including maybe the non-named platforms themselves) to believe that anybody else is excluded.”
Handsley’s concern about platforms falling outside the legislation extends to those that have been formally excluded.
When the bill was introduced, the government outlined plans to make legislative rules that would exclude certain kinds of services from being covered by the SMMA bill.
These took the form of the Online Safety (Age-Restricted Social Media Platforms) Rules 2025, made on 29 July 2025, which exclude:
- Messaging, email, voice calling or video calling services
- Online games
- Services that primarily function to enable access to information about products or services
- Professional networking and professional development services
- Education and health services
“I’m deeply troubled by the exclusions,” she says.
“Ideally the government would have started with strong and sweeping restrictions and then allowed exemptions over time for platforms that showed they could behave in a responsible way.
“I’m very sad to have seen it squander this opportunity.”
The eSafety website has published a list of platforms it considers will not be age-restricted, including YouTube Kids, Roblox, Messenger, WhatsApp, and Discord.
Despite these rules, or perhaps because of the ambiguity around them, Neyland says her peers think the legislation “will be like the threat to ban TikTok that did not eventuate”.
Creating a safe environment for young people
Despite the SMMA legislation being designed to put the onus back on social media companies, young people will still be significantly affected.
The role of government and platforms in facilitating safe online fora is an area where Handsley and the DFP find common ground.
The DFP says that the broad SMMA legislation is “categorically excluding an entire age cohort” and describes it as “an oppressive, overreaching and inappropriate means to achieve the object of child protection”.
It argues that an enforceable duty of care would be a more reasonable and compelling alternative to age restrictions.
A duty of care was one of the recommendations to come out of a statutory review of the Online Safety Act 2021.
The Australian Government is now attempting to legislate a Digital Duty of Care, through the Online Safety Amendment (Digital Duty of Care) Bill 2024, currently before parliament, which aims to:
- Safeguard the wellbeing of Australian citizens, society, and democracy against the risk of misaligned digital systems and processes
- Increase transparency regarding the way in which digital communications platforms manage and inform regulators and the Australian public on their practices
- Empower users of digital communications platforms to control their user experiences, including the types of content they receive, particularly where it may be risk-laden.
While Handsley agrees that an effective duty of care would have been preferable to the SMMA legislation, “it’s not the situation we have”.
“Let’s have a digital duty of care – a strong and broad-ranging one, and not the limited one the government seems to be proposing – and see how it works,” Handsley says.
“If it’s strong enough, there might well be a case for relaxing protections for under-16s.”
Neyland points to a range of AI tools for identifying inappropriate content that platforms could employ to make online environments safer for young people.
“This approach works by still leaving us with the good, the areas we have a human right to,” Neyland says.
“But more than that, it also makes better allowances for future changes.”
Header image credit: LUKAS COCH/AAP Image
