The Australian Government has released new, detailed guidance for social media platforms ahead of a landmark law that comes into force on 10 December 2025 and will require platforms to prevent children under 16 from holding accounts. The new Social Media Minimum Age (SMMA) laws place the responsibility squarely on companies to take "reasonable steps" to comply, with potential fines of up to $49.5 million for those that fail to do so.

Communications Minister Anika Wells said the new guidance clarifies the government’s expectations. “The government has done the work to ensure that platforms have the information they need to comply with the new laws – and it’s now on them to take the necessary steps,” she stated, highlighting that the new rules are designed to be “effective, private, and fair” for Australian users.

According to eSafety Commissioner Julie Inman Grant, the guidance takes a principles-based approach, acknowledging that no single solution will suit the diverse range of platforms covered. It promotes a "layered approach" to age assurance, urging companies to use a combination of systems, technologies, and policies rather than relying on any one method.

“As we work towards implementing this world-first legislation, we remain deeply engaged with industry to ensure they have all of the information they need to comply,” Inman Grant said.

The guidance outlines several key expectations. Platforms are expected to be proactive: detecting and deactivating underage accounts, communicating clearly with affected users, and preventing those users from re-registering. The guidance explicitly states that companies cannot rely solely on a user's self-declared age; instead, they should use a mix of methods, such as age estimation based on behavioural data, to improve accuracy and reduce user frustration.

A critical component of the new rules is user choice and privacy. Platforms must give users a choice of age assurance methods and an accessible way to appeal a decision if they believe they have been wrongly flagged. Notably, the guidance prohibits companies from using government-issued identification as the sole method of age verification, requiring them to always provide reasonable alternatives. The eSafety Commissioner also expects companies to adopt privacy-preserving practices, meaning they should not retain personal data from individual age checks. Record-keeping should focus on the systems used, not on data tied to specific users.

The legislation and the new guidance are the culmination of a long process that has included extensive consultation with industry, stakeholders, parents, and young people, as well as a government-led Age Assurance Technology Trial. The new laws apply to services where a “significant purpose” is to enable social interaction, and which allow users to post material and link to or interact with others. Services such as messaging apps, online games, and educational or health services are excluded from the new obligations.

The government has stressed that the legislation puts the responsibility on the platforms, not on parents or young people. “Parents, kids – indeed the entire Australian community – are relying on them to keep young Australians safer online,” Minister Wells said.

“Respect for and commitment to the rights of the child underpin our guiding principles and should be front of mind for platforms when implementing measures to meet their obligations. This will be a significant change for many young Australians, and we must be prepared to provide the right scaffolding and approaches to support them,” Inman Grant added.