The Prime Minister, Anthony Albanese, has announced that the government will introduce legislation to impose a minimum age for access to social media and certain digital platforms.
This follows the release of a report commissioned by the South Australian Government examining social media use by children, including its effects on mental health, wellbeing and development.
The “Report of the Independent Legal Examination into Banning Children’s Access to Social Media” was ordered by South Australia’s Premier, Peter Malinauskas.
“The evidence is clear, social media is causing our children harm. … my intent is clear, we are going to do something about it,” said Malinauskas.
“We now have a pathway forward to implement a ban on social media platforms allowing children under the age of 14 to have accounts, and to require parental consent for 14- and 15-year-olds,” he said.
The report was prepared by the former Chief Justice of the High Court, the Honourable Robert French AC.
“The issue of protecting children from the harms of social media is one of global concern,” says French.
The report contains a proposed bill outlining a legislative framework that would ban social media for children under 14 and require social media companies to verify parental consent before allowing children aged 14 and 15 to use their platforms.
Duty on platforms
The proposed bill, the Children (Social Media Safety) Bill 2024, places a positive obligation and duty on social media platforms to restrict access to their services by children under a certain age.
It also creates a duty on providers to take reasonable steps to prevent access by children within the restricted age ranges.
The proposed bill outlines the ways in which these duties would be enforced, including payment of compensation by the provider, imposition of a civil penalty, injunctive relief and damages, among other remedies.
“The harms of social media outweigh the benefits for kids, I would say, even if this is not true of all kids,” says Dr Sacha Molitorisz, Senior Lecturer in the Faculty of Law at the University of Technology Sydney.
He acknowledges that there is substantial research on the mental health impacts of social media use on teenagers, particularly on certain minority groups and young girls.
As a parent himself, he says, “it’s a good starting position to say, right, let’s ban kids from social media. Of course, it becomes difficult to enact and to enforce. That’s where it gets tricky.”
“Many of the issues that we face today can be traced back to the use and abuse of data, and I’d strongly argue against any use of facial recognition tech to verify age, particularly if such verification is mandatory,” he says.
Given the interaction between internet usage, privacy and data, Molitorisz agrees that there is a need to regulate: “We need to switch from this idea of caveat emptor, where the user is the one considered responsible for how they engage online.”
“Let’s hold digital platforms and services much more responsible,” he says.
“The bill that has been proposed in South Australia imposes a positive obligation and a duty on social media platforms to prevent access to their services by an individual child.”
Along with colleagues including Michael Davis, also from UTS Law, Molitorisz is exploring an alternative to a blanket ban: a Digital Platforms Act that would impose a duty of care on digital platforms and services. Each service would be required to set an appropriate minimum age, approved by a regulator. Davis and Molitorisz suggest that this duty of care might involve a code of conduct covering social media services for children, developed through public consultation.
Molitorisz believes there needs to be a holistic view of our digital and online interactions because “so much of our life is lived online these days.”
Comparison with overseas jurisdictions
Australia is not the first nation or jurisdiction to contemplate online safety legislation. The European Union and the United Kingdom have implemented laws to protect users, both children and adults, online.
European Union
The Digital Services Act (DSA) was adopted in the European Union on 19 October 2022 and came into force on 16 November 2022. Services were given until February 2024 to comply with its provisions.
The DSA imposes obligations on social media platforms such as Instagram, Snapchat, TikTok and YouTube, and on search engines like Google, to be more proactive in safeguarding users’ rights and to prevent the spread of illicit or unsuitable content. It requires social media platforms to assess the effect of their services on elections, public safety, the mental and physical wellbeing of users, and gender-based violence.
The DSA also reflects obligations under international conventions by protecting the “best interests of the child” and the child’s right to protection.
Article 28 of the DSA requires platforms that can be accessed or used by minors to ensure “a high level of privacy, safety, and security of minors, on their service.”
The DSA requires service providers to “identify and assess” likely online risks for children and young people using their services, and to implement measures to mitigate those risks. Such measures include parental controls to help parents and carers screen or restrict children’s access to the internet, age verification procedures to confirm the age of users before they can use the service, and tools to help young people report abuse or obtain support.
The DSA (Recital 67) also prohibits “dark patterns”: practices on “online interfaces” that steer users into decisions or actions they may not wish to take, such as making purchases, or that make it difficult to cancel subscription services.
The DSA grants the European Commission greater powers to monitor social media platforms and service providers. Corporations that fail to comply can be fined up to 6 per cent of their annual global turnover.
United Kingdom
The United Kingdom has also taken steps to protect users online, passing the Online Safety Act 2023 (the Act) on 26 October 2023. The Act seeks to protect children and adults online by placing new obligations on social media companies and search engines, making them responsible for their users’ safety on their platforms.
Significantly, the Act imposes new duties on providers to put in place systems and processes that reduce the risk of their services being used for illegal activity, and to remove illicit material. Social media platforms will be required to enforce age limits to protect their child users, so that children have “age-appropriate experiences and are shielded from harmful content.”
Furthermore, the UK Government took proactive steps to protect women and girls, acknowledging that they are disproportionately affected by illicit online content. The Act requires services and platforms to “proactively tackle” harmful and illicit material such as harassment, stalking, controlling or coercive behaviour, extreme pornography and revenge pornography.
The Office of Communications (Ofcom) is the online safety regulator and can enforce the Act by taking action against corporations that do not comply with their duties. Corporations can be fined up to £18 million or 10 per cent of their global revenue, whichever is greater.
According to the UK Government, the Act “will make the UK the safest place in the world to be a child online.”
What about the benefits of social media?
The University of Sydney, Youth Action and Student Edge (Youth Insight) have conducted research into young people’s online safety and social media use. They found that young people use a range of platforms and apps for different purposes and to keep in touch with different family and friend circles. They also possess a broad range of skills for keeping themselves safe online, such as controlling who they connect with and changing privacy settings.
The results were published in the report “Emerging Online Safety Issues: Co-creating Social Media Education with Young People” (September 2023). The report found that while parents and young people want platforms to take more responsibility for children’s online safety and to improve transparency, “Neither parents or young people expressed a high level of confidence in their understanding of privacy data or how this was being managed by technology companies.”
The report highlighted that young people want to be consulted and to play a role in developing online safety policies and laws. There were also concerns about the impact of imposing age restrictions: participants felt there could be a loss of agency and freedoms if differences in maturity and capacity among young people were not taken into account.
The report found that culturally and linguistically diverse young people were more likely to “join social media to socialise and learn about the things going on in the world,” and young people living with a disability were more likely to join social media for entertainment.
A Joint Select Committee on Social Media and Australian Society was formed in May this year to “inquire into and report on the influence and impacts of social media on Australian Society.” The inquiry will examine a number of issues, including “the use of age verification to protect Australian children from social media.” The final report is due on 18 November 2024.