By Lorraine Finlay and Patrick Hooton

Snapshot

  • There is strong and increasing evidence that foreign actors are actively co-ordinating social media interference operations in Australia.
  • The Senate Select Committee on Foreign Interference through Social Media made 17 recommendations to better regulate the transparency of social media platforms.
  • Heightened risks arise from social media platforms linked to authoritarian governments, with these platforms requiring greater scrutiny to ensure that Australian data and security are protected.

The Senate Select Committee on Foreign Interference through Social Media (‘Committee’) released its final report (‘Final Report’) earlier this month, concluding that foreign interference is now Australia’s principal national security threat.

The Committee’s terms of reference were broadly to consider the risks posed to Australian democracy and values by foreign interference through social media, and how to mitigate any identified harms.

The Final Report follows significant controversy surrounding the collection of data by TikTok and its parent company, ByteDance. The Final Report was informed by submissions and evidence provided to the Committee, and the recommendations could (if implemented) result in significant transparency obligations being imposed on social media platforms.

The inquiry

Over the past two decades, social media has become an integral part of everyday life; people use it to post updates about their lives, connect with friends and family, access news and information, and promote causes they are passionate about. As of 2022, it was estimated that approximately 82.7 per cent of the Australian population had active social media accounts.

Although there are undoubtedly positives from the widespread usage of social media, there are also significant concerns. One example is the growing evidence that social media platforms are being used by state and non-state foreign actors to conduct interference operations – impeding democracy and undermining human rights.

The Australian Human Rights Commission (‘Commission’) has actively promoted the importance of ensuring that human rights are put at the heart of all technology. In particular, the Commission made a submission to the Committee highlighting the importance of ensuring that social media does not become a tool for foreign interference.

However, before discussing the Committee’s findings, it is important to understand how social media is being used to conduct foreign interference operations.

Foreign interference operations

Given the indispensable nature of social media in the modern world, and the fact that effective regulation has lagged behind the development of new technologies, it is unsurprising that foreign entities have correctly identified social media as an effective and inexpensive tool that can be used to conduct interference operations. Such operations are often aimed at unduly influencing geopolitics, achieving strategic objectives and potentially undermining democratic processes and human rights.

These operations via social media are on the rise in Australia. In its 2020-21 Annual Report, the Australian Security Intelligence Organisation (‘ASIO’) stated that espionage and foreign interference have supplanted terrorism as Australia’s principal security concerns – as digital environments remain the ‘most pervasive vector for espionage’ and ‘[m]ultiple foreign governments are determined to interfere in Australia’s democracy and undermine our sovereignty’.

AI-driven interference

What is especially concerning is the role that artificial intelligence (‘AI’) plays in these operations, as coordinated inauthentic behaviour (‘CIB’) is regularly employed to influence online users. CIB generally refers to coordinated efforts to manipulate public debate for strategic reasons, with fake accounts central to the endeavour.

CIB is an effective tool in driving interference operations, as AI-generated engagement on social media can instantaneously generate ‘comments’ on news articles, forums or social media posts that push a certain narrative or agenda. This kind of CIB has been a key element of the operations of Russia’s Internet Research Agency, a St Petersburg-based ‘troll farm’ that was provided a monthly budget of US$1.25 million to interfere with the 2016 U.S. presidential election.

Such manufactured engagement can ‘trick’ social media and search engine trending algorithms by effectively spamming a topic through CIB (Hannah Smith & Katherine Mansted, Weaponised Deep Fakes). This has been an effective strategy; one example was its alleged use in 2022 to drown out online acts of defiance against China’s COVID-19 lockdowns (Stuart Thompson, et al., ‘How Bots Pushing Adult Content Drowned Out Chinese Protest Tweets’).

Senate Committee findings

Social media has become a new digital battleground as foreign actors endeavour to improperly interfere in decision-making processes and unduly influence people and nations across the globe.

Risks to democracy and Australian values led the Committee to outline 17 recommendations in the Final Report. Broadly speaking, these recommendations focused on reforms designed to:

  • impose transparency obligations on social media platforms;
  • curtail the usage of TikTok and WeChat in certain circumstances;
  • audit and map the security risks of social media platforms;
  • designate an entity to counter cyber-enabled foreign interference;
  • reform laws to strengthen Australia’s national security;
  • counter and disrupt AI-generated disinformation and foreign interference campaigns;
  • produce educative guidance materials to better inform the public on how to critically examine content online; and
  • promote digital literacy in the Indo-Pacific region.

Key recommendations: Transparency obligations

One of the key recommendations to emerge from the Committee was the introduction of a requirement that all large social media platforms operating in Australia meet minimum standards for transparency.

These obligations would require that all large social media platforms:

  • proactively label state-affiliated media;
  • be transparent about the content they censor or take down;
  • disclose any government directions they receive about content on their platform;
  • disclose cyber-enabled foreign interference activity;
  • disclose any takedowns of CIB and how/when the CIB was identified;
  • disclose any instances where a platform removes or takes adverse action against an elected official’s account;
  • disclose any changes to their platform’s data collection practices or security protection policies as soon as reasonably practicable;
  • make their platform open to independent cyber analysts and researchers to examine foreign interference;
  • disclose the countries in which they have employees who could access Australian data, and provide information about data collection and usage; and
  • maintain a public library of advertisements on their platform.

These requirements would place serious obligations on social media platforms and, if the recommendations are adopted, would likely require them to make substantial changes to the way their platforms operate. However, such reforms are necessary to ensure an active and firm response to the risks posed by foreign interference.

In recommending these transparency obligations, the Committee was particularly critical of TikTok (whose parent company is ByteDance). The Committee claimed TikTok had attempted to obfuscate, and avoid answering questions about, ByteDance’s relationship with the Chinese government.

TikTok’s conduct during the inquiry was cited as just one of the reasons the Committee recommended that, should a social media platform fail to meet the enforceable transparency standards, it should face fines.

Repeated failure to comply with the transparency requirements could also, as a last resort, result in a social media platform being banned in Australia by the Minister for Home Affairs. Any ban would be made via a disallowable instrument and would need to be reviewed by the Parliamentary Joint Committee on Intelligence and Security.

This ‘last resort ban’ builds on other recommendations, including that, due to the espionage and data security risks posed by TikTok, the government broaden its existing ban of the app. As of April 2023, TikTok was banned on government devices. The Committee’s recommendation goes further and seeks to ensure that the app is also banned on the devices of government contractors with access to government data.

Key recommendation: Ensuring an Australian presence

Another recommendation influenced by TikTok and WeChat’s conduct during the inquiry was that all large social media platforms must have a direct Australian presence.

The Committee cited evidence that China’s 2017 National Intelligence Law means the Chinese government can legally require companies to cooperate with intelligence agencies. In the Commission’s submission, it was noted that there have been allegations that ByteDance used TikTok to track the physical location of multiple Forbes journalists who were reporting on the company as part of a covert surveillance campaign. This followed an earlier investigation by BuzzFeed News which concluded China-based TikTok employees had access to U.S. user data and repeatedly accessed that data.

The Committee heard evidence that China-based employees can access, and have accessed, Australian user data – however, TikTok was unwilling to provide the Committee with further evidence on this question. This unwillingness was a point of tension for the Committee, as social media companies based in foreign countries, including China, were reluctant to engage with the Australian inquiry. TikTok was reluctant to provide witnesses and was (in the Committee’s estimation) evasive in its answers, while WeChat refused to appear before the Committee.

The Committee strongly recommended that social media companies which operate in Australia must establish a physical presence in the country to ensure they are accountable under Australian laws.

Key recommendation: Divestment

Following the U.S. government’s inquiries into ByteDance, the Committee recommended that, should the U.S. force ByteDance to divest its ownership of TikTok, Australia consider imposing similar requirements.

A new era of tech-regulation

These proposed obligations would require significant action on the part of social media platforms, but this is necessary to address the serious risks that have been identified. The Committee’s findings do not seek to curtail the usage of social media by everyday people – but rather by foreign actors who seek to assert their political objectives to the detriment of Australia and our democratic values.

The tech industry, and its Silicon Valley wunderkinds, have for too long proclaimed that self-regulation is the panacea for the harms of social media. The Committee’s findings demonstrate that self-regulation alone is not, and never was, the solution.



Lorraine Finlay is the Human Rights Commissioner, and Patrick Hooton is the Human Rights Advisor (Business and Technology), at the Australian Human Rights Commission.