The Impact of Australia’s Social Media Ban on Minors
Australia has taken a significant step in protecting its youth from the potential harms of social media by implementing a law that bans children under 16 from using major platforms. As a result, social media companies have revoked access to approximately 4.7 million accounts linked to children in the country. This move has sparked discussions about the role of technology in young people’s lives and the responsibilities of online platforms.
The initiative was championed by Communications Minister Anika Wells, who expressed pride in the efforts made to enforce the ban. “We stared down everybody who said it couldn’t be done,” she stated, acknowledging the challenges posed by powerful tech companies. She emphasized that Australian parents can now feel more secure knowing their children are protected from harmful online environments.
The figure of 4.7 million accounts, shared by ten social media platforms with the Australian government, highlights the scale of the law’s implementation since its introduction in December. The legislation has ignited debates across the nation regarding technology use, privacy, child safety, and mental health. It has also prompted other countries to explore similar measures.
Under the new law, platforms like Facebook, Instagram, Kick, Reddit, Snapchat, Threads, TikTok, X, YouTube, and Twitch face substantial fines if they do not take reasonable steps to remove accounts of children under 16. However, messaging services such as WhatsApp and Facebook Messenger are exempt from these restrictions.
To verify the age of users, platforms can request identification documents, utilize third-party age estimation technology, or make inferences based on existing data, such as how long an account has been active. These methods aim to ensure that only eligible users can access the platforms.
According to Julie Inman Grant, the eSafety Commissioner, there are approximately 2.5 million Australians between the ages of 8 and 15. Past estimates indicated that 84% of children aged 8 to 12 had social media accounts. While the exact number of accounts across the 10 platforms is unknown, the regulator views the reported figure of 4.7 million “deactivated or restricted” accounts as a positive sign.
“We’re preventing predatory social media companies from accessing our children,” Inman Grant remarked, highlighting the importance of this measure in safeguarding young users.
All ten major companies affected by the ban have complied with the law and provided removal figures to the regulator on time. The commissioner noted that these companies are expected to shift their focus from enforcing the ban to preventing children from creating new accounts or finding ways around the prohibition.
While the figures are not broken down by platform, Meta, which owns Facebook, Instagram, and Threads, reported that nearly 550,000 accounts belonging to users under 16 were removed shortly after the ban took effect. However, Meta criticized the ban, suggesting that smaller platforms not covered by the law might not prioritize safety. The company also noted that children browsing platforms without an account could still be exposed to algorithm-driven content, the very concern that prompted the ban’s enactment.
The law has received widespread support from parents and child safety advocates. However, online privacy advocates and some groups representing teenagers have opposed it, arguing that online spaces provide crucial support for vulnerable young people or those in remote areas of Australia. Some individuals have managed to bypass age-assessing technologies or have been assisted by parents or older siblings to circumvent the ban.
Since Australia began discussing the measures in 2024, other countries have considered following suit. Denmark’s government, for instance, announced plans to implement a social media ban for children under 15. Prime Minister Anthony Albanese expressed pride in the success of the law, noting that it is being replicated globally.
Opposition politicians have raised concerns about the ease with which young people can bypass the ban or migrate to less scrutinized apps. Inman Grant acknowledged that there was a spike in downloads of alternative apps when the ban was enacted, but no significant increase in usage was observed. She emphasized that while no long-term trends have been identified yet, her office is actively engaging with the issue.
Looking ahead, Inman Grant mentioned that the regulator plans to introduce “world-leading AI companion and chatbot restrictions in March.” Although specific details were not disclosed, the move signals a continued commitment to enhancing online safety for children.
