Watchdog Urges Social Media Platforms to Strictly Enforce Australia’s Under-16 Ban

The world’s largest social media companies are not doing enough to prevent children in Australia from accessing their platforms, according to the country’s internet regulator, despite a new law that came into force late last year.

The legislation bans users under the age of 16 from accessing 10 major social media platforms. However, Australia’s eSafety regulator says it has “significant concerns” about how well companies such as Facebook, Instagram, Snapchat, TikTok, and YouTube are complying with the rules.

Australia introduced the ban to protect children from harmful content and addictive algorithms. The move is being closely monitored by other countries, including the UK, as governments worldwide explore stronger measures to safeguard young users online.

While companies such as Meta and Snap have criticized the policy, describing it as challenging to enforce, they maintain that they are making efforts to comply with the law.

In its first official report since the ban began in December, the regulator identified several poor compliance practices among major platforms. These included allowing children who previously declared themselves under 16 to later claim they were older, enabling repeated attempts to bypass age-verification systems, failing to prevent new under-16 users from creating accounts, and lacking effective reporting tools for parents to flag underage accounts.

Limited data has been released since the law took effect. In January, regulators reported that 4.7 million accounts had been restricted or removed during the first month following the ban’s introduction on December 10.

Australia’s eSafety Commissioner, Julie Inman Grant, expressed concern that some companies may not be taking sufficient action.

“While social media platforms have taken some initial steps, our compliance monitoring suggests that some may not be doing enough to meet Australian legal requirements,” she said.

The regulator, which had previously focused on monitoring the situation, has now announced plans to begin actively enforcing the law and gathering evidence of non-compliance.

Inman Grant explained that enforcement would require proof that platforms failed to take reasonable steps to prevent children under 16 from maintaining accounts, rather than simply demonstrating that some children remain active on social media.

Social media companies have responded with mixed reactions. Meta, which owns Facebook, Instagram, WhatsApp, Messenger, and Threads, said it is committed to following Australia’s rules. The company also noted that accurately determining users’ ages remains a major industry challenge and suggested that stronger age verification at the app store level could offer a more effective solution.

Snap, the developer of Snapchat, reported that it had already locked 450,000 accounts and continues to remove more accounts daily.

Despite the ban receiving widespread attention, many under-16 users still appear to be accessing social media platforms, including Facebook, Instagram, Snapchat, Threads, TikTok, X, YouTube, Reddit, and streaming services such as Kick and Twitch.

At a school in Sydney recently, many students who had used social media before the ban reported that they still had access. Some said they were never asked to verify their age, while others admitted finding ways around existing age checks.

One student claimed that out of 180 girls in her school year, only three had lost access to social media platforms.

Parents across Australia have largely supported the policy, with many saying it has strengthened their ability to refuse their children’s requests to join social media platforms.

However, critics, including technology experts and child wellbeing advocates, argue that education about online risks may be more effective than outright bans. Some also question whether the law can be realistically enforced and warn that it may disproportionately affect vulnerable groups, including rural children, disabled teenagers, and LGBTQ+ youth, who often rely on online communities for support.

On Tuesday, Commissioner Inman Grant described the reforms as an effort to reverse two decades of deeply rooted social media habits.

She emphasized that long-term cultural change will take time but insisted that social media platforms already have the technical ability to comply with the rules.

Inman Grant also highlighted the role of parents in supporting the new measures, noting that many families feel empowered by the law when managing children's online activity.

She added that major industry players may resist changes that affect their business models but reaffirmed the government’s commitment to moving forward with the policy.
