UPDATE: Major platforms including Snapchat, Facebook, Instagram, and TikTok have confirmed they will comply with Australia’s new social media ban for users under 16, while warning that the restrictions could push young people into “darker corners” of the internet.
As of December 10, children under the age limit will be removed from these platforms, though exemptions will apply to messaging, education, and health services, including WhatsApp and Meta’s Messenger Kids. The move has sparked heated debate at a recent parliamentary inquiry, where tech executives voiced deep concerns about the ban’s implications.
TikTok’s public policy lead, Ella Woods-Joyce, stated, “We support evidence-based sensible legislation that improves safety standards for all internet users. A ban will push younger people into darker corners of the internet where rules, safety tools, and protections don’t exist.” The warning underscores the risks of pushing teens away from established platforms.
Jennifer Stout, representing Snap Inc., voiced similar concerns, arguing that the ban could erode community trust in the new regulations. “For teens, connection and communication are strong drivers of happiness and well-being. Taking that away does not necessarily make them safer,” she said, cautioning that young users may turn to messaging services that lack safety and privacy protections.
Mia Garlick, Meta’s regional director of policy, emphasized the complexities of enforcing the ban, noting that 16 is a “globally novel age boundary.” She pointed to the difficulty of reliably distinguishing users’ ages with current technology, particularly between 13 and 16, raising doubts about the accuracy of age verification systems.
Companies that fail to comply with the ban face fines of up to $50 million; young users and their families, however, will not be penalized for accessing the platforms.
In a significant development, Greens senator Sarah Hanson-Young threatened to summon executives from TikTok, Snapchat, and Meta after the companies failed to appear at an earlier stage of the inquiry. The proposed law places the onus on platforms to “detect and deactivate or remove” accounts belonging to underage users, a requirement expected to result in the deactivation of approximately 1.5 million accounts across Facebook, Instagram, YouTube, TikTok, Threads, and X within the next two months.
In a related issue, tech giants Apple and Google have removed the app OmeTV from their stores following alarming reports that predators were using it to groom and exploit Australian children. The app, which connects users with random strangers for video chats, was flagged by eSafety Commissioner Julie Inman Grant, who stated, “This is an app that randomly pairs young children with pedophiles. This app will no longer be able to reach Australians, and they will no longer be able to make money off children’s misery.”
As the December deadline approaches, scrutiny of tech companies will intensify, and parents will be watching closely to see how the ban unfolds and affects their children’s online experiences. The future of social media engagement for teens hangs in the balance as the conversation about online safety continues to evolve. Stay tuned for further updates on this developing story.