Changes to online regulations in Australia will reshape how users engage with social media and search engines. Starting on December 10, 2025, individuals under the age of 16 will be barred from holding accounts on nine popular social media platforms, including Facebook, Instagram, TikTok, Reddit, and Snapchat. The legislation is part of a broader push to protect young people from harmful online content.
Following closely behind the social media ban, search engines such as Google and Bing will implement new age verification requirements beginning on December 27, 2025. The regulations require search engines to determine the age of users in order to filter out explicit content, including pornography, high-impact violence, and self-harm material. The eSafety Commissioner will oversee the implementation of the codes, which were developed in collaboration with major tech companies.
New Measures and Their Implications
The upcoming changes have drawn significant attention, particularly among teenagers, many of whom have voiced their discontent through a petition circulating on TikTok. According to Lisa Given, an information sciences professor at RMIT University, there is potential for public backlash once users grasp what the new requirements entail. “After the 27th of December, when people are waking up and realizing they need identification, they may start questioning these codes,” she said, predicting that January could bring wider debate about privacy and digital rights.
Google currently dominates the Australian search market, handling roughly 90 percent of searches, and users must be at least 13 years old to create a Google account. Under the new codes, search engines must take additional steps to shield anyone under 18 from extreme content. For users who are not logged in, the platforms must automatically blur certain images to protect younger audiences from potentially harmful material.
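The codes do not prescribe a technical implementation, but the behavior described above amounts to a simple decision rule: no reliable age signal means blurring by default, a verified age under 18 means strict filtering, and a verified adult faces no mandated restrictions. The following Python sketch illustrates that logic only; the names (`Session`, `SafeSearchLevel`, `resolve_safe_search`) and fields are hypothetical illustrations, not any platform’s actual API.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class SafeSearchLevel(Enum):
    BLUR_EXPLICIT = "blur_explicit"  # default when age is unknown: blur explicit imagery
    FILTER_STRICT = "filter_strict"  # verified under-18: exclude extreme content entirely
    UNRESTRICTED = "unrestricted"    # verified 18+: no filtering mandated by the codes


@dataclass
class Session:
    logged_in: bool
    verified_age: Optional[int]  # age established via an age-assurance check, if any


def resolve_safe_search(session: Session) -> SafeSearchLevel:
    """Pick a filtering level consistent with the rules the codes describe."""
    if not session.logged_in or session.verified_age is None:
        # No reliable age signal: apply the default protection (blurred images).
        return SafeSearchLevel.BLUR_EXPLICIT
    if session.verified_age < 18:
        return SafeSearchLevel.FILTER_STRICT
    return SafeSearchLevel.UNRESTRICTED
```

Under this sketch, a logged-out visitor resolves to the blurred default, which is the behavior the codes require of search engines when no age has been established.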
Unlike the United Kingdom, which introduced similar regulations earlier this year, Australia is applying content filtering at the search-engine level rather than regulating individual adult content sites. Given emphasized the broader reach of this approach: “Instead of targeting those sites, this really goes across a much broader swath, at the search engine level.”
Concerns Over Privacy and Data Collection
The new codes, developed by the Digital Industry Group Inc. (DIGI) alongside prominent tech firms such as Meta, Microsoft, Apple, and Google, represent a significant shift in online safety regulation. In a statement to a government committee, Jennifer Duxbury, DIGI’s regulatory affairs lead, said: “We believe these codes represent a significant and tangible uplift in Australian online safety regulation for all Australians, including young people.”
However, consumer and human rights advocates are raising alarms about the potential consequences of these measures. Critics argue that the requirement for search engines to implement “age assurance measures” may lead to invasive practices such as facial recognition, credit card checks, or reliance on third-party age verification services. John Pane, chair of Electronic Frontiers Australia, warned of the privacy implications, stating that “an incremental measure of safety must not be purchased at the cost of the fundamental privacy, anonymity and freedom of expression for an entire generation.”
The ongoing debate highlights the delicate balance between protecting young users from harmful content and preserving individual privacy rights. As these measures roll out, the Australian public will likely continue to scrutinize their effectiveness and the potential risks involved.