11 March, 2026
Apple and Google face pressure to remove non-compliant apps

Australia’s eSafety Commissioner has indicated that both Apple and Google may be compelled to remove non-compliant applications from their stores if pornography companies fail to implement necessary age verification measures. This warning follows the introduction of new industry codes that took effect on March 9, 2026, requiring websites hosting content related to pornography, violence, self-harm, and disordered eating to ensure users are over the age of 18. Non-compliance could result in penalties of up to $49.5 million per infraction.

The regulatory changes respond to growing concerns about children’s exposure to explicit content. In a notable development, Pornhub’s Canadian parent company, Aylo, began restricting access to several of its websites for Australian users last week, citing objections to the new rules. The company has previously mounted similar protests in the UK, France, and 23 US states. Users attempting to access Pornhub in Australia are currently redirected to “safe for work” content, which includes skits and podcasts.

While both Apple and Google have official policies against the distribution of pornography in their app stores, the eSafety codes encompass a wider range of content and services. This includes dating apps, AI chatbots, and random video chat applications, which can potentially expose minors to sexually explicit or harmful material. An eSafety spokesperson noted that the regulator plans to adopt a “graduated approach to enforcement” for companies found to be in systemic non-compliance.

The eSafety Commissioner, Julie Inman Grant, elaborated on the unique nature of Australia’s regulatory framework compared to those in other nations. “Where Australia’s codes are unique compared to those that exist in other countries and jurisdictions like the UK, are that here there are multiple codes covering different sections of the online ecosystem,” she remarked. This framework allows the eSafety regulator to take action if an app continues to expose children to harmful content and fails to comply with regulations.

In a recent enforcement action, the eSafety office issued a compliance notice to the operators of OmeTV, a chat roulette app that had been pairing children with adults for random video chats, creating opportunities for grooming. When the notice was largely ignored, the regulator contacted Apple and Google, reminding them of their own app store guidelines. Both companies subsequently removed the app from their Australian stores.

Inman Grant emphasized the intent behind the new codes, stating, “The codes are designed to not just be a single code with a single point of failure.” She pointed out that the primary aim is to protect children from potential harm online.

The introduction of these codes marks a significant step in digital safety regulation. Inman Grant revealed that her office had been receiving alarming reports from school nurses since October 2024, indicating that many children in upper primary school were spending up to six hours daily on AI companion applications. These apps reportedly use emotional manipulation techniques to keep young users engaged, with some even encouraging explicit sexual behavior among minors.

Statistics from Inman Grant’s office indicate that 63 percent of Australian teenagers have encountered violent pornography, including content depicting choking and strangulation. She compared the newly established online protections to existing offline regulations, stating, “A kid can’t walk into a bar and order a drink. They can’t stroll into a strip club or sit down at a blackjack table in a casino. This really just brings those protections to the digital realm.”

The new regulations not only target websites hosting pornography but also extend to social media platforms, AI chatbots, app stores, and equipment providers. Companies are now required to implement what eSafety defines as “appropriate age assurance measures.” This means that simply clicking a button to declare one’s age is no longer adequate. Verification methods must include robust options such as photo ID, facial age estimation, credit card checks, or digital identity wallets, all while preserving privacy.

Additional deadlines are set for search engines like Google and Microsoft’s Bing, which must implement age assurance for logged-in Australian users by June 27, 2026. Unfiltered results for pornographic and violent content will be blurred or hidden by default for users who are not signed in. App stores are expected to comply by September 9, 2026, although Apple has already begun conducting checks for its Australian store.

The Australian government is also advancing legislation for a digital duty of care, which Communications Minister Anika Wells describes as a proactive measure aimed at preventing harm rather than merely reacting to incidents. These ongoing efforts reflect a broader commitment to ensuring the safety of minors in the rapidly evolving digital landscape.