Australia’s eSafety Commissioner has signalled it will lean on Apple and Google to remove non-compliant apps from their stores if porn companies refuse to implement age verification — deploying a strategy it has already used to remove an app that facilitated the grooming of children.
The warning comes as new industry codes took effect on Monday requiring websites hosting pornography, violence, self-harm and disordered eating content to verify users are over 18 through methods such as facial age estimation, digital wallets and photo ID. Non-compliance carries penalties of up to $49.5 million per breach.
As this masthead reported on Friday, Pornhub’s Canadian parent company, Aylo, began restricting Australian access to several of its websites last week in protest at the new rules – the same playbook it has deployed in the UK, France and 23 US states. Australian users attempting to access Pornhub are currently met with ‘safe for work’ content including skits and podcasts.
Apple and Google both officially prohibit pornography from their app stores. But the eSafety codes cover a much broader category of content and services than explicit porn. Dating apps, AI chatbots and random video chat apps are all distributed through the app stores and can expose minors to sexually explicit or harmful content.
An eSafety spokesperson said the regulator would take a “graduated approach to enforcement” against companies found to be in systemic non-compliance. They pointed to the interlocking design of Australia’s regulatory framework as a key advantage over comparable regimes overseas.
“Where Australia’s codes are unique compared to those that exist in other countries and jurisdictions like the UK, are that here there are multiple codes covering different sections of the online ecosystem,” the spokesperson told this masthead.
Under the framework, if an app providing access to sexually explicit material continued to expose children to that content and refused to comply, the regulator could contact tech giants Apple and Google to assess whether they were meeting their own obligations, under a separate App Store code, by continuing to distribute it.
How eSafety removed a grooming app
eSafety said it had already tested this approach. The regulator sent a compliance notice to the overseas-based operators of OmeTV, a chat roulette app that was pairing children and adults together for random video chats, enabling grooming by paedophiles. The notice was largely ignored.
eSafety then contacted Apple and Google directly, reminding them of their obligations under the App Store code not to distribute services that potentially enabled the grooming of children. Both companies subsequently removed the app from their Australian stores.
“The codes are designed to not just be a single code with a single point of failure,” the spokesperson said.
Apple and Google were contacted for comment.
AI chatbots in the crosshairs
In a series of interviews on Monday morning, eSafety Commissioner Julie Inman Grant said the codes were the first enforceable protections of their kind in the world targeting AI chatbots and companion apps.
Inman Grant said her office began hearing from school nurses around October 2024 that children in upper primary school were spending up to six hours a day on AI companion apps, which use emotional manipulation and what she described as “sycophancy” to keep young users engaged.
“We started hearing they were inciting young girls to engage in explicit sexual acts,” Inman Grant said. She pointed to a dozen lawsuits in the United States against chatbot providers, including a wrongful death case against Character AI after a young man was allegedly incited to take his own life.
“I don’t want that happening to Australian kids here,” she said. “So I went to the industry and said, you need to include this in the codes.”
Inman Grant said 63 per cent of teenage Australians had been exposed to violent pornography, including material depicting choking and strangulation.
She compared the new rules to long-standing real-world protections. “A kid can’t walk into a bar and order a drink. They can’t stroll into a strip club or sit down at a blackjack table in a casino,” she said. “This really just brings those protections to the digital realm.”
What will change for Australian users?
The codes extend to websites hosting pornography, social media platforms, AI chatbots, app stores and equipment providers. Simply clicking a button declaring a user is over 18 will no longer be sufficient.
Services must implement what eSafety deems “appropriate age assurance measures”. Inman Grant said the regulator was allowing companies to choose their verification method — including photo ID, facial age estimation, credit card checks or digital identity wallets — as long as it was “robust, fair and privacy preserving”.
What comes next?
The codes that took effect on Monday are just the start. Search engines including Google and Microsoft’s Bing must implement age assurance for logged-in Australian users by June 27, with unfiltered results for pornographic and violent content blurred or hidden by default for those not signed in. App stores must follow by September 9, though Apple has already begun checks for its Australian store.
The government is separately progressing digital duty of care legislation, which Communications Minister Anika Wells described as the next step beyond the “whack-a-mole” approach, placing a broader obligation on technology companies to proactively prevent harm rather than responding after the fact.
