New age checks for pornography, adult video games and AI chatbots came into force last week, on 6 March 2026, under new online safety rules.
People need to prove their age to access pornography, R-rated video games and explicit artificial intelligence chatbots, under sweeping new online safety rules aimed at protecting children from harmful content.
The changes come under new industry codes overseen by the eSafety Commissioner and will apply to search engines, social media platforms, pornography websites, app stores, gaming providers and generative AI systems, including so-called companion chatbots.
These services must take meaningful steps to prevent children from accessing adult material or face penalties of up to $49.5 million per breach.
Australia's eSafety Commissioner Julie Inman Grant said the changes were designed to bring online spaces in line with long-standing rules that protect children in the physical world.
“We don’t allow children to walk into bars or bottle shops, adult stores or casinos,” she said.
“But when it comes to online spaces where they are spending a lot of their time, there have been no such safeguards. That changes for Australian kids with these codes.”
Early exposure to harmful content
The move follows a parliamentary inquiry into the impacts of pornography on young people, which heard the average age of first exposure to pornography in Australia is about 13 years old.
Evidence presented to the inquiry suggested children as young as six were showing signs of exposure to pornographic material.
Research by the eSafety Commissioner found one in three Australian children aged 10 to 17 has seen sexual images or videos online. More than 70 per cent have encountered harmful content, including extreme violence, pornography, self-harm material, suicide content and material promoting disordered eating.
Concerns about AI companion bots
The rules also address the rapid growth of AI companion chatbots, which allow users to hold conversations and build simulated relationships with artificial intelligence systems.
Authorities say some young Australians are forming relationships with these bots, which can sometimes become explicit or encourage harmful behaviour, including self-harm or suicide.
Under the new codes, any AI chatbot capable of generating sexually explicit, violent or self-harm related content must verify a user’s age before allowing access.
Stronger age verification required
The commissioner said simple age declarations such as clicking a button stating “I am 18 years or older” would no longer be considered adequate.
Platforms will be required to use “accurate, robust, fair and reliable” age-assurance systems.
Possible verification methods could include photo identification, facial age estimation technology, credit card checks, digital identity wallets or artificial intelligence systems that estimate age based on user data. Any method used must comply with Australian privacy laws.
While the rules tighten access for minors, adults will still be able to access legal adult content, provided they verify their age.
Inman Grant said the changes would also help relieve pressure on parents, teachers and carers trying to protect children from harmful material online.
“For decades society has recognised that there are certain things children are simply not physically, developmentally or emotionally equipped to deal with,” she said.
“Make no mistake, where we see failures or foot-dragging, we will hold companies to account.”

How the rules are changing
AI companion chatbots
AI companion chatbots capable of generating sexually explicit, high-impact violence or self-harm material must confirm a user is 18 or older before allowing access to that material.
App stores
App stores must take steps to prevent users under 18 from purchasing or downloading apps rated R18+ and ensure apps are appropriately classified. If the store does not already know a user’s age, the user may be asked to confirm it through age-assurance measures.
Messaging services
There are no age checks required for widely used general messaging services attached to social media platforms, such as Facebook Messenger. However, users may be asked to verify their age on adult messaging services that specialise in distributing pornography, sexually explicit content or self-harm material.
Online gaming
Players must complete age assurance checks to access online games classified R18+ by the Australian Classification Board. For all other game classifications, no age checks are required.
Pornography websites
Users will need to confirm their age before accessing age-restricted material on pornography websites and services. Simply clicking a button that says “I am 18 years or older” will no longer be enough.
Search engines
For users not logged into an account, search engines will blur results containing pornography or high-impact violent content by default.
Social media platforms
Social media platforms that allow pornography or self-harm material must ensure users are 18 or older before they can access that content. This may involve age verification when logging in or when attempting to view restricted material. Platforms whose terms already prohibit pornography, such as Facebook, will see no change to current rules.