
Social media age restrictions: eSafety Commissioner provides regulatory guidance

Australia's social media age restrictions will soon be in effect. From 10 December 2025, providers of social media platforms must take reasonable steps to prevent users under 16 years of age from having accounts – the "social media minimum age obligation". Earlier this year we looked at age assurance developments both in Australia and abroad. At that time, questions remained as to which social media platforms would be excluded, and what steps providers would need to take to comply. Recent developments provide guidance on both fronts.
Certain services are excluded
The social media minimum age obligation under the Online Safety Act 2021 (Cth) will apply to "age-restricted social media platforms". On 31 July 2025, the Minister for Communications made the Online Safety (Age-Restricted Social Media Platforms) Rules 2025 (Cth), which specify categories of electronic services that are not age-restricted social media platforms.
The exclusions are largely as expected, such as "services that have the sole or primary purpose of enabling end‑users to communicate by means of messaging, email, voice calling or video calling; and services that have the sole or primary purpose of enabling end‑users to play online games with other end‑users" (we've already examined the application of the social media minimum age obligation to online gaming). The Explanatory Statement notes that "the classes of services that have been specified in these Rules have been excluded from the social media minimum age obligation on the basis that they pose fewer harms to children and young people". However, the Explanatory Statement further outlines that the Rules are "not set and forget" and "allow the Minister to be responsive to technological evolutions and changes in the digital ecosystem".
As with most new regulatory requirements, we expect some complexity in the (at least initial) application of the Rules, for example where a service does not fall neatly within one of the specified categories, or where it has multiple purposes.
Service providers can use the eSafety Commissioner's online self-assessment tool to support their assessment of whether their services are age-restricted social media platforms. The tool steps through the relevant elements and provides commentary on key concepts, including what amounts to a sole, primary or significant purpose.
Guidance on taking "reasonable steps"
On 16 September 2025, the eSafety Commissioner released its Social Media Minimum Age Regulatory Guidance (the Guidance) on how providers of age-restricted social media platforms can satisfy the social media minimum age obligation and associated requirements. On the question of what amounts to "reasonable steps", the Guidance takes a principles-based approach and does not prescribe specific technical or other measures.
The Guidance emphasises that there is no one-size-fits-all approach to taking reasonable steps. Rather, compliance will involve a combination of systems, technologies, people, processes, policies and communications. The Guidance is also clear that a layered approach is needed, including taking reasonable steps to: detect which accounts are held by age-restricted users and deactivate or remove those accounts; prevent age-restricted users from creating new accounts; and mitigate circumvention of these measures. Providers should also offer a review mechanism so that users aged 16 or over who are wrongly restricted can have their accounts reinstated.
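To make those layers concrete, the sketch below shows one hypothetical way the account lifecycle might be structured. It is an illustration only: the Guidance is deliberately principles-based and does not prescribe any technical design, and every name, type and threshold below is an assumption invented for the example.

```python
# Purely illustrative sketch of a layered approach; nothing here is drawn
# from or prescribed by the Guidance. All names and thresholds are assumptions.

from dataclasses import dataclass
from typing import Optional

MINIMUM_AGE = 16  # the social media minimum age under the Online Safety Act


@dataclass
class Account:
    user_id: str
    estimated_age: Optional[int]  # output of whichever age assurance method is used
    confidence: float             # that method's confidence in the estimate (0.0 to 1.0)


def review_existing_account(account: Account) -> str:
    """Layer 1: detect accounts likely held by age-restricted users and act on them."""
    if account.estimated_age is not None and account.estimated_age < MINIMUM_AGE:
        return "deactivate_with_notice"  # with a path to the review mechanism below
    if account.estimated_age is None or account.confidence < 0.8:  # illustrative threshold
        return "request_further_age_assurance"
    return "retain"


def allow_signup(estimated_age: Optional[int]) -> bool:
    """Layer 2: prevent age-restricted users from creating new accounts."""
    return estimated_age is not None and estimated_age >= MINIMUM_AGE


def handle_review_request(account: Account, verified_age: int) -> str:
    """Review mechanism: reinstate users aged 16 or over who were wrongly restricted."""
    return "reinstate" if verified_age >= MINIMUM_AGE else "uphold_restriction"
```

In practice, the "further age assurance" step would route users to whichever assurance methods the provider offers, in a manner consistent with the guiding principles discussed next.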
A central component of the Guidance is its "guiding principles". The reasonable steps taken by providers should be:
reliable, accurate, robust and effective (for example, determining acceptable error thresholds, mitigating known and reasonably foreseeable circumvention risks, and being able to demonstrate effectiveness);
privacy-preserving and data minimising (for example, compliance with applicable privacy obligations, and the expectation that providers will not retain personal information as a record of individual age checks);
accessible, inclusive and fair (for example, testing of age assurance methods in the Australian context including considering different demographics, mitigating the impact of accessibility and bias issues, and producing clear and readily understandable information about their age assurance methods);
transparent (for example, clearly outlining what age assurance options are being used or are available, and describing what personal information will be collected, used and stored, as well as the possible outcomes);
proportionate (for example, services with a higher risk profile are expected to employ more robust measures, but providers should avoid unreasonable practices that risk over-blocking access or infringing on an individual's rights); and
evidence-based and responsive to emerging technology and risk (for example, maintaining awareness of changes in circumvention methods and in end-user behaviour and demographics, being cognisant of privacy complaints and data breaches, and keeping across new developments in age assurance).
The Guidance also provides some insight into the eSafety Commissioner's intended approach to compliance monitoring and enforcement. Notably, the eSafety Commissioner has indicated that it will take a strategic and, where appropriate, graduated approach to compliance and enforcement, consistent with its Compliance and Enforcement Policy. The Guidance also acknowledges that providers vary, and that consideration will be given to the technical and commercial feasibility of measures adopted by providers.
Nonetheless, it is important to remember that, for bodies corporate, non-compliance with the social media minimum age obligation can attract civil penalties of up to 150,000 penalty units, currently equivalent to $49.5 million. This is on par with the civil penalties for serious interferences with privacy under the Privacy Act 1988 (Cth) and for contravening the unfair contract terms provisions under the Australian Consumer Law.
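For those checking the arithmetic, the dollar figure follows from the current value of a Commonwealth penalty unit, being $330 (as indexed from 7 November 2024):

$$150{,}000 \text{ penalty units} \times \$330 = \$49{,}500{,}000$$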
Final phase 2 industry codes registered
In other online safety developments, there are now nine phase 2 industry codes, which focus on preventing children from accessing or being exposed to age-inappropriate material online and on providing end-users with information, tools and options to limit access and exposure to this material. On 9 September 2025, the eSafety Commissioner registered the remaining six phase 2 industry codes, which will come into effect on 9 March 2026. These six codes apply to app distribution platforms, equipment providers, social media services (core features), social media services (messaging features), relevant electronic services and designated internet services.
Now is the time to get ready
With deadlines fast approaching, service providers should move quickly to ensure they can comply. First, service providers need to evaluate whether their services fall within the scope of the social media minimum age obligation. If a service is an age-restricted social media platform, the provider will need to implement layered age assurance systems that balance effectiveness, user privacy and proportionality, having regard to the Guidance. The provider will also need processes for effective account management, including deactivating accounts held by underage users and mechanisms to prevent re-registration. Service providers should also remain alert to other online safety obligations, including compliance with the upcoming phase 2 industry codes.