Two months after the introduction of Australia’s Social Media Minimum Age (SMMA) law, Snapchat has outlined its ongoing efforts to comply with the regulation and highlighted challenges in its current implementation. The law aims to enhance online safety for young Australians by restricting access to social media platforms for users under 16.
Snapchat reported that by the end of January 2026 it had locked or disabled more than 415,000 Australian accounts belonging to users who either declared themselves under 16 or were flagged as likely underage by age-detection technology. The company said it continues to remove additional accounts daily.
However, Snapchat raised concerns about technical limitations in accurately verifying users' ages. The company cited a 2025 government trial which found that existing age-estimation technologies are accurate only to within two to three years on average. That margin of error could let some minors bypass the restrictions while eligible users lose access inadvertently.
Snapchat also noted that the SMMA does not cover all digital platforms equally. According to the company, “the current approach lacks industry wide protections, leaving vulnerabilities with hundreds of other apps that are either not in-scope of this law or where it is unclear.” The company warned that young people blocked from mainstream apps like Snapchat may migrate to less regulated messaging services, potentially exposing them to greater risks.
To address these gaps, Snapchat advocated for app store-level age verification systems. The company stated: “App store-level age verification would help address multiple risks and gaps. First, it would give in-scope apps more consistent age signals for each device… Second, it would strengthen safety across the entire digital ecosystem — not just for select regulated apps, but for all services.”
Snapchat maintained its opposition to an outright ban on under-16s using its service: “We want to be clear: we still don’t believe an outright ban for those under 16 is the right approach… we do not believe that cutting teens off from these relationships makes them safer, happier, or otherwise better off.”
Despite its disagreements with aspects of the policy, Snapchat emphasized its commitment to engaging constructively with policymakers, pointing to centralized app store-level verification as its preferred improvement.
The company also detailed ongoing safety initiatives including requiring mutual connections for one-to-one communication and maintaining round-the-clock Trust & Safety teams—including staff based in Sydney. Additionally, Snapchat has expanded parental controls via its Family Center feature, allowing parents increased visibility into their teens’ usage patterns and contacts.
“We continue building safety protections that will keep young Snapchatters safe in Australia and around the world,” a spokesperson said.