Governments worldwide are moving from voluntary guidelines to enforceable social media age limits, with over 40 countries introducing, proposing, or reviewing legislation, according to safety tech firm Privately SA. The momentum follows Australia’s landmark ban, which has now completed its first full month of enforcement.
Australia’s Online Safety Amendment Act 2024, effective December 10, 2025, prohibits children under 16 from holding social media accounts. Within weeks, platforms including TikTok, Instagram, YouTube, and Snapchat removed approximately 4.7 million accounts nationwide. Meta alone deactivated nearly 550,000 accounts across Instagram, Facebook, and Threads.
Platforms face fines of up to A$49.5 million (about US$33 million) if they fail to take “reasonable steps” to block underage users. Prime Minister Anthony Albanese said on January 16, 2026, “It is something that is a source of Australian pride… [the ban] is working and being replicated now around the world.”
Other countries are following suit. Malaysia’s Online Safety Act, targeting children under 16, took effect on January 1, 2026. Denmark, France, Turkey, and the European Union are also drafting or proposing age-restriction laws, with minimum ages ranging from 13 to 16.
The technical debate is shifting from whether to verify age to how to do it safely. Many platforms are adopting on-device Facial Age Estimation (FAE), which estimates a user’s age from facial features directly on the device; because images never leave the phone or are sent to the cloud, user privacy is better protected.
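To illustrate the privacy pattern behind on-device age estimation, here is a minimal Python sketch, assuming a hypothetical local model stand-in (`estimate_age`): the image is analyzed on the device and only a pass/fail signal is ever shared, never the photo or the raw age estimate. This is not any vendor’s actual API, just an outline of the flow described above.

```python
from dataclasses import dataclass


def estimate_age(image_bytes: bytes) -> float:
    """Hypothetical stand-in for an on-device facial age estimation model.

    A real deployment would run a local neural network over the camera
    frame; this placeholder simply returns a fixed value for illustration.
    """
    return 17.0


@dataclass
class AgeGateResult:
    # The only value that ever leaves the device: a boolean decision.
    over_threshold: bool


def check_age_on_device(image_bytes: bytes, threshold: int = 16) -> AgeGateResult:
    """Run age estimation locally and return only a pass/fail signal.

    The image and the raw age estimate stay inside this function; the
    platform's server sees nothing but the boolean comparison result.
    """
    estimated_age = estimate_age(image_bytes)
    return AgeGateResult(over_threshold=estimated_age >= threshold)


if __name__ == "__main__":
    result = check_age_on_device(b"<camera frame bytes>")
    print(f"Over threshold: {result.over_threshold}")
```

The key design choice in this sketch is that the server receives a yes/no answer rather than biometric data, which is the privacy argument platforms cite for on-device FAE.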
However, challenges remain. Reports suggest many Australian teens have bypassed the system using VPNs, fake birthdates, or older siblings’ accounts, raising questions about enforcement effectiveness.
Australia’s eSafety Commissioner is expected to release an independent evaluation in February 2026, analyzing the law’s impact on mental health, compliance, and circumvention. Observers say the findings could shape social media regulation globally.