
Seven things to know about the upcoming social media ban.

  • Gov+AI
  • Dec 2, 2025
  • 4 min read

From 10 December 2025, under‑16s will be blocked from holding accounts on platforms including TikTok, Instagram, Snapchat, Facebook, YouTube, X, Reddit, Threads, Kick and Twitch. The new rules should significantly change how young people in Australia use the big platforms. Here are the seven things you need to know...


1.     It isn’t a “ban” from social media, but rather a delay in being able to open a personal account.

Importantly, the ban applies to personal social media accounts. The change is mainly about stopping personalised, algorithmic feeds, posting and messaging – the parts that drive doom‑scrolling and social pressure. Under‑16s can still access most public content without logging in, such as searching for specific videos. So while kids can watch educational content or clips like Mr Beast videos on YouTube without an account, they won't have personalised, recommendation‑driven accounts that encourage continuous use.


2.     How accounts will be turned off

Platforms (not parents or teens) will be required to “find and switch off” under‑16s’ accounts. The actual shut‑down will happen through a mix of technical steps and notices. Think of it as platforms gradually closing the gate, using their data to identify likely under‑16s, turning those accounts off, and then stopping new ones from being created. For existing under‑16 accounts, platforms are expected to notify affected users that their account is being deactivated because they appear to be under 16, and give a way to challenge the decision if they are actually old enough.

Once an account is deactivated, the young person will not be able to log in, post, or receive algorithmic recommendations, although some platforms may keep the data in a “frozen” state so it can potentially be reactivated when the user turns 16. Guidance to teenagers recommends downloading any photos, chats or content they want to keep before the cut‑off date, because there is no guarantee the account will be restorable later.


3.      How age will be checked

Platforms will be required to introduce “age assurance” systems, but they are not allowed to rely solely on government ID and cannot be forced to use a particular technology. Instead, they are expected to use a layered mix of methods such as AI‑based age estimation, analysis of behaviour patterns, signals from connected accounts, and optional document or parental checks, with strict rules that any data collected for age checks must be ring‑fenced and destroyed once it is no longer needed.


4.     It won’t be perfect, but it will be something.

Kids will do their best to get around the new restrictions, and regulators have openly acknowledged that not all under‑16 accounts will be identified and deactivated. However, they have said they will expect regular evidence from companies, such as reports on how many under‑age accounts have been deactivated and what systems are in place. Equally, from the start date, platforms must stop new under‑16 accounts from being created by putting age‑assurance checks at sign‑up and when certain risk signals are triggered.


5.     Penalties for platforms, not kids

The legal obligation sits on the social media companies, not on children or parents, with the federal law allowing for very large civil penalties (in the tens of millions of dollars) where a platform systemically fails to take reasonable steps to keep under‑16s off. Enforcement will be led by the eSafety Commissioner, who can investigate systemic non‑compliance and seek fines or other sanctions, rather than chasing individual families.


6.      There will be an easy option to “dob in” under‑age accounts

A practical feature of the framework is the expectation that platforms will provide clearer ways for users, parents, schools or others to report suspected under‑16 accounts so they can be reviewed and, if necessary, removed. This sits alongside internal detection tools – such as age‑estimation technology and behaviour monitoring – and is intended as another avenue for communities to flag accounts that slip through the automated systems. So if you are a parent and want your child’s accounts deactivated, you should be able to do this directly with the platform.


7.      You can’t give permission for them to keep their accounts.

You can tell your kids to stop lobbying you – your child cannot legally keep their accounts, even with your permission! The law sets a blanket minimum age of 16 for accounts on the major social platforms; parents cannot “opt out” by giving consent. Young people and parents will not be fined if an under‑16 still manages to use social media, but any such account is not allowed and can be removed.


*This blog was produced with assistance from AI. All sources have been verified.

Sources

Australia's social media ban for kids under 16 - how will it work? https://www.bbc.com/news/articles/cwyp9d3ddqyo

Final rules for social media ban revealed, with no ... - ABC News https://www.abc.net.au/news/2025-09-15/social-media-ban-final-rules-announced/105776730

Social Media Safety - YourSAy https://yoursay.sa.gov.au/social-media-safety
