From December 10, anyone under the age of 16 will be knocked off the biggest social media services, but what if parents want their kids to have access?
The Australian government’s holiday gift for some parents is practically here. On December 10, the population of switched-on teens, pre-teens, and kids on their way to both of those demographics will be faced with an internet that isn’t quite ready for them, as the Social Media Minimum Age bill goes into effect.
Also known as the “SMMA”, the government’s social media ban for anyone under the age of 16 may well be something some parents love, but it isn’t something all parents support.
The government says not to call it a ban, but the Online Safety Amendment is clearly just that, forcing some of the biggest social networks to take steps to prevent people under a certain age from accessing their platforms, thereby banning them.
Semantics only get you so far, and refusing a group of people access to something would be clearly defined as a ban.
When users turn 16, access will be returned, but until then, accounts under their names and ages will be cut off, under regulation intended to improve mental health that could also end up doing more harm than good.
There’s also the issue of parental rights, especially when it comes to raising kids on the internet.
Some parents may want the government to step in and help police things, while others might be fine with their kids embracing the online world, and talking them through it.
That’s a choice for parents to make, not the government.
Parents might understand that their kids use social media and YouTube for communication, education, research, and connection, and that for some kids, that’s how they connect with the real world.
Since the bill was first floated as an idea in one state, with the federal government then picking it up, the powers that be have also said that parents won’t be penalised if they choose to ignore it and parent their own way. In other words, while the amendment is very much law, parents and kids won’t be punished for disregarding it altogether.
So what can parents do if they want to teach their kids how to use social, and allow them to keep using social media when the government says no and services follow?

It’s not a ban, but it’s still a ban
It’s been an awkward year for parents and kids since the government declared war on social media.
Aspects aren’t without merit, but the Australian government’s approach to a social media ban on kids under the age of 16 isn’t accepted or agreed upon by every parent or child, and that makes for an interesting situation.
Not helping it is the feeling that the regulation has been rushed. While the technology isn’t quite where it needs to be for proper age verification and classification, the government is pushing on and forcing social media organisations to adapt.
From December 10, all major social media services will prevent access to accounts they believe belong to under-16s, using a combination of factors, such as declared age and other signals. Any accounts started from that date will also be policed to some degree, but only on the affected services, a list that seemingly grows almost at the government’s whim.
As of the date this article was published, that includes:
- Kick
- Snapchat
- Threads
- TikTok
- Twitch
- X, and
- YouTube
Basically, from December 10, if you’re under the age of 16 and a member of these services, you won’t be anymore. Access will be revoked, though in some instances, it may take more time than simply one day.
Or to put it more succinctly, if your account says you’re under a certain age, you will lose access.

How long will it take to lose access?
Some services started removing under-16 accounts before the December 10 date, while others will do it from that date.
Most services Pickr has spoken to have said that it could take days to weeks to sort through the sheer number of accounts.
We’re told it will happen, and social media services have a pretty good incentive as to why: if they don’t take “reasonable steps” to cut off accounts belonging to anyone under the age of 16, they risk staggering fines of up to $49.5 million per offence. The incentive is clearly there for them to work within the government’s rules.
How does age verification work?
To deal with this, most will turn to age verification, using a combination of AI-based image analysis, government-issued IDs, or access to bank details to determine whether an account holder is over a given age.
Some of these approaches will work well, and others markedly less so.
Initial tests from a government report suggest none of the services delivered a 100 percent guarantee, and some flat out had problems working out whether those near the 16 cut-off were actually under it or over it.
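To give a sense of why accounts near the cut-off are so messy, here’s a rough, entirely hypothetical sketch of how a platform might weigh a declared age against an AI-based estimate, falling back to a stronger check when the two don’t clearly agree. None of the names, thresholds, or logic come from any real service; they’re assumptions purely for illustration.

```python
# Hypothetical sketch only: no real platform publishes its logic, and the
# thresholds below are invented to illustrate the "uncertain band" near 16.

from dataclasses import dataclass

CUTOFF = 16          # SMMA minimum age
UNCERTAIN_BAND = 2   # assumed +/- years where AI estimates can't be trusted

@dataclass
class AgeSignals:
    declared_age: int           # the age the user entered at sign-up
    estimated_age: float        # e.g. from facial-image age estimation
    estimate_confidence: float  # 0.0 to 1.0, the model's own confidence

def check_account(signals: AgeSignals) -> str:
    """Return a made-up decision: 'allow', 'restrict', or 'verify'."""
    # Both signals agree the user is comfortably over the cut-off: leave them be.
    if signals.declared_age >= CUTOFF and signals.estimated_age >= CUTOFF + UNCERTAIN_BAND:
        return "allow"
    # The estimate is confidently under the cut-off: restrict the account.
    if signals.estimated_age < CUTOFF - UNCERTAIN_BAND and signals.estimate_confidence > 0.8:
        return "restrict"
    # Anything near the cut-off needs a stronger check (ID, bank details, etc.).
    return "verify"

# A 17-year-old whose selfie reads as 15.5 lands in the uncertain band...
print(check_account(AgeSignals(declared_age=17, estimated_age=15.5, estimate_confidence=0.6)))
# -> "verify", the kind of extra friction accounts near 16 can expect
```

That middle band is where both the false negatives and the false positives live, which is exactly what the early testing has flagged.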

However, every social media organisation Pickr spoke to said the same thing: the technology wasn’t there yet to guarantee an age determination.
While those under a certain age are the main target, there will likely be false positives from the age verification process, as well.
Already, in the first few days of Meta running checks, several adult accounts have been disabled, affecting businesses and livelihoods. The technology will surely get better (it would have to), but we suspect these teething issues will keep occurring over the coming months.
At least one service will skip age verification entirely and just cut under-16s off.
YouTube access will largely disappear overnight for anyone under the age of 16, who will lose the ability to log in and watch videos, as well as to upload videos to their accounts. No one will see their videos, and they won’t be able to add new ones.
YouTube Kids will still be accessible because it’s an entirely different platform, but under-16s intending to use YouTube will either need to do so logged out or with mum and dad’s account. And that is clearly one of the ways kids will still be able to access YouTube. But what about the other social media platforms?

What if parents want their kids to have access?
One of the frustrating parts about the social media minimum age regulations is that while the burden of policing underage activity falls to the social media services, the rules also prevent parents who don’t agree with them from allowing their kids to have accounts of their own. It treads on parental rights.
Since the idea was first rolled out and subsequently rushed through with not even three days of public consultation, compared to the weeks afforded to other regulatory measures, the government has remained clear: neither parents nor kids will be penalised for ignoring the rules and not abiding by them.
If you read that as “parents could allow their kids to have social media accounts”, you wouldn’t be alone. The problem is that from December 10, the major social media services won’t be allowed to let them.
For some parents, this could be a problem, particularly if they want to teach their kids how to use social media before they turn 16, as opposed to simply throwing them out to the world, and the wolves, when they hit that age. It’s also a problem if parents want their accounts linked to their kids’, because adult accounts don’t offer that.
And the government’s suggestion that parents and kids won’t be penalised, even though social media organisations aren’t allowed to let them keep their accounts, is disingenuous.
It suggests that you can have social media, even if you can’t have the account, which is kind of like saying you can have something to eat, as long as you don’t actually swallow.
In a more geeky tone, you can languish in the fiery pits of Mordor, even if the fiery pits of Mordor are hazardous to your health. That sort of thing.
And that is roughly what the government said when Pickr asked how parents could help their kids use social if they wanted to.
“Age-restricted platforms won’t be allowed to let under-16s create or keep an account. These platforms may face penalties if they don’t take reasonable steps to prevent under-16s from having accounts on their platforms,” said an eSafety Commissioner spokesperson.
“However, there are no penalties for under-16s who manage to create or keep an account on an age-restricted social media platform, or for their parents or carers,” they said.
“These delays give parents, carers and educators extra time to teach under-16s about online risks and the impacts of harms, as well as how to stay safer online and seek help when they need it. This will give young people a better chance to deal with issues once they turn 16 and can have social media accounts.
“The delay gives young people space to develop the digital literacy, resilience and emotional maturity they need before stepping into complex online environments.”

Digital literacy and emotional maturity
The online world can be hugely complex, and it’s one that can take advantage of naivety, something that goes with youth. But the problem with emotional immaturity and gullibility is that neither is limited to young people.
Anyone can fall for some of the pitfalls, pains, and problems associated with social media, and many adults do all the time.
The fact that so much money is lost to scammers via social engineering shows you don’t necessarily have to be young to fall victim on social media; social engineering isn’t just for young people, and anyone can be fooled.
However, the term “digital literacy” is pushed as though it will be taught over the years through the government’s resources, rather than experienced and learned through a parent’s eyes, alongside their kids.
“Digital literacy isn’t just about knowing how to use technology, it’s about understanding how to think critically, spot risks, and make good choices online,” said an eSafety Commissioner spokesperson to Pickr.
“That is why it is important to acknowledge that digital literacy and learning to safely navigate the internet goes well beyond just social media. eSafety’s resources assist parents, carers and educators to teach young people about how to stay safe online starts with our Early Years work for 0-5 year olds and then moves through age-appropriate resources so that young people can develop the skills they need before they turn 16.”
While it is entirely possible that kids and parents, freed from the temptation of social media, will pore over the government’s eSafety resources, it’s more likely that few will really embrace them, and they will go on being wasted.
What could have been the government working with social media organisations for better regulation and developing programs to help parents and kids alike has instead turned into a prohibition of sorts, and a somewhat half-hearted one, at that.
Social media services will instead just cut off the offending accounts, and prevent any of the existing parental controls from doing their job.

Some parents may say the Social Media Minimum Age rules are a win for society, and members of the government certainly will, but without proper development and consultation with the very people they are meant to protect, the rules could make a lot of things worse, quickly.
“The Government’s plan to ban social media use for under 16s may be well-intentioned but in practice risks unintended consequences,” said Rachel Lord, Public Policy Senior Manager for Google and YouTube Australia in a statement in October.
“The legislation will not only be extremely difficult to enforce, it also does not fulfill its promise of making kids safer online,” she said.
“Well crafted legislation can be an effective tool to build on industry efforts to keep children and teens safer online. But the solution to keeping kids safer online is not stopping them from being online, it’s about making sure platforms have relevant guardrails in place and empowering parents with the tools and confidence they need to guide their children’s online experiences.”
Parental options and the actual regulations
We wouldn’t be shocked if parents were looking for options for their kids ASAP, some of which are incredibly direct.
Frustratingly, no social media company would talk on record about what parents could do, and what their options ultimately were.
So we jumped into pages and pages of regulatory guidance provided by the government, much of which says what will be happening and what’s required by social media services, but not specifically what parents can do.
We perused the resources and scanned what we could find, looking for a sign that politicians had clearly talked about it and thought this thing through.
After all, the government was pretty clear that parents wouldn’t be penalised. Did the amendment say anything about that?
It turns out the answer is no.
Pages and pages of documentation suggest the onus of policing social media ages lies entirely with the services, and that from the date the rules begin, it is the services that will have to enforce them.
But it doesn’t say anything about what parents can do, and so the assumption is pretty clear: start a new account if you need to, or keep an account whatever way you can.
The government even said it clearly in an aforementioned statement to Pickr:
…there are no penalties for under-16s who manage to create or keep an account on an age-restricted social media platform, or for their parents or carers.
If you can manage to keep an account despite a social media organisation’s best efforts, Australian parents and kids will not be penalised.
And if the social network has done its due diligence to check or verify (something it isn’t required to do on accounts that appear clearly over the age of 16), the account should be fine, as well.
Those checks will likely catch accounts sending signals that they’re young before December 10, but they might not afterwards. And keep in mind that not every service relies on full names at registration; some might simply rely on usernames, handles which ultimately mean very little.

What can parents of kids under the age of 16 do?
If you’re a parent of a child who has a way to go before they hit 16, and you support your kids being on social, on YouTube, or maybe even like the idea of teaching them how to use social media and other digital services before they hit the cut-off age of 16, you are clearly in for a rough time.
But there are some solutions that could be staring you in the face. Almost literally.
Work with your kids
One of the rather obvious solutions could see parents work with their kids to make sure an account stays active, or that a new account exists in its place without the same obvious age signal.
Previously, kids were supposed to be a minimum of 13 years old to start most social media accounts, but you can guess how anyone under that age managed to get around the requirement.
You don’t need a technology journalist to tell you what that means — your kids will do that for you when they ask you to help them verify the account — but you can also use this as the ideal time to talk to your kids about social, and to have them come to you when anything sounds wrong or concerning.
Parents can help fill in the space of digital naivety, and may even find it’s helpful to be vulnerable and say that even they can fall victim to it. That’s why it’s so important to be honest and truthful, and work together.
Now that social media won’t allow accounts for under-16s, digital parental connections on the major services are no longer allowed, so if you did have an account set up to monitor and talk to your kids, you will need to turn to the old-fashioned way, and they with you.
Consider an alternate social media service
A slightly less obvious approach, however, could be to turn to one of the many smaller social media services not being targeted by the Australian government. They are just that — smaller — and so your friends, celebrities, and other people might not be on them.
At the moment, that could include the likes of Bluesky, Mastodon, Discord, Messenger, Steam Chat, WhatsApp, YouTube Kids, Roblox, and others. Kids and teens might even find smaller services that are less known than these.
If you’re a parent and you want to work with them on social media, do some research, start an account, and talk about their options with them. It might just help them to trust you on a topic they find important.
It’s worth keeping in mind that the rules may change quickly, leading some services to be proactive about age verification, something Roblox did before the act went into effect.
Create a new social network
Finally, there’s probably never been a better form of encouragement than having someone say you can’t do something, so one final option might be to embrace the “learn to code” movement and make your own social media service.
Like all apps, this one won’t be easy to build, but between the razor-sharp minds of young people, possibly some push from parents, and maybe a bit of AI assistance, some crafty pre-teens and teens could come up with a service for their demographic: kids and teens in Australia affected by an age ban they might heavily disagree with.
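For the genuinely curious, the very first sketch of something like that can be surprisingly small. What follows is a toy, in-memory feed written purely for illustration; every class and method name is made up, and a real service would need accounts, moderation, storage, and safety features this deliberately leaves out.

```python
# A deliberately tiny, hypothetical feed: posts and follows in memory, nothing more.

from collections import defaultdict
from datetime import datetime, timezone

class TinyFeed:
    def __init__(self):
        self.posts = []                  # list of (timestamp, user, text)
        self.follows = defaultdict(set)  # user -> the set of users they follow

    def post(self, user: str, text: str) -> None:
        self.posts.append((datetime.now(timezone.utc), user, text))

    def follow(self, user: str, other: str) -> None:
        self.follows[user].add(other)

    def timeline(self, user: str) -> list[str]:
        # Newest posts first, from the people the user follows (plus their own).
        visible = self.follows[user] | {user}
        return [f"{u}: {t}" for ts, u, t in sorted(self.posts, reverse=True) if u in visible]

feed = TinyFeed()
feed.post("alex", "first post!")
feed.follow("sam", "alex")
print(feed.timeline("sam"))  # ['alex: first post!']
```

It won’t dethrone anything, but it’s the kind of weekend project that turns a ban into a learning exercise.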
If that happens, it might just change the world, starting with the very people it was intended to affect in the first place.