When Australia went in and out of lockdown at the start of the COVID-19 crisis, the use of many social networking apps soared.
However, it is important to remember that apps such as Facebook, Instagram, WhatsApp, Snapchat and TikTok all have suggested age limits and aren’t always suitable for younger children. Most require all users to be at least 13 years old. Teens’ and tweens’ brains are still developing rapidly and have a tendency towards risk, meaning parents need to understand the impact that apps may have — good or bad.
In fact, a new survey by one of Australia’s largest telcos has revealed that 45% of parents do not have existing measures in place to protect their children online.
More than 72% of children in Australia have access to two or more devices in a typical day, making that stat even more concerning — especially when we’re using our devices more than ever.
Who to trust?
In 2019, Business Insider Intelligence rated various social media channels on their trustworthiness, using pillars such as security, legitimacy, and community to decide how trustworthy people believed each platform to be.
Facebook performed worst across every pillar. On the security pillar, more than four in five (81%) of respondents who were also Facebook users said they were only slightly or not at all confident in the platform's ability to protect their privacy and data.
Houseparty is another app that’s skyrocketed in popularity during lockdown. One source estimates the app was downloaded 2 million times in the week commencing 16 March 2020, compared to just 130,000 weekly downloads one month prior.
So how safe is Houseparty? You may have heard rumours circulating about the app's digital security, claims the developer has strenuously denied. Even if the app is as secure as other apps on the market, there are still some things parents should be aware of.
Firstly, as far as video chatting apps go, Houseparty is pretty open. Friends can communicate with each other through live video and texts in chat groups. There’s no screening and the video is live, so there’s nothing to keep kids from seeing inappropriate content.
Users can send links via chat and take screenshots, and there's also nothing keeping friends of friends from joining groups where they may only know one person — just like a real house party.
TikTok is another app that's seen a serious boost in popularity during lockdown. TikTok is an app for creating and sharing short videos between three and 60 seconds long. It encourages users to express themselves creatively through music, special effects, and voiceovers.
The minimum age to use the app is 13, but there isn't a reliable way to validate age, meaning that anyone with an email address can download and use the app. Parents have also expressed concern that there is a lot of inappropriate language in the videos, meaning it's often not suitable for young children.
All TikTok accounts are public by default, meaning anybody can view the videos uploaded by your children and get in touch with them.
Facebook's Messenger Kids now has more than 7 million monthly active accounts as growing peer-group pressure encourages parents to sign their kids up. Many do so without fully understanding what information Facebook is collecting about their children, or what Facebook will ultimately do with that information.
These kinds of apps are not necessarily 'kid safe' when it comes to data protection. Facebook has already exposed tens of millions of users' data through the Cambridge Analytica scandal, and now parents are blindly giving away their kids' personal data, including shared photos.
In the classroom and beyond
When it comes to data privacy and security, it's not simply about what apps you're using as a parent, but what your child's school is using too — considering it's the place where they spend the majority of their time while growing up.
We see schools taking an average of more than 35,000 photos per year without a manageable way to let parents choose how photos of their children are used, risking image-based abuse or emotional trauma.
One of the biggest threats today is the use of a child's image. An image is the one piece of data that most easily reveals a mountain of identifying information, including a person's facial identity, age, interests, religion and location, and it can even tell strangers which school a child attends and when your family is not at home.
Images and identifying details can be used for numerous activities that put your child at risk, from paedophilia and stalking, to cyberbullying, identity theft, self-harm and even digital and physical kidnapping.
Your school should be seeking your consent every year and adhering to any privacy requests you make. Speak up if teachers aren't deleting photos from their personal devices, or are posting them in public places such as social media channels or blogs.
Whether your children are learning from home or remaining in the classroom, it’s important to have full knowledge of the apps and platforms they come into contact with. Parents should feel in control over how their child’s personal information is used and shared, and how much of the global online world of strangers they are happy for them to be exposed to.
Be empowered to reach out to your school and ask about the privacy measures it has in place, and make sure they're up to scratch — especially during a crisis. I started doing this recently with my daughter's school and have found it to be a very collaborative process. If you don't tell your school you have a problem, it won't know the issue needs to be top of mind.
Finally, put together a plan with your child about their boundaries. For younger children, consider making a social media agreement in which they agree not to bully others and to tell you if they're being bullied, not to give out personal information (including photos that easily identify them), and to avoid sharing inappropriate content.
As we all begin to slowly venture back into the outside world, taking precautions around our children's digital security should remain high on the agenda.