SPEND any time around children and the first thing you will notice is that they spend a huge amount of time on social media – no matter how young.
And make no mistake, despite supposed age restrictions, these apps are marketed at kids.
The viral challenges, selfie filters and addictive games are the internet equivalent of alcopops – designed to pull in young people without explicitly saying so.
However, as organisations like Barnardo’s face a record number of concerned calls about the online safety of young people, it’s time to demand more.
For too long, social media sites have been happy to lure kids in without protecting them from serious abuse, grooming and pornography.
Kids will be frightened
What we know about abuse and exploitation is that abusers’ coercive behaviour and the huge imbalance of power mean that most children will not understand they’ve been abused until long after the abuse has ended.
Abusers may tell them that no-one will believe them or that people will think it’s their own fault because they lied about their age on a platform that isn’t for them.
They may be frightened about their parents’ reactions because they broke the rules. But breaking the rules and testing boundaries is often part of being a child growing up.
Children will often believe what abusers say and start to feel like it’s their own fault. They feel that they won’t be believed or that people will blame them instead of blaming the adults that have abused and exploited them.
That’s how we end up in a situation where a platform isn’t safe, so we put age restrictions on it, yet it is still marketed to children – who then get hurt but can’t speak up.
Children’s videos liked by adult men
Working with providers, I’ve seen first-hand how much more needs to be done.
The Government’s online harms white paper is a positive first step – but the failure to implement one of its first solutions, age verification on porn sites, shows the challenge we face to balance privacy and safety.
Some social media providers are trying to do things differently. For example, one company contacted us about an issue they were regularly facing: what to do about videos posted by children that had concerning numbers of likes.
When working with this company we came across a video of a young teenage girl, around 13 years old. She didn’t do anything in the video except look directly at the camera, and yet it had over one million likes.
That particular provider proactively looked at the profile of users who were watching and liking that video, and it was mainly adult men.
They wanted to make sure that this child was safe and give her information about what she could do if she needed help.
Global industry creates problems
That’s the kind of proactive activity that some platforms are keen to undertake and there is technology that enables them to do that.
The online harms paper should help provide the legal structure to support this type of safeguarding activity.
The problem is that the internet is a global industry, so although Britain has taken a really strong step, a lot of online platforms are operating outside of UK regulation.
So how do we make them comply with UK law when they operate across a global setting?
The internet has also changed how abusers operate, making it far easier to gain access to a much wider group of victims without the same level of risk.
We see adult offenders online casting their nets far and wide. They often don’t even employ sophisticated grooming tactics – some will immediately ask children for illicit pictures or talk about having sex.
Teens have poor self-identity and self-esteem
Many children have the internal resources – the self-esteem and confidence – and a protective wider network to say ‘That’s not okay, I’m going to tell my mum’. But others, often those who do not have that same protection, will comply.
It can be hard for adults to understand why children can be coerced in this way, but adolescence is a difficult time.
Children often have poor self-identity and self-esteem, they may feel that they don’t fit in or have lots of negative things happening in their lives. This makes them more susceptible to being flattered or responding to someone telling them they are special or beautiful.
Children are also being raised in sexualised environments where they are surrounded by sexual imagery, language, fashion and behaviours. If the sexualisation of children is normalised then children are going to be much less resistant or shocked when abusers approach them in sexual ways.
Adults are exploiting children in environments where they should be safe to explore, play and be educated.
Quite simply, we need to lock those doors and gateways to stop giving abusers access to our children.
What we want is for the industry to be held to account and to design its platforms with a safety-first approach – just as cars and other products are expected to be designed with safety at their core.
It is not good enough for the industry to make small changes after children have been exploited.
At Barnardo’s, we know what the potential dangers are. Working together with the Government, the voluntary sector, children and young people and the online industry, we could come up with solid solutions that would create a safe space for children without compromising the marketability or usability of those platforms.
For the sake of our children, it’s time we demanded more from the apps increasingly shaping our lives.