
Safer Internet Day report: three quarters of UK children aged 10-12 have social media accounts

Yesterday was Safer Internet Day, co-ordinated by the UK Safer Internet Centre, which saw thousands of organisations get involved to promote the safe, responsible and positive use of digital technology for children and young people.

A report from the Centre shows that, overwhelmingly, young people want the internet to be a positive and inclusive place that respects people’s differences, and they see their peers helping to create this. The Centre’s online study of over 1,500 13-18-year-olds found that 94% believe no one should be targeted with online hate, and 93% have seen their friends post things online in the last year that are supportive, kind or positive about a certain group, for example, girls, LGBT people, disabled people, or those of a certain race or religion.

An estimated 2.1 million young people have done something online to show support to a certain group in the last year.

Sadly, it’s not all good news. The report also found:

  • More than four in five (82%) have witnessed online hate, having seen or heard offensive, mean or threatening behaviour targeted at or about someone based on their race, religion, disability, gender, sexual orientation or transgender identity.
  • Almost a quarter (24%) reported having been the target of online hate themselves in the last year because of their race, religion, sexual orientation, disability, gender or transgender identity.
  • As a result, some young people are self-censoring with nearly three quarters (74%) saying that online hate makes them more careful about what they share online.

While more than two thirds (68%) of those who had witnessed online hate in the last year say they know how to report it to a social network, in practice just a fifth (20%) actually reported it to the social network, app, game or website where they saw it.

The BBC also reports that more than three quarters of children aged 10 to 12 in the UK have social media accounts, even though they are below the age limit.

Most social media platforms have set 13 years as their cut-off point because of a US law called COPPA (the Children’s Online Privacy Protection Act), which dates back to 1998. WhatsApp is an unusual exception in setting its age limit at 16 years.

While the law mandates that online services seek “verifiable parental consent” from younger users, and restricts how they can use those users’ data, this is difficult to enforce in practice. Different social media platforms have different policies, any of which can be easy to get around, as proof of ID is rarely sought.

We all have a responsibility to help make the internet a safer place for children and other vulnerable people. Take a look at the UK Safer Internet Centre website for tips and advice, a helpline for parents, carers and professionals working with young people, and a hotline for the public to report online child sexual abuse content.

Related training:

Internet Safety

Safeguarding and Child Protection

Anti-radicalisation and Prevent Awareness Training