
Social Media Still Isn’t a Safe Space for Children. A Crackdown Is Underway


Children’s online safety is in drastic need of a reset — not only in the eyes of many parents, but also of governments, regulators and young people who have had negative experiences while using social media.

In the US, President Biden has called for stricter curbs on social media companies, and the Kids Online Safety Act has been gaining traction in the Senate. But the US is still a step behind the UK, which announced draft rules on Wednesday that will force tech companies to take new steps to keep children safe or risk fines or bans. Regulator Ofcom outlined 40 practical steps under the Online Safety Act that social media companies will have to follow to keep kids safe on their platforms.

There are two aspects to the new rules. First, they will require tech companies whose platforms may be used by children to enforce more stringent age verification to prevent young people from accessing age-inappropriate content. Second, companies must improve moderation and “tame toxic algorithms” by filtering out or downgrading harmful content, including pornography and references to self-harm, suicide and eating disorders.

Tech companies that fail to abide by the rules will face fines, or even jail time for executives. While the rules will only apply in the UK, Ofcom hopes the severity of the sanctions will push the people on the boards and in the executive suites of the biggest social media platforms to prioritize children’s safety more resolutely than they have in the past. If that happens, any changes the platforms introduce could have far-reaching effects on young social media users across the world.

Social media platforms have long faced accusations that they’re contributing to mental health problems by failing to protect children from encountering harmful content. In some cases, including the death of the British teenager Molly Russell, this has had tragic consequences. Russell died by suicide in November 2017 at the age of 14 after viewing extensive material relating to self-harm on Instagram and Pinterest. The coroner in her case concluded that the content Russell viewed contributed to her death and recommended that social media sites introduce stricter provisions for protecting children.

Over the years, tech companies have stepped up their moderation efforts and introduced new features, tools and educational resources to keep kids safe. But as many children and parents know all too well, many issues remain. Children are still exposed to content that’s harmful to their wellbeing and are at risk of exploitation by adults using the same platforms. And it’s not only the adults who have had enough.

In designing the new UK rules, Ofcom consulted with over 15,000 children, said the regulator’s group director for online safety, Gill Whitehead. “When we speak to children and you ask them, ‘what do you want to change about the social media service?’ They tell you — they’ve got lots of ideas,” she said. That feedback from children has informed measures such as giving them the power to automatically decline group chat invites and to say they want to see less of specific types of content in their feeds.

Following a consultation, Ofcom expects to publish its new child safety code within a year. Under the new rules, tech companies will then have three months to assess the risk their services pose to children and report it publicly. They’ll also have to be transparent about the steps they are taking to mitigate that risk.

As the rules come into force, Ofcom plans to continue consulting with young people, not only to see if measures introduced by tech companies are effective, but to identify new threats as they arise. Already, the regulator is planning a further consultation around the threats generative AI may pose to children.

“We’ve set out a number of messages today about how those reporting channels to tech firms must be improved,” said Whitehead. “But ultimately, if the feedback comes when we’re speaking to children that those reporting channels are ineffective, then that will be part of the plan of action that we have with those largest and riskiest firms as to how they’re going to deal with that.”


