OFCOM’s getting new powers to regulate social media - here’s what it needs to do

We can all agree on one thing. Social media needs to be better for us.

For all of the benefits of these tools, they have fundamentally changed society - in the spread of harmful content, in disinformation, and in how we see the world around us.

It’s a melting pot of mental health issues and fake news, and our governmental bodies have not done enough to keep up with it. Our guidelines for TV (a medium with an ever-shrinking audience) are far stricter than those for social media - the primary destination for a lot of people’s content wants and needs.

That doesn’t mean the likes of Facebook have not been doing anything about it - they have teams looking through this content and taking down anything harmful (much to the detriment of those moderators’ own mental health), and they’re trying to automate the whole process with AI algorithms.

But I don’t know... It just seems a little disingenuous to me. In the same vein as the gambling industry (or the tobacco industry a few decades ago), this is self-regulation purely to head off government legislation. The intention, no matter how heartfelt, has been to keep it in house, keep costs down by automating, and stop regulators knocking on the door to demand it be sorted manually. It’s obfuscation of inner workings that are not good, which is especially worrying when it comes to private companies that are literally changing the ways we interact with one another.

That is why, to the credit of this government, I’m happy to see an announcement that OFCOM is going to get new regulatory powers over social media companies in the UK. Great start!

Are they ready for the challenge? ...No, not even a little bit.

As I said before, they’re starting on the back foot with a rather archaic rulebook, but that doesn’t mean it can’t be done. It’s just got to be radical and rapid.

Let’s take the German approach

Our European friends have a great policy around this - if a social network has over 2 million users, it has to take down a harmful piece of content within 24 hours or face a fine of up to 5 million euros for each infraction.

See? Radical and rapid. The way to take control of this situation is not going to be through soft regulation. It will be through tough legislation like this, which either forces the companies to take it seriously or pushes them to get the help they need to bring things back in line.

Guy Cocker, technology expert and a friend who gave me my first big break into gadget news, kind of hits the nail on the head here. 

Use this. Enshrine it in law. That’s a damn good place to start.

Better mental health support for the workers

Let’s be honest - AI is not going to completely solve this problem. We’d love it to, but people will always find ways around the parameters a machine looks for in harmful content. And while it will learn and improve, humans will continue to stay one step ahead.

That’s why you need people working on this - but those people quickly become desensitised to this material and can suffer mental health issues of their own, like depression.

It’s one of those issues with no clear solution - this isn’t black and white, and you’ve got a whole lot of grey to get through. That’s why our only real answer will be privately funded mental healthcare (emphasis on “private”, because it should be up to the social media companies to cover this, not the NHS).

Regular therapy sessions could also provide a feedback loop for studies, letting experts map the psychological toll on these workers and improve the support they’re given long into the future.

Because while that desensitisation will happen, that doesn’t mean we can’t be ready for it.

Use it, don’t abuse it

One worry I can’t get out of my head while I write this: what happens if these powers get used for other things? What happens if, say, a government begins to take totalitarian control and manipulate public opinion through these tools?

And it’s not just me - newspapers have come out warning about this too.

The simple answer is to keep OFCOM independent, and I hope it stays that way. Tackling this problem is going to take a lot of investment, and from what I’ve seen of not just our government but governments across the globe, they may start looking for something in return.

Jason England

I am a freelance tech/gaming journalist, lover of dogs and pizza enthusiast. You can follow me on Twitter @MrJasonEngland.

http://stuff.tv/team/jason-england