6 Apr 2023

Do We Really Have Free Will Online?

Damian Collins MP talks cat videos, Cambridge Analytica and why we all need to care more about online privacy

Damian Collins, Chair of the Joint Committee on the Draft Online Safety Bill, is using a curious analogy to explain why more regulation is needed to protect us online: cats.

“If you really like, say, cats – then you’re going to see cats,” he says of the internet experience of a typical cat lover. “When you go on YouTube, 70% of what people watch is presented to them by the platform. It’s recommended to them. So, if you like cats, you’ll see a lot of cats.”

But what if you were being told lies about cats? What if this algorithm were deliberately hijacked and used to convince you that cats were not benign pets, but ruthless trained criminals intent on bringing down humanity?

As absurd as it sounds, this example is why Collins believes we need to do more to protect everyone online – particularly those who are vulnerable. And it shows that even if we haven’t personally experienced any negative impact from this ecosystem, plenty of people have.

Improving data regulation

Collins is the man tasked with improving data regulation in the UK. Sitting in his office in Westminster, the Conservative MP for Folkestone and Hythe is fighting the war on data – a journey that, he confesses, has led him to stop using certain apps and grow obsessive over his phone’s privacy settings.

To understand where we are now, Collins points to a catalyst event in 2018.

“I think the Cambridge Analytica scandal is the moment we started to understand this world a lot better,” he explains. “We began asking questions about exactly what data is gathered, how it’s used, and who oversees this.”

The scandal lifted the lid on the murkier side of ‘Big Tech’ as it was exposed that Cambridge Analytica had used a quiz app on Facebook to collect data about anyone who used it.

“It showed people that the amount of data gathered about them online is far more than they thought. A personality trait app was used to scrape Facebook data about anyone who used the app – and that of their friends – without anyone’s knowledge,” he explains.

“The second thing it demonstrated is that that data can end up in the hands of the sort of people you would never have given it to. And they can use it for things you would never have consented to.”

This is where things get more complicated – and, if we’re honest, many of us lose interest. The truth is we are being unknowingly manipulated. Our experience online is not one of free will – it’s one that a handful of powerful companies are deciding on our behalf. What we see online is decided by algorithms based on information gleaned about us from observing our habits.

Does it matter? That depends on who you are, according to Collins.

“Your experience of the internet – if you are using social media apps – is that you are shown content you are interested in. And you are shown that content because the platform wants to keep you on it for as long as possible and make you return as often as possible,” he explains.

To return to his cat example: if you love cats, you’re going to be served up a lot of cats. For many of us, this is not an issue. Cat lovers get their dopamine hit of cute kittens, fitness enthusiasts see some more workout tips, and everyone’s day is a bit brighter.

The vulnerability factor

But what if your interest is in extremist politics? Or conspiracy theories? What if you’re a vulnerable teenage girl who has had problems with depression and has begun looking at self-harm content? Then you will likely be served more of this content – often with inadequate checks and balances.

“People who are vulnerable are, in a way, the most likely to be targeted with content – and this makes them even more vulnerable,” explains Collins. “And it’s only made possible by the way companies gather data and use that data to profile people.”

Collins cites the Covid-19 pandemic as one example.

“We saw this during the pandemic, with examples of the wellness community being targeted with anti-vaccine content. For some people this results in a journey of radicalisation. And there’s nothing you can do to stop it.”

And herein lies the nub of the issue. We are outraged when we find that something reported on the news is untrue – yet every day we engage with our own personal news service that carries none of the checks and balances the media is required by law to apply. And we never consented to this.

This might not matter to you personally, but it is a society-wide problem that causes unnecessary harm to vulnerable individuals – and it affects some far more than others. That should worry all of us.

Putting it in context, Collins explains:

“If you take a company like Facebook, with 2.8 billion monthly active users, and say 10% of those people have a really horrible time because of the way data is used to target them with stuff they don’t want to see… that’s 280 million people having a rubbish time every month.”

It is not, Collins is swift to point out, that the platforms as a whole are damaging.

“The net effect might suggest that the platform is very positive, but you can still have massive silos of harm within that.”

The morally right thing to do, as in the rest of society, is whatever best helps those at risk.