
Most of us live our lives in cultural and political echo chambers. It’s nothing new and it’s completely natural. But over the past 15 years something has changed. As more research delves into the effects of social media on political behaviour, it is increasingly apparent that these changes pose a far more subtle and pernicious risk to democracy than even the Cambridge Analytica episode suggested.

Neil Kinnock’s defeat in 1992 exemplified the power our media once wielded to disseminate policy and persuade the electorate. It may have been the Sun “wot won it” back then but make no mistake, that time has passed. Parties’ policies and the media’s role in advocating them are no longer the driving force behind our elections. The data scientists have inherited the earth.

If our social environment has always played a major role in determining our vote, what is really different today? We’ve always lived with a partisan media and it’s natural to surround ourselves with people who believe what we believe. Surely nothing has fundamentally changed. 

The real difference is that digital social networks are unlike anything society has seen in the past. They are different in architecture, they are incredibly fluid and they are routinely manipulated by outside forces. 

Today, the average Briton spends over two hours a day on their smartphone, much of it on social media, where they will have, on average, over 150 social connections. To maximise engagement, the platforms shape which of those connections you see and interact with, creating an artificial perception of what is happening around you.

Today, around 80% of our electorate is spending over two hours per day in an artificial environment, constantly reshaped by an artificial intelligence specifically designed to maintain our engagement. This carries three significant risks: it shapes our cultural and political perception; it is being competitively manipulated by marketeers, digital socialites and others; and it can be used, more directly, to influence your vote. 

In 2019 scientists at the University of Houston, MIT, the University of Pennsylvania and Oxford, led by Alexander J Stewart, examined the patterns of behavioural change that can be achieved by manipulating subjects’ social networks. Alarmingly, they found that votes can be manipulated without any need to change people’s cultural or political perception. They call this technique ‘information gerrymandering’.

It turns out that knowing how others around you will vote is a key influence over your own vote – irrespective of your personal beliefs. 

Having a sense of how your friends and colleagues are likely to vote is nothing new, but the tests showed that over 10% of the vote could be skewed simply by changing the shape of a subject’s network of friends to artificially inflate the number of friends voting for the other side. It suggests that the more we are surrounded by people of a particular persuasion, the more easily we are able to rationalise changing our own vote. 
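The intuition behind this result can be illustrated with a toy simulation. This is a deliberately simplified sketch in which agents occasionally defer to the majority they can see in their feed; it is not the model Stewart et al. actually used, and the parameter values and numbers are illustrative only:

```python
import random

random.seed(1)

def vote(true_pref, visible_votes, conformity=0.3):
    # With probability `conformity` the agent defers to the majority it
    # can see; otherwise it votes its own preference. A tied feed falls
    # back to the agent's own preference.
    b = visible_votes.count('B')
    a = visible_votes.count('A')
    if b != a and random.random() < conformity:
        return 'B' if b > a else 'A'
    return true_pref

def election(n, skew):
    # n agents, evenly split in true preference between 'A' and 'B'.
    # `skew` is the fraction of each agent's visible 10-person feed
    # filled with 'B' voters regardless of the real 50/50 split --
    # the 'information gerrymander'.
    feed = ['B'] * round(10 * skew) + ['A'] * (10 - round(10 * skew))
    a_share = sum(
        vote('A' if i % 2 == 0 else 'B', feed) == 'A' for i in range(n)
    ) / n
    return a_share

print(election(10000, 0.5))  # balanced feeds: exactly 0.5 (ties ignored)
print(election(10000, 0.9))  # gerrymandered feeds: A's share falls well below 0.5
```

Even in this crude model, filling 90% of each feed with one side’s supporters moves the final tally by well over ten points, despite no agent’s underlying preference having changed.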

In the context of social networks like Facebook and Twitter it is important to understand that the process of manipulating your network is intentionally left wide open to outside influence. The algorithms that define each individual’s echo chamber are designed to capture that individual’s attention and work towards monopolising it. By producing apolitical, entertaining and shareable content, I can train the algorithms to show my own content, gerrymandering the information you see.

What Stewart et al. show is that when the time is right, there doesn’t need to be a contextual shift in politics in order to swing your vote – you can be swung simply by how others in your field of view intend to vote.

It has been 15 years since the now familiar ‘personal timelines’ of Twitter and Facebook came to our screens. In that time legislation has moved forward across Europe to begin to protect our personal privacy in the form of GDPR, PECR and forthcoming ePrivacy rules. However, nothing has been done to regulate the degree to which our perception of the world around us is being artificially manipulated by social platforms with their own commercial agendas. 

From an engineering standpoint, where legislation has been introduced it has often failed to curb the dark practices of technology companies. 

In the case of something as simple as third-party cookie tracking we have simply shifted responsibility to the consumer, who typically has neither the skill, understanding nor patience to protect themselves. To access, for instance, Delia’s marmalade recipe, you will be asked to accept tracking from 513 separate companies.

Refusing consent is always harder than agreeing to it. We are all asked so incessantly that in this game of attrition most people simply give in. Ultimately the consent is meaningless.

In 2024 this country will go back to the polls. But our vote is arguably no longer determined by the state of the economy, by party political messages or by our personal beliefs. The outcome of the 2024 election is already in the hands of the geeks who play the platforms’ algorithms to shape our social circles.

Until we regulate the artificial construction of social echo chambers nothing will change. And for as long as the shape of my echo chamber can be manipulated, what you say and how you say it has no impact on my vote. Until then, all of your policies are just fake news.

Jim Morrison is founder of OneSub and owner of Deep Blue Sky. This article first appeared in our Centre Write magazine Digital disruption?. Views expressed in this article are those of the author, not necessarily those of Bright Blue.