
Disinformation, or “fake news”, has occupied much of the UK’s political and social discourse over recent years. Measures to control it must be reformed before the threat to online safety grows worse.

Disinformation is false information deliberately, and often covertly, spread to influence public opinion and obscure the truth. People are often exposed to it through foreign propaganda, edited and doctored media, and fake social media accounts, and it can cause psychological and physical harm if left unaddressed. For example, it can poison attitudes towards politics, encourage decisions that damage the health of others, and prompt poor financial choices. In an online-dependent world, raising awareness, providing better training in media literacy, and tackling disinformation should be a key priority for the UK government.

A Department for Digital, Culture, Media and Sport (DCMS) inquiry, which reported in February 2019, found many instances of disinformation in UK media. Following the chemical nerve agent attack in Salisbury in 2018, the inquiry noted that the government had “judged the Russian state promulgated at least 38 false disinformation narratives around this criminal act”. Furthermore, Cardiff University and the Atlantic Council’s Digital Forensic Research Lab concluded that in January 2019 Facebook removed 289 pages and 75 accounts, with a combined following of 790,000, that had been run by employees of the Russian media firm Sputnik who had misrepresented themselves while spreading disinformation.

Ofcom has reported that tackling disinformation requires an improvement in media literacy: one’s ability to critically analyse the credibility of, or evidence of bias in, online content. To this end, DCMS introduced a £340,000 scheme to reskill media companies in avoiding harm online. Ofcom’s 2022 report into Adults’ Media Use and Attitudes showed that a third of internet users were unaware of the potential for inaccurate information online, that 30% did not know whether the information they read was truthful, and that 65% agreed there is a greater need for better online protection.

The Online Safety Bill proposes new rules on removing disinformation that will apply to firms, such as Meta, that host user-generated content. The Bill has been criticised for being too ambiguous, leading firms to over-censor content in a bid to comply with its vague new laws. Part 10 of the Bill addresses harmful communications, defined as those causing “psychological harm amounting to at least serious distress”. This definition does not set out clear conditions under which a regulator can identify disinformation, or decide which pieces of content cause “serious distress” rather than constituting an exercise of free speech.

Ofcom will be less effective at overseeing the removal of a growing volume of disinformation if it must follow the Bill’s vague protocols, which demand oversight by the Secretary of State. Glen Tarman, Head of Advocacy and Policy at Full Fact, has said that the Bill “would not be enough in a fast-moving environment”. Independent regulators should at least be granted the freedom to decide on and enforce the online safety measures they believe will deliver effective results. These measures should be well developed, algorithmic, and inclusive of a revised appeals system to avoid infringements of free speech. As drafted, the Bill would deny Ofcom ultimate authority to enforce online safety rules for media firms because it requires external moderation by the Secretary of State.

Additionally, the UK should adopt a more targeted approach to tackling disinformation by giving schools stronger support and funding to improve media literacy. Increasing the number of compulsory IT lessons in schools could drive up rates of media literacy and awareness of disinformation, and high-quality training on the issue would raise students’ ability to identify and respond to harmful disinformation. Ofcom, as an independent regulator, should be able to enforce its online safety rules on user-generated media platforms, and to impose proportionate sanctions when those rules are breached, without delays caused by external moderation of its policy.

The UK needs higher-quality educational support and increased funding to raise rates of media literacy. With a third of internet users unaware of the potential for inaccurate information online, and 65% agreeing that the country needs better online protection, reform in schools seems the best way to raise awareness and knowledge of how to identify and report disinformation. And Ofcom requires the independent authority to impose necessary and relevant online safety rules, without a backlog of external governmental supervision, in a way that does not interfere with the right to freedom of speech.

Hannah is currently undertaking work experience at Bright Blue. Views expressed in this article are those of the author, not necessarily those of Bright Blue.