The Queen’s Speech on 11 May included the government’s long-awaited Online Safety Bill, a draft of which was published the following day. This is a major piece of legislation with significant implications for the regulation of digital technology companies in the UK. However, the publication of the draft bill has confirmed the absence of measures that were initially put forward to protect democracy from online harm.

The Online Harms white paper published in April 2019 set out a position that recognised the dangers digital technology could pose to democracy and proposed measures to tackle them. This was followed by an initial consultation response in February 2020 and a full response in December. In the course of the policy’s development, the democracy aspect of the proposals disappeared, and the government now points instead to other areas of activity. This represents a notable retreat from the ambition of the white paper, which promised to address online harms ‘in a single and coherent way.’
Online Harms white paper: April 2019
The white paper set out the government’s intention to create a statutory duty of care that would make companies responsible for harms caused on their platforms. This would cover illegal content, such as child abuse and terrorist material, but also some forms of harmful but legal content, including disinformation and misinformation. The white paper explicitly framed some of its proposals for tackling online harms in terms of their consequences for democracy. It detailed some of the harms that can be caused, including the manipulation of individual voters through micro-targeting, deepfakes, and concerted disinformation campaigns. It concluded that online platforms are ‘inherently vulnerable to the efforts of a few to manipulate and confuse the information environment for nefarious purposes, including undermining trust’. It also recognised the distinction to be drawn between legitimate influence and illegitimate manipulation.
The white paper also set out what the government expected to be included in the regulator’s code of practice, and what would be required to fulfil the duty of care. This included: using fact-checking services, particularly during election periods; limiting the visibility of disputed content; promoting authoritative news sources and diverse news content; and processes to tackle those who misrepresent their identity to spread disinformation. It stated that action was needed to combat the spread of false and misleading information in part because it can ‘damage our trust in our democratic institutions, including Parliament.’
Initial consultation response: February 2020
The government’s initial response to the consultation on the white paper, which ran between April and July 2019, marked a shift in emphasis. It reframed the online threat to democracy purely as a possible encroachment on freedom of expression. It was also unclear whether disinformation would in fact fall within the scope of the legislation at all, commenting only that, ‘Many civil society organisations also raised concerns about the inclusion of harms which are harder to identify, such as disinformation’.
When asked in the House of Commons whether the proposals still covered threats to democracy, the junior minister at the Department for Digital, Culture, Media and Sport (DCMS), Matt Warman, said: ‘The work that the Cabinet Office is doing on protecting democracy is a hugely important, albeit complementary, part of the process, rather than something that is covered by online harms.’ This change in policy direction received relatively little comment at the time. It is also worth noting that Jeremy Wright, who has since become a vocal backbencher on this issue, was sacked as Secretary of State for DCMS when Boris Johnson became Prime Minister in July 2019 – between the publication of the white paper and the initial consultation response.
Democracy and Digital Technologies Committee: June 2019–June 2020
In June 2019, the House of Lords appointed a select committee, chaired by Lord (David) Puttnam, to examine the relationship between digital technology and democracy. Its report Digital Technology and the Resurrection of Trust made a number of recommendations for the online safety legislation. The report praised the direction of travel outlined in the Online Harms white paper – with its recognition that online platforms could be used to undermine trust in democratic institutions – but was clear that the legislation must include misinformation and disinformation to be effective in this respect. The committee also proposed that the duty of care should explicitly encompass harms to democracy as well as to the individual.
The government’s response to the committee’s report stated that additional information regarding the remit of the duty of care would follow in the full consultation response. However, it implied that the duty would not extend to actions that undermine democracy, as democracy-related issues were being taken forward as part of the Cabinet Office-led Defending Democracy programme. It left open whether tackling some forms of disinformation and misinformation would remain part of the proposals.
Full consultation response: December 2020
The full consultation response confirmed that disinformation and misinformation were still in the picture – but only content that ‘could cause significant harm to an individual’, such as anti-vaccination material. It made clear that, contrary to the recommendations of the Democracy and Digital Technologies Committee, the duty of care would apply only to harms to individuals – harms to minority groups and to democracy would not be within scope. Much of the ambition of the white paper had gone from the plans. Although still an important development in the government’s approach to digital technology companies, the Online Safety Bill would no longer be a major vehicle for tackling the challenges that online platforms pose to democracy.
This did not go entirely unnoticed by parliamentarians at the time. In the House of Lords debate on the full consultation response, Lord McNally expressed his frustration that the government were ‘choosing to ignore’ the recommendations of the Democracy and Digital Technologies Committee’s report. In the Commons, Angela Eagle asked, ‘What action would the Bill take to defend our democratic values if it was on the statute book now?’ The answer, once again, was a reference to the Defending Democracy programme.
The Defending Democracy programme
The Defending Democracy programme was announced in July 2019 as a cross-Whitehall initiative bringing together civil society, the intelligence agencies and government departments, with four priorities for democracy: protect, strengthen, respect, promote. The programme covers electoral integrity and related online transparency issues. A technical consultation on extending imprints to digital campaign material – so that people can see who is behind paid-for political content – ran between August and November 2020, and a ‘Counter Disinformation Cell’ was set up in March 2020 to deal with coronavirus-related false narratives. However, the work being taken forward by the programme does not so far appear to cover, or compensate for, much of what was initially proposed in the Online Harms white paper. Furthermore, the Defending Democracy programme was described by the Intelligence and Security Committee’s report into Russia as ‘rather fragmented’ and as having been ‘afforded a rather low priority.’
The end of the era of self-regulation?
In February 2019, at the outset of the process outlined above, the then Secretary of State, Jeremy Wright, announced in the Commons that ‘the era of self-regulation of the internet must end.’ Facebook CEO Mark Zuckerberg added his voice to those calling for greater regulation of the internet, writing in the Washington Post that he had ‘come to believe that [Facebook] shouldn’t make so many important decisions about speech on our own.’
However, when it comes to communications that can undermine trust in democracy, the government has largely abandoned the field. The substantial measures mentioned in the white paper to tackle political disinformation and promote fact-checking and trusted sources are absent from the draft Online Safety Bill. The government’s position is now: ‘Policy or political arguments – both online and offline – which can be rebutted by rival campaigners as part of the normal course of political debate are not regulated and the government does not support such regulation. It is a matter for voters to decide whether they consider materials to be accurate or not.’
The draft legislation contains new provisions that reflect this position. Major platforms will have a duty to protect content defined as ‘democratically important’ in order to ‘uphold democratic debate online’. This is defined broadly as content ‘intended to contribute to democratic political debate in the United Kingdom’. These kinds of measures have been pushed for by Republicans in the US, especially since Donald Trump was banned from several major social media platforms.
Yet the Capitol Hill riot in January – where political disinformation and wild conspiracy theories sowed the seeds of unrest – vividly demonstrated the possible consequences of adopting a laissez-faire approach. The legislation will now be scrutinised by a joint committee, which will provide an opportunity for MPs and Peers to carefully examine the potential consequences of the proposed approach. As it stands, the government’s proposals also leave open the possibility of extending the duty of care through secondary legislation – which could potentially include online harms to democracy. It would be preferable, however, to act pre-emptively, rather than waiting for events to force the government’s hand.
This is an edited version of an article which originally appeared on the Constitution Unit blog. It is republished with their permission. The original can be found here.
Alex Walker is The Constitution Society’s Communications Manager. He manages, edits and contributes to the blog.
The Constitution Society is committed to the promotion of informed debate and is politically impartial. Any views expressed in this article are the personal views of the author and not those of The Constitution Society.