MPs in the UK have called for a new Social Media Regulator to tackle the way user data is collected and sold to third parties, a practice that has fuelled an influx of misinformation affecting election outcomes.
The report heavily criticises Facebook for “intentionally and knowingly” infringing data protection laws and for allowing unidentified third parties to target users with adverts aimed at manipulating their voting preferences.
The report was designed to investigate “disinformation” and “misinformation” on social media (specifically avoiding the term “fake news”), in the wake of the Cambridge Analytica scandal, in which the political marketing firm exploited Facebook users’ personal data, including the voting patterns of millions in the USA.
The use of information like this has been linked to political outcomes many considered unlikely, including the success of Donald Trump’s presidential campaign and the Brexit vote in the UK in 2016.
The Report’s Commentary
Whilst Facebook and other social media companies have promoted self-regulation in the fields of data protection, copyright infringement and the prevention of material relating to terrorism, the 18-month inquiry has concluded that Facebook is no longer capable of self-policing, noting in its conclusions that “clear legal liabilities should be established for tech companies to act against harmful or illegal content on their sites”.
Head of Facebook, Mark Zuckerberg, was criticised personally, not only for failing to attend a Parliamentary enquiry into Cambridge Analytica, but also for Facebook’s response during the inquiry, which had “deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions”.
Ultimately the issue was the handling and selling of Facebook users’ personal data, with the report concluding that “[t]he Information Commissioner told the Committee that Facebook needs to significantly change its business model and its practices to maintain trust”.
This aligns with the report’s recommendation that the UK’s competition authorities begin their own investigations into Facebook on the basis that, in December 2018, Facebook offered preferential access to data to particular companies.
Alongside the £500,000 fine from the Information Commissioner’s Office for the Cambridge Analytica scandal, the report notes “it is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws. The ICO should carry out a detailed investigation into the practices of the Facebook Platform, its use of users’ and users’ friends’ data, and the use of ‘reciprocity’ of the sharing of data.”
Electoral Law Reform
The report notes the impact the misuse of personal data has on democratic society and recognises that “[e]lectoral law is not fit for purpose and needs to be changed to reflect changes in campaigning techniques”.
MPs have called for government action on the manipulation of voters and targeted political campaigning, acknowledging that “democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms” and have recognised the need to establish a duty of care owed to users.
Such ideas have been considered in continental Europe, as social media companies are required to take down hate speech in Germany, whilst French judges can order the removal of fake news.
The report makes five key recommendations to hinder the spread of misinformation.
1. The first step requires the establishment of a compulsory Code of Ethics created “by technical experts and overseen by the independent regulator, in order to set down in writing what is and is not acceptable on social media. This should include harmful and illegal content that has been referred to the companies for removal by their users, or that should have been easy for tech companies themselves to identify.”
2. Secondly, the Social Media Regulator would have the authority to commence legal proceedings against any entity in breach of the Code of Ethics.
3. Such companies would also be required to take down any content deemed harmful, as well as confirmed sources of misinformation.
4. Social media companies would then be taxed, the revenue from which would fund the new Social Media Regulator. The report reiterates the Chancellor’s 2018 budget, which aimed “to impose a new 2% digital services tax on UK revenues of big technology companies from April 2020”.
5. Finally, the report urged the government to reform electoral laws and to address foreign involvement in UK elections, stating “[t]he Government needs to ensure that social media companies share information they have about foreign interference on their sites—including who has paid for political adverts, who has seen the adverts, and who has clicked on the adverts—with the threat of financial liability if such information is not forthcoming.”
Despite criticism of its cooperation with the investigation, Facebook has stated its concern for the circulation of false news and the integrity of electoral systems, noting that it has made a “significant contribution” to the investigation and welcomes “meaningful regulation”.
Given the financial, reputational and restrictive burdens recommended, Facebook is unlikely to be pleased with the report’s contents. Ultimately, the report summarises the issue of how these companies interact with their users, stating “[s]ocial media companies need to be more transparent about their own sites, and how they work. Rather than hiding behind complex agreements, they should be informing users of how their sites work”.