The Home Office and the Department for Digital, Culture, Media and Sport (DCMS) have published a new Code of Practice for providers of social media platforms, offering guidance on the appropriate management of bullying, intimidating, humiliating and insulting behaviours.

It is published by the Secretary of State in accordance with Section 103 of the Digital Economy Act 2017 and applies to providers of social media platforms and to any sites hosting user-interactive content, such as review websites and online marketplaces.

The Code of Practice is a response to the Internet Safety Strategy Green Paper and anticipates the regulatory framework put forward in the Online Harms White Paper. The White Paper states an intention for "the UK to be the safest place in the world to go online" by using legislative and non-legislative measures to make social media platforms more responsible for users' safety, including the introduction of a new duty of care towards users which will be monitored and enforced by an independent regulatory body.

KEY PRINCIPLES

The new Code of Practice is based on four key principles aimed at providing online safety for all users. There is a specific focus on equality and diversity with an acknowledgment that women, minority racial and religious groups, LGBT persons and disabled people are disproportionately at risk of harmful conduct online. The key principles state that social media platforms should:

– Maintain a clear and accessible reporting process to enable users to notify them of harmful content. This includes non-users such as parents and teachers;

– Maintain efficient processes for dealing with such notifications. This includes sending an acknowledgement that a report has been received within 24 hours and communicating appropriately with the users concerned;

– Provide clear and accessible information about reporting processes in their platform's terms and conditions. This includes clear definitions of what constitutes harmful conduct and rules prohibiting specific conduct;

– Give clear information to the public about the action they take against harmful conduct. This includes explaining why reported content has not been removed, where that is the case.

At the moment the Code of Practice is guidance only, and social media platforms are not required to comply. However, it would be wise for such platforms to take heed and begin amending their policies in preparation for any regulatory changes flowing from the Online Harms White Paper, which will become clearer when the consultation closes in July 2019. In particular, they may need to consider their obligations under the Equality Act 2010 and ensure that users, especially vulnerable users, are protected from an ever-growing threat.