This is part 3 of 3 in a series on Big Data and crime prevention. It considers how citizens are protected from harm caused by their personal data, and whether there are additional actions law enforcement and the courts can take to reduce data’s potential for harm.

In the United Kingdom, the continued use of potentially inaccurate data patterns and predictions has spurred debate. Some dispute the identified bias and believe that data sets and predictive analytics always reflect objective truth. Others see profiling as an unavoidable and ugly ethical and legal dilemma, raising concerns about whether citizens’ rights are being adequately protected. So, what checks and balances exist to protect against personal data being used by law enforcement and the courts in a way that causes harm?

In 2009, the Coroners and Justice Act created the Sentencing Council, replacing the dated Sentencing Guidelines Council. The new Council, like its predecessor, monitors sentencing practice and issues guiding principles, but it also places an arguably greater emphasis on strengthening the public’s faith in the criminal justice system. Campaigning by entities such as the Criminal Justice Alliance has contributed to increased public awareness of Big Data’s potential negative impact on offenders and on society. A judge who imposes harsher criminal sentences on people from certain backgrounds is perceived by the public to be exercising bias; whether those notions are actually held by the judge or merely produced by inaccurate predictions is immaterial. To restore public confidence in the judicial system, the Sentencing Council is likely to find such a judge in breach of the Coroners and Justice Act.

Police forces are expected to manage and use data in accordance with the relevant legislation and policies, including the Data Protection Act (DPA) 1998 and the incoming General Data Protection Regulation (GDPR).

Under the DPA, individuals are granted the right to be protected from distress arising from the processing of their data, as well as protection against profiling. It is possible that, by using biased data, law enforcement conducts unfair processing, which would be a clear breach of the DPA. However, while profiling by the police has been shown to have negative ramifications, the police are not necessarily breaching current legislation by applying this practice. Law enforcement is shielded from accusations of breach arising from profiling by the public benefit exemption, which individual police departments exercise by notifying the Information Commissioner’s Office of their intent to rely on this justification in the pursuit of criminal justice.

The GDPR sets out stricter rules than its predecessor, especially concerning the collection, storage, and use of data. For example, the GDPR adopts a more restrictive approach to consent: consent must be ‘specific to the distinct purposes of processing’, and data subjects are able to revoke their consent at any time. This poses a challenge to current big data practices, because the data being analysed today has often been collected for a primary purpose unrelated to the secondary purpose it later serves.

The GDPR introduces safeguards against the risk that a potentially damaging decision about an individual is taken solely through automated processing and profiling, without human intervention. These rights work in a similar way to existing rights under the DPA. Individuals have the right not to be subject to a decision based solely on automated processing where it produces a legal effect, or a similarly significant effect, on them. Individuals subject to automated processing must be able to:

  • obtain human intervention;
  • express their point of view; and
  • obtain an explanation of the decision and challenge it.
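The three rights listed above can be pictured as a simple review workflow. The sketch below is purely illustrative: the `Decision` record, its fields, and the review queue are assumptions for the sake of the example, not any real system used by law enforcement.

```python
from dataclasses import dataclass, field

@dataclass
class Decision:
    """A hypothetical automated decision about an individual (illustrative only)."""
    subject_id: str
    outcome: str
    significant_effect: bool          # does it legally or similarly affect the subject?
    human_reviewed: bool = False
    subject_comments: list = field(default_factory=list)

def requires_human_intervention(decision: Decision) -> bool:
    # Per the safeguards described above, a solely automated decision with a
    # legal or similarly significant effect must be open to human review.
    return decision.significant_effect and not decision.human_reviewed

# Route flagged decisions to a human reviewer and record the data subject's
# point of view before the decision is finalised.
queue = [Decision("A-101", "refuse", True), Decision("A-102", "grant", False)]
to_review = [d for d in queue if requires_human_intervention(d)]
for d in to_review:
    d.subject_comments.append("Subject disputes the input data.")
    d.human_reviewed = True
```

Only the first decision is routed for review, since only it produces a significant effect without prior human involvement.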

However, this right does not apply if the processing is authorised by law (e.g. for the prevention of fraud or tax evasion), so law enforcement may still be able to justify its use of such technology.

The GDPR defines profiling as any form of automated processing intended to evaluate certain personal aspects of an individual. When processing personal data for profiling purposes, the controller must ensure that:

  • the processing is fair and transparent by providing meaningful information about the logic involved, as well as the significance and the envisaged consequences;
  • it uses appropriate mathematical or statistical procedures for the profiling;
  • it implements appropriate technical and organisational measures to enable inaccuracies to be corrected and minimise the risk of errors; and
  • it secures personal data in a way that is proportionate to the risk to the interests and rights of the individual and prevents discriminatory effects.
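As one illustration of ‘appropriate mathematical or statistical procedures’, a processor could compare adverse-outcome rates across groups to detect a potentially discriminatory effect. The sketch below is a minimal example with made-up records and function names, not a prescribed compliance method.

```python
from collections import defaultdict

def flag_rates_by_group(records):
    """Rate of adverse ('flagged') outcomes per group.

    `records` is a list of (group, flagged) pairs; the groups and data
    below are invented purely for illustration.
    """
    totals, flagged = defaultdict(int), defaultdict(int)
    for group, is_flagged in records:
        totals[group] += 1
        flagged[group] += int(is_flagged)
    return {g: flagged[g] / totals[g] for g in totals}

def disparity_ratio(rates):
    """Ratio of highest to lowest group rate; 1.0 indicates parity."""
    lo, hi = min(rates.values()), max(rates.values())
    return float("inf") if lo == 0 else hi / lo

records = [("A", True), ("A", False), ("A", False), ("A", False),
           ("B", True), ("B", True), ("B", False), ("B", False)]
rates = flag_rates_by_group(records)   # A: 1/4 = 0.25, B: 2/4 = 0.5
```

Here group B is flagged at twice the rate of group A; a disparity like this would prompt a closer look at the data set and the model before the output is relied upon.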

However, these rules can again be circumvented if the processing is necessary for reasons of substantial public interest on the basis of EU or Member State law. Such processing must be proportionate to the aim pursued, respect the essence of the right to data protection, and provide suitable and specific measures to safeguard the fundamental rights and interests of the individual. It will be interesting to observe how the permissions established by this exemption interact with the new consent rules. Would stricter consent rules limit the police’s use of data in predictive policing? Or would those rules simply not apply to the police, because the profiling is ‘justified’ in the name of protecting society?

Even in light of the stricter regulations in the GDPR, the current big data revolution shows no sign of dwindling anytime soon. For that reason, it is imperative that the unfair and biased outcomes caused by problematic data and problematic processing are actively recognised and controlled. While the complete elimination of bias is unfeasible, entities handling the data should take thorough measures to ensure that the data sets being processed are as refined as possible. In addition, police forces and the courts should work to reduce the adverse effects triggered by the presence of bias.

One way to achieve this is to reject today’s overwhelmingly quantitative analysis and understanding of the data in favour of a combined quantitative and qualitative approach. As the researcher Michael Patton once said, ‘qualitative data puts flesh on the bones of quantitative results’.

Conducting basic qualitative research by asking questions such as ‘how was the data collected?’ and ‘who contributed to the data?’ goes a long way towards providing contextual background and indispensable depth to the numbers. This additional information not only deters blind reliance on quantitatively discovered patterns but also helps ensure that predictions are more accurate and reliable. The resulting increase in trustworthy predictions would help to transform policing and criminal sentencing.
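A minimal sketch of what such a qualitative check might look like in practice, assuming hypothetical provenance fields attached to each record (all field names here are invented for illustration):

```python
# Attach qualitative provenance to quantitative records so that predictions
# built on poorly documented data can be flagged rather than trusted blindly.
dataset = [
    {"value": 12, "collected_by": "survey",  "collection_method_documented": True},
    {"value": 30, "collected_by": "unknown", "collection_method_documented": False},
]

def provenance_ok(record):
    """Basic qualitative questions: who contributed the data, and how was it collected?"""
    return record["collected_by"] != "unknown" and record["collection_method_documented"]

usable = [r for r in dataset if provenance_ok(r)]
flagged = [r for r in dataset if not provenance_ok(r)]
```

Records whose origin is unknown or undocumented end up in `flagged`, prompting further investigation before they feed into any prediction.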

For more, see articles one and two.