The introduction of analytical technology and artificial intelligence (AI) into the UK legal market has been greeted by many in the sector as a gamechanger, transforming the way lawyers process their data. Moreover, the introduction of AI may have brought us to the cusp of being able to predict the outcome of litigation through the use of algorithms. Of critical importance within all of these developments, however, are the ethical and regulatory implications, which are all too often overlooked.
Persistent throughout the discussion surrounding AI and Legal Tech is the overriding assumption that it will be humans, not ‘machines’, who will be subject to regulation. This has not been determined with any degree of certainty; suggestions regarding the regulation of AI systems themselves have already been made. Legal Tech must therefore be conceptualised, developed and adopted only in a manner which preserves the rule of law and access to justice on which our democratic society depends; both principles demand care and scrutiny.
Many will agree that the opportunities presented by smart contracts are numerous and broadly positive. Further Legal Tech developments, however, such as automated document review and e-discovery, legal research, litigation predictive analysis and online dispute resolution, all have the potential to significantly impact the barristers’ profession. As a cross-cutting activity at the Bar, disclosure serves as a major source of work for junior practitioners starting out in tenancy. The junior Bar stands as a rapidly deployable, skilled and cost-effective resource for disclosure work, and certain Legal Tech developments may disrupt this source of work. A reduction in work for the junior Bar poses a threat to the profession, and consequently to the justice system and society as a whole. Judges, furthermore, are appointed after years of practice as legal professionals and are vital for ensuring justice, fairness and equality, which stand as pillars of our legal system. As the Chair of the Bar, Amanda Pinto QC, emphasised in her inaugural speech in December, for all three of these principles to be engaged, “a judge very often needs to be the arbiter. Replacing judicial decisions, which involve an evaluation of the merits of a case and the exercise of a discretion, with an algorithm is not justice”.
It is important to consider the risks and dangers presented by the rise of data-driven technology, predictive analytics and AI in the absence of comprehensive regulation or legislative protections. The UK received a much-needed update with the advent of the Data Protection Act 2018, almost 20 years after its predecessor was passed into law. If the next update takes another 20 years to emerge, we may be staring down the barrel of a proliferation of technology with which legislation and regulation have not caught up. The judgment in R v Chief Constable of South Wales and ors [2019] EWHC 2341 raised significant questions surrounding accountability in law for facial recognition technology. In the case of litigation predictive analysis, it is not hard to imagine risks to the rule of law if it is not regulated either by legislation or by a specialist regulatory body.
Whilst welcoming the opportunities Legal Tech and AI present, we must refrain from adopting these innovations without the due diligence that both our lawyers and our democratic society deserve.
Melanie Mylvaganam, Bar Council Policy Manager: Regulatory Affairs, Law Reform and Ethics – Melanie’s policy areas include ethics, regulation in legal services, anti-money laundering and, previously, data and technology issues at the Bar.
Stuart McMillan, Bar Council Policy Analyst: Legal Practice and Remuneration – Stuart focuses on data, digitisation, and the effect of new technology on the Bar and the practice of barristers, including the impact of AI on the profession.