Legal technology, often known as ‘legaltech’, has emerged as an increasingly significant area of investment for entities that operate in the legal sector, from traditional law firms to alternative legal service providers (ALSPs) and in-house legal teams.
Legaltech focuses on improving efficiency by automating important but low-value or high-volume work, freeing lawyers to concentrate on tasks requiring higher-level skills.
This delivers time and cost savings for lawyers, who are increasingly being asked by clients to streamline their processes and cut costs. For example, AI can speed up contract review, responses to client queries, statutory legal research, and document automation while reducing the chance of human error.
Generative AI in the legal sector
Interest in generative AI in 2023 has also spurred professionals to explore and harness its capabilities in the legal sector. These range from legal document generation to predictive analytics to estimate case outcomes and use data to identify market trends, risks, and opportunities.
For example, in February 2023, Allen & Overy launched its generative AI platform, integrating Harvey (based on GPT-4 technology by OpenAI) into its global practice to help its lawyers draft legal documentation. In December 2023, Allen & Overy also created an AI contract negotiation tool, ContractMatrix, in partnership with Harvey. The firm estimates that the tool saves around seven hours per contract negotiation. Discussing these innovations, David Wakeling, Head of the Markets Innovation Group at Allen & Overy, explained that the firm’s goal was to “disrupt the legal market before someone disrupts us.”
Early adopters within the legal industry have embraced generative AI tools from major companies such as Microsoft and Google. Rather than building their own internal products, law firms are expected to integrate large language models (LLMs) primarily as add-ons within their existing technology stacks.
Attitudes towards legaltech
Law firms have traditionally had a reputation for a conservative approach to technology adoption, owing to the privacy and data integrity risks involved. While this mentality has been changing in recent years, clear reservations remain.
Generative AI systems pose several challenges, ranging from hallucination and bias to concerns around privacy and the spread of misinformation. Hallucinations occur when the technology produces incorrect or misleading information. Even with tools already in use, such as Harvey, lawyers are warned that they must fact-check any information the tool generates; its primary purpose is to provide a starting point for documentation that lawyers then edit.
Scepticism over how trustworthy output from generative AI can be was validated when two New York lawyers were fined in June 2023 after submitting a legal brief with six fake case citations generated by ChatGPT.
In late March 2023, the Thomson Reuters Institute conducted a survey gathering insight from more than 440 lawyers at large and midsize law firms in the United States, United Kingdom, and Canada.
When exploring attitudes towards ChatGPT and generative AI for legal work, 82% of respondents said that generative AI or ChatGPT can be applied to legal work, yet only 51% believed it should be. The gap reflects concerns many lawyers still share about the confidentiality and security of source material used to generate AI output, accuracy risks, and ethical objections.