Milton Keynes: At some point in your life, you are likely to need legal advice. A survey conducted in 2023 by the Law Society, the Legal Services Board and YouGov found that two-thirds of respondents had experienced a legal issue in the past four years.

The most common problems were employment, finance, welfare and benefits, and consumer issues.

But not everyone can pay for legal advice. Of those survey respondents with legal problems, only 52 percent received professional help, 11 percent received assistance from others such as family and friends, and the remainder received no help.

Many people turn to the internet for legal help. And now that we have access to artificial intelligence (AI) chatbots such as ChatGPT, Google Bard, Microsoft Copilot and Claude, you might be wondering about asking them legal questions.

These tools are powered by generative AI, which generates content in response to questions or instructions. They can quickly explain complex legal information in a straightforward, conversational style, but are they accurate?

We tested chatbots in a recent study published in the International Journal of Clinical Legal Education. We asked the same six legal questions on family, employment, consumer and housing law across ChatGPT 3.5 (the free version), ChatGPT 4 (the paid version), Microsoft Bing and Google Bard. The questions were ones we commonly receive in our free online law clinic at The Open University Law School.

We found that these tools can indeed provide legal advice, but the answers are not always reliable or accurate. Here are five common mistakes we saw:

1. Where is the law from?

The first answers provided by the chatbots were often based on US law, and this was frequently not stated or made clear. Without this information, users will likely assume the law relates to where they live. The chatbots sometimes did not explain that the law varies depending on where you live.

This is particularly complicated in the UK, where laws differ between England and Wales, Scotland and Northern Ireland. For example, the law on renting a home in Wales is different to that in Scotland, Northern Ireland and England, while the Scottish and English courts have different procedures for dealing with divorce and the dissolution of civil partnerships.

Where necessary, we used a follow-up question: “Is there any English law that covers this issue?” We had to use this prompt for most questions, after which the chatbots generated answers based on English law.

2. Old law

We also found that the answers to our questions sometimes referred to old law that has since been replaced by new legal rules. For example, divorce law in England and Wales was changed in April 2022 to remove fault-based divorce.

Some responses cited this outdated law. AI chatbots are trained on large amounts of data, and we don't always know how current that data is, so it may not include the latest legal developments.

3. Bad advice

We found that most chatbots provided inaccurate or misleading advice when dealing with family and employment queries. Responses to housing and consumer questions were better, but there were still gaps. Sometimes the chatbots omitted important aspects of the law, or misinterpreted it.

We found that the answers given by the AI chatbots were well-written, which could make them seem more convincing than they are. Without legal knowledge, it is very difficult for someone to determine whether a given answer is correct and applies to their individual circumstances.

Even though this technology is relatively new, cases of people relying on chatbots in court have already been reported. In a civil case in Manchester, a litigant representing himself reportedly presented fictitious legal cases to support his argument. He said he had used ChatGPT to find the cases.

4. Too generic

In our study, the answers did not provide enough detail for someone to understand their legal issue and know how to resolve it. They provided general information on a topic rather than specifically addressing the legal question. Interestingly, the AI chatbots were better at suggesting practical, non-legal ways to solve a problem. Although this may be useful as a first step, it does not always work, and legal steps may be required to enforce your rights.

5. Pay to play

We found that ChatGPT 4 (the paid version) was better than the free versions overall, which threatens to further reinforce digital and legal inequality.

The technology is evolving, and there may come a time when AI chatbots are better able to provide legal advice. Until then, people need to be aware of the risks of using them to solve their legal problems. Other sources of help, such as Citizens Advice, will provide up-to-date, accurate information and are better placed to assist.

All the chatbots answered our questions, but said in their responses that it was not their job to provide legal advice, and recommended seeking professional help. Having conducted this study, we recommend the same. (The Conversation)
