On 29 October, OpenAI amended its usage policies to stipulate that ChatGPT cannot be used to provide legal advice, and social media platforms are awash with theories about what the implications will be.

The new terms say that you cannot use ChatGPT for the “provision of tailored advice that requires a license, such as legal or medical advice, without appropriate involvement by a licensed professional.”

While some on LinkedIn are saying that the effect will be to direct users to legal-specific AI platforms and/or licensed attorneys, the reality appears likely to be far less dramatic.

By warning users that they can’t use ChatGPT for legal advice, OpenAI is looking to limit its liability when things go wrong. In practice, however, ChatGPT is still undertaking legal activities.

We asked for a contract to buy a car, and ChatGPT asked whether we wanted a formal purchase agreement including warranties, representations, indemnities, and a signature section, plus a bill of sale at the end. We then asked whether we had a claim for negligence if our boss dropped hot water on our foot. ChatGPT set out the criteria needed to prove negligence (duty of care, breach, causation, and damage), the evidence to gather, practical next steps, and the specific legal framework and process. Our last question was whether ChatGPT could help us get divorced, and yes, it can (in theory) write the legal correspondence and draft a proposed financial settlement letter or statement. It also provides relevant case law to support, e.g., negligence claims: ChatGPT today (3 November) provided us with case law on the core employer duty of care; burns/scalds fact patterns and foreseeability; res ipsa loquitur; and more.

Whether the policy evolves further will be watched closely, and please do keep us posted. As things currently stand, though, the changes won’t prevent people from using ChatGPT for legal advice, regardless of what the terms say.

[email protected]

The post OpenAI changes ChatGPT’s usage policy to preclude legal advice appeared first on Legal IT Insider.
