Our clients are grappling with how best to incorporate artificial intelligence (AI) into their businesses. From a legal perspective, ChatGPT has a range of limitations that need to be considered when leveraging the technology to create efficiencies and boost productivity in your business.

The basics: What is ChatGPT?

ChatGPT is a chatbot application that uses generative AI to process natural language. Given a text input in the form of a question, statement or command, it generates a response in a conversational style. The reason it is currently generating so much hype is the remarkable performance of its text generation: it can use very complex language, “read between the lines” and produce new content.

Limitations from a legal perspective

ChatGPT has a number of limitations that users need to be aware of. For example, whilst it may be useful for tasks that do not require it to ‘think’ critically or exercise good judgment (such as drafting a proforma “mutual release clause in a deed of settlement” or proof-reading an email), it does not yet have the ability to perform ‘higher order’ tasks (such as drafting submissions that address the nuances of a matter or respond to issues raised by the other party, or providing reliable legal advice).

Limitations with the quality of the output

“Hallucination”: ChatGPT has been known to make up content, including fake case citations. These issues are difficult to identify because ChatGPT does not show how it has arrived at a response.

Bias towards more frequently cited information: ChatGPT prefers older and more frequently cited cases in formulating a response and may disregard newer cases that may be more relevant in application or may have changed the law.

Challenges with jurisdictions: ChatGPT may not recognise the legal frameworks of specific regions.

Currently no real-time updates: ChatGPT’s training data (on the free GPT-3.5 model) is cut off at September 2021, and it is unable to account for more recent developments in the law.

Confidentiality

The creators of ChatGPT, OpenAI, have made it clear that: “We are not able to delete specific prompts from your history. Please don’t share any sensitive information in your conversations.” To the extent that solicitor-client information, or personal or commercially sensitive information, is uploaded to ChatGPT, it could breach particular obligations that may be owed, including:

  1. Rule 9 of the Solicitor Conduct Rules, which prevents a solicitor from disclosing confidential client information
  2. Australian Privacy Principles 1, 6, 8 and 11, which regulate how personal information can be used (generally only for the primary purpose for which it was collected), disclosed, and kept secure (including in a cross-border context).

Professional ethical obligations

The use of ChatGPT may also cut across various professional obligations to act honestly or fairly.

Firstly, ChatGPT is a chatbot that does not exercise any ethical or moral judgment when it synthesises data to produce a response. Further, in providing a response, it is constrained by the information in its training data. To the extent that data set contains biases, there is no way of identifying or correcting them.

Secondly, the output is not transparent. ChatGPT does not disclose how it has arrived at an answer to a question, and the correctness of the output is therefore difficult to monitor.

These limitations raise concerns for a range of professions and industries that owe clients and customers particular ethical obligations, including, for example:

  1. Lawyers, who have a range of fundamental ethical duties under rule 4.1 of the Solicitor Conduct Rules, including an obligation to avoid any compromise to their integrity and professional independence
  2. Financial services licensees, who have an obligation to provide the financial services covered by their licence “efficiently, honestly and fairly”
  3. Accountants, who have a duty to exercise professional competence and due care
  4. Government or statutory bodies, which are generally required to comply with various Codes of Conduct relating to acting with care, diligence and integrity, complying with applicable Australian laws and ensuring transparency and accountability when making decisions.

Finally, in some circumstances, the use of ChatGPT may breach the law, including the Privacy Act 1988 (Cth) and intellectual property law.

What should you do now?

Businesses should implement policies regarding the use of ChatGPT (and AI more generally). At a minimum, any policy should address the following: 

  1. Risk-based approach: Clear guidelines as to when and how ChatGPT/AI can be used
  2. Accountability: Specify the data that should not be input, such as personal, client, confidential or commercially sensitive information
  3. Transparency: Acknowledge when AI has been used to produce a response
  4. Education: Consider training and education for employees.

Further information / assistance regarding the issues raised in this article is available from the authors, Megan Palmer, Partner, and Natalie Oliver, Special Counsel, or your usual contact at Moray & Agnew.