Let’s Chat, ChatGPT…
On 30 November 2022, a company called OpenAI took the technology world by storm, launching a game-changing AI chatbot able to near-instantaneously explain complex concepts across a wide range of topics and formulate clear, accessible answers from scratch. This is why the legal industry has been examining its potential, and there is plenty of speculation regarding its impact.
How does it Function?
ChatGPT, fully known as ‘Chat Generative Pre-trained Transformer’, is an AI tool which uses a large language model trained by OpenAI and relies on a training method known as reinforcement learning from human feedback (RLHF).
However, for ChatGPT to generate any kind of content, it first has to be given an appropriate prompt. There are two circumstances in which it can be used: the first is a specific command where the intention is to generate entirely new content; the second is the input of specific data, content or text which the tool is asked to modify, expand or correct, and which may be protected or subject to special laws, such as personal data.
The chatbot is specifically trained, through the use of machine learning algorithms, to analyse large volumes of data and recognise language patterns and structures. Its model is designed to respond to human input in a conversational way and to generate human-like text based on the context of the conversation.
Its capacity to answer follow-up questions, acknowledge its own mistakes, and learn from past conversations makes it a truly innovative instrument in the tech realm. It can be used in a variety of dialogue applications, such as virtual assistants or customer service.
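The way ChatGPT handles follow-up questions can be pictured as a growing list of role-tagged messages, with the whole prior exchange re-sent alongside each new prompt so the model can use it as context. A minimal Python sketch, assuming the message format used by chat-style AI APIs; the helper function and the canned answers are purely illustrative, and no real service is called:

```python
# Each turn in a ChatGPT-style exchange is a role-tagged message:
# "system" sets behaviour, "user" asks, "assistant" answers.
# The full history is re-sent with every new prompt, which is how
# the model can handle follow-up questions in context.

def build_request(history, new_question):
    """Return the message list that would accompany a follow-up
    question (illustrative only; no API call is made here)."""
    return history + [{"role": "user", "content": new_question}]

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is a non-disclosure agreement?"},
    {"role": "assistant", "content": "An NDA is a contract that..."},
]

# The follow-up travels together with the entire prior conversation,
# so "a shorter summary" is understood to refer to the NDA answer above.
request = build_request(history, "Can you give a shorter summary?")
```

This also illustrates a point that matters later in this article: everything in the conversation, not just the latest question, is transmitted to the provider.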
From this perspective, ChatGPT can be used almost ‘limitlessly’, and depending on the range of information provided and how it is provided, that use can involve different risks for users.
Opportunities in Law?
The use of ChatGPT is flexible, spanning everything from complicated matters, such as explaining software code, to everyday suggestions, such as providing a recipe for what is left over in your fridge.
The use of AI is not new in the legal scene, and ChatGPT is no different. ChatGPT has been fed information from varied sources such as the internet, books, articles and magazines, which has helped it learn what will “usually” come after a word: it completes text based on what others have published in the various sources it has been fed.
Through tools like this, AI has long since penetrated the legal world, notably through research platforms and legal databases. Some even suggest that ChatGPT could be used to identify litigation strategies, review certain contracts and even draft legal documentation.
Lawyers’ work depends hugely on up-to-date research and information to build well-founded arguments and give day-to-day advice, which is why certain limits should be taken into account. OpenAI has pointed out that ChatGPT cannot provide accurate information beyond 2021, which corresponds to its knowledge cutoff, or time stamp. As laws and guidance change constantly, ChatGPT would not have access to the most current legal data sources, so the information provided will most likely be inaccurate and out of date. Furthermore, ChatGPT’s output requires critical scrutiny: especially when the task goes beyond merely summarising information, the tool is prone to providing false or incomplete answers that nonetheless remain plausible.
Even though ChatGPT’s answers can sound convincing, the capabilities of the platform are still very limited. In this respect, the system is not designed to give any advice. Lawyers should be aware that ChatGPT is no legal expert, and in the legal field error raises the question of responsibility: what happens when a wrong legal solution is given to a client? The answer becomes even more complicated, and more relevant, for lawyers who practise across different jurisdictions, where the law varies between them.
Solicitors are bound by their duties to the profession, including the duties of diligence and competence, and they carry their own civil liability in cases of professional misconduct. It is the solicitor who bears responsibility for the prejudice caused by incorrect or misleading legal advice. They, as the professionals, are held accountable and penalised, not the chatbot used.
What are the Legal Implications and Risks?
Although ChatGPT brings significant innovative benefits, it is by no means a perfect system and is subject to limitations. Primary legal concerns range from copyright issues to cyber-security risks.
Confidentiality and Ethical Concerns
As stated before, ChatGPT works by returning answers to prompts given by humans, and this is where its most important limitation lies. These prompts unlock almost endless capabilities, ranging from answering a simple question to uploading a document and asking the tool to review it under a certain jurisdiction.
According to OpenAI, the prompts entered may also be used for training purposes. OpenAI also recommends that users do not share any sensitive information1 in their prompts, as these prompts may be fed back into the system and at some point surface in answers given to other users, which could give rise to a breach-of-confidentiality dispute.
Personal Information You Provide.
Like many other apps, OpenAI collects information such as your name, contact information, account credentials, payment card information and transaction history. It states: “When you use our Services, we may collect Personal Information that is included in the input, file uploads, or feedback that you provide to our Services (“Content”)”2.
Personal Information We Receive Automatically from Your Use of the Services.
OpenAI states that they “may automatically collect information about your use of the Services, such as the types of content that you view or engage with, the features you use and the actions you take [...]3”.
In addition to the above, the policy also states that OpenAI may use this information to i) provide, administer, maintain and/or analyze the Services; ii) improve our services and conduct research; iii) communicate with you; iv) develop new programs and services...4. Although it is unclear exactly what information can be collected and how it can be used, it is evident that OpenAI has access to whatever the user inputs into the prompt.
This caused an uproar earlier this year, when one of the world’s biggest companies was caught in the middle of three separate confidentiality scandals involving ChatGPT. Employees of Samsung, the internationally recognised industry leader in technology, were caught feeding sensitive information into ChatGPT prompts. In the first two cases, employees pasted confidential source code into ChatGPT and asked it to find a fix; in the third, they shared confidential meeting notes and asked it to prepare minutes5. On a separate but similar occasion, ChatGPT suffered a security breach in which users could see other users’ conversations, which triggered concerns, in particular among the European regulators6.
Lawyers handle sensitive information on a daily basis, whether within the firm or with clients. If this information were to be revealed, it would affect not only the individual client but also, significantly, the business. It is therefore imperative that lawyers handle the information they are given with extra care, whether it is information they merely have access to or information that is shared with them.
With regard to ethical duties, each jurisdiction has its own guidelines and rules pertaining to ethics and the processes around them. That said, it is likely that all jurisdictions share the same core obligations of maintaining confidential information and providing competent legal advice to clients. Using ChatGPT without proper legal analysis and human judgement could potentially breach the ethical duties owed to the client by the lawyer and firm involved.
There is also uncertainty around who owns the written content generated by ChatGPT. One could argue that the copyright should belong to the original owners of the underlying data. However, as the chatbot is trained on vast quantities of text from an array of different sources, this creates an unrecognisable pool of original content, with the result that the original owner cannot be identified.
“As between the parties and to the extent permitted by applicable law, you own all Input, and subject to your compliance with these Terms, OpenAI hereby assigns to you all its right, title and interest in and to Output”7.
The wording of the above suggests that users can use the generated content for any purpose, which brings the risk that the same content may be generated for other users who ask similar questions. In either case, one would have to ensure that the AI is actually creating new content. Users should be aware of such issues and use AI-generated outputs as a source of inspiration rather than reproducing them verbatim.
ChatGPT is able to code instantly, which could prove a useful error and vulnerability detection tool for complex code. One advantage is that it can serve as a powerful cyber-security resource and could be used to monitor chat conversations for suspicious activity. On the negative side, there is a concern that ChatGPT may make a cyber criminal’s life easier, as it has the ability to impersonate others, write flawless text, and create code which can be misused by anyone with malicious intent.
If its capabilities can create a market in cyber incident management for cyber teams through simulations, they can also open up opportunities for malware development, ransomware, business email compromise (BEC) attacks, phishing attacks, spam, and impersonation attempts. Its code analysis could be used to guide hackers and fuel a climate which is already very hacker-friendly.
Whilst it is unlikely that ChatGPT itself will steal your data, anyone secretly monitoring your conversation with a chatbot could invade your security and privacy. ChatGPT’s backdoor capabilities must be properly understood and reined in to limit fallout. It is thus essential always to be cognisant of the information shared when engaging with ChatGPT and not to disclose confidential information such as your name, address, etc. In addition, adopting measures such as up-to-date software, firewalls, network detection and response, and antivirus tools will also help keep your personal data, your firm’s data and your clients’ data secure from any conversational AI system.
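One practical safeguard follows directly from this advice: scrub obviously identifying details from any text before it ever reaches a chatbot. A minimal Python sketch, assuming simple regular-expression patterns for email addresses and phone numbers; real redaction tooling would go much further (names, addresses, client references and so on):

```python
import re

# Very rough patterns for two common identifiers; production-grade
# redaction would use dedicated PII-detection tooling instead.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[A-Za-z]{2,}"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{8,}\d"),
}

def redact(text):
    """Replace each matched identifier with a [LABEL] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

note = "Call John on 0131 496 0000 or email john.smith@example.com."
print(redact(note))  # → Call John on [PHONE] or email [EMAIL].
```

A pre-processing step like this does not make a chatbot safe for confidential material, but it reduces the chance of a careless prompt leaking identifiers into a third-party system.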
Is there a link between Chat GPT and GDPR?
We still do not know what method OpenAI used to collect the data on which ChatGPT is based. However, we do know that such data is derived to a large extent from sources available on the internet.
Naturally, a portion of the data gathered on the Internet may qualify as personal data. Like all processing activities, web scraping is regulated by the GDPR and, depending on national laws, may be subject to strict conditions to be legally implemented.
On 31 March 2023, the Italian Data Protection Authority issued a ban on ChatGPT’s use and stated that: "no information is provided to users and data subjects whose data are collected by Open AI; more importantly, there appears to be no legal basis underpinning the massive collection and processing of personal data in order to ‘train’ the algorithms on which the platform relies"8. As such, a number of issues appear to remain unresolved regarding ChatGPT’s compliance with the GDPR.
Data protection challenges also apply to interactions between Chat GPT and its users: a recent bug in an open source library led to the exposure of conversation titles to other users9. It is important to note that no sensitive data should be shared in conversations, which is also recommended by OpenAI.
The advent of artificial intelligence is not new to the world, and companies continue to work towards making the technology more economically feasible and accessible to users. ChatGPT has so far proven to be a resource-friendly tool, enabling improved time and cost management.
Various law firms have started introducing artificial intelligence systems into their operations; one example is ‘Harvey’, a generative AI developed specifically for the complicated legal issues law firms face in their cases. However, legal teams within any business considering its use should make sure they have appropriate safeguards in place to govern who has access to the tool, what information can be submitted, and how the output can be used.
A lawyer’s work is highly specialised and requires many years of study, training and accumulated experience to deliver results and client satisfaction. There is no doubt that lawyers will have to work with AI if they want to remain competitive, but this use must be carried out responsibly and in observance of ethical obligations.
ChatGPT cannot replace expertise, judgement and experience. When it is used, lawyers should never rely on ChatGPT as the only source of information, but should always cross-reference with other sources and consult colleagues when unsure, to ensure the information they provide is up to date, accurate and, most importantly, applicable to the specific matter at hand.
By Puneet Kaur
Senior Solicitor and Notary Public
BA Law (Hons), LLB, Dip LP, NP
Puneet's Linkedin profile can be found here.
1 Butler, Sydney, “How to (Temporarily) Add New Knowledge to ChatGPT,” How-To Geek, April 2023
2 What Is ChatGPT? | OpenAI Help Center
3 Butler, Sydney, and Jordan Gloor, “6 Reasons ChatGPT Is Giving You the Wrong Answers,” How-To Geek, April 2023.
4 Dreyer, Chris, “AI For Lawyers: Transform Your Legal Practice With AI Tools,” Rankings, 26 May 2023
5 Dreyer, Chris, “AI For Lawyers: Transform Your Legal Practice With AI Tools,” Rankings, 26 May 2023
News, 23 Jan. 2023
Views expressed in guest posts are those of the authors and do not necessarily reflect those of The Scottish Lawyer.
The copyright of this work is owned by The Scottish Lawyer. Please do not republish this work without permission. If you wish to re-post this work either in whole or in part, please contact firstname.lastname@example.org