Illustrating ChatGPT’s limitations, Hammond described how he prompted ChatGPT to generate a description of the MSAI program. The bot response skewed toward the norm by incorrectly stating that MSAI is a two-year degree program — the program is 15 months. This error is an example of bias, or an erroneous assumption, based on ChatGPT’s training data. The bot also omitted unique information about the program, namely MSAI’s industry partnerships.
“When you are looking for an answer that is best practice or conventional wisdom, those are marvelous places for statistical methods,” Hammond said. “But if you start wandering into the realm of the bespoke, or the unique, you’ll run into problems.”
Hammond stressed the importance of understanding the nature of a task and confining technologies to the tasks they were built to solve. He suggested a language model like ChatGPT might not, for example, be suitable for the task of determining how changing a clause in a contract will impact the document.
“You have to understand the length and breadth of the technology and where it collapses, and make sure the task is not one that demands something beyond its limits,” Hammond said. “ChatGPT might be good at taking a test. But, because of the nature of the underlying mechanism, it may never be capable of genuine reasoning, being imaginative, or thinking beyond the moment.”
Implications for law and legal services
McGinnis discussed his expectations regarding how AI technologies like ChatGPT may affect legal services and law, including increasing computational efficiency, improving accuracy, and reducing costs.
He suggested that areas of law that are conservative and stable over time, like trust law, might be more readily transformed by technology than edge cases and rapidly changing areas, like cybersecurity.
“I don’t think, at least in the foreseeable future, that AI tools will make lawyers obsolete. But they will be very important helpmates, as we have already seen with e-discovery and computerized legal search,” McGinnis said. “The question for lawyers and law students will be, how are you going to add value in a world where some of the simpler tasks are going to be taken away by machines?”
The panel agreed that, in the next five to 10 years, iterations of ChatGPT will focus on specialized domains — LawGPT, MedicineGPT, MarketingGPT — underscoring the need to evaluate, validate, and test the bot's output given that, unlike search results, it is not feasible to see ChatGPT's sources or citations.
Hammond predicted that, in the short term, a new "prompt engineer" role would emerge to improve these systems and refine prompts for specificity.
“Users who understand enough about a domain will engineer the appropriate prompts to guide the system in the right direction, and the prompts will become the learning drivers for the next generation,” Hammond said. “Learn to communicate at the level of the goals that you’re trying to achieve, because that is the language that you’re going to use to control these systems.”