AI and the Law: Why Should It Matter to You?

Whether you keep up with the latest technology or, like me, you still wear an analog watch, we’re all relatively aware of AI. AI, short for “artificial intelligence,” seems like a novel idea that we’ve only just seen making news headlines, but we can trace the phraseology to a British mathematician and logician named Alan Turing. In 1950, Turing published a paper titled “Computing Machinery and Intelligence,” wherein he asked the question, “Can machines think?” In sum, Turing wrote that in the same way you and I take in information and respond to questions, a computer may be able to do this as well.*

Today, computers are readily available and capable of some truly amazing things. In early 2023, a judge in Colombia used ChatGPT to assist in drafting a legal ruling, stating that he combined ChatGPT’s responses with his own insights to “speed up” the drafting of the decision. It’s worth noting that Colombian law does not prohibit the use of AI in court decisions. Recent news reports show us that ChatGPT and similar AI bots are authoring papers for college students, passing law school and Bar exams, and “chatting” with human beings. So why aren’t more lawyers using AI to draft documents and provide legal answers and advice? To answer that question, it is important to have a basic understanding of how AI works.**

The SAS Institute, a developer of analytics software, states on its website that AI operates by “…combining large amounts of data with fast, iterative processing and intelligent algorithms, allowing the software to learn automatically from patterns or features in the data.” AI systems automate processes and make predictions; however, they are subject to the same limitations their human programmers face. Let’s consider two AI issues, bias and misinformation, and how each could impact your estate planning. ***

I. Bias

In November 2019, Frontline interviewed MIT graduate researcher Joy Buolamwini about a problem many programmers had hoped AI would help us avoid: bias. Buolamwini had built a robot that she was programming to play “peek-a-boo” with her. She discovered that the robot easily detected her lighter-skinned friends’ faces but often could not detect her own. She went on to create other programs using facial recognition technology and found that the algorithms frequently failed to recognize darker skin tones and often labeled darker-skinned women as men. Her research eventually led her to the data sets the AI systems were drawing on, most of which were outdated in terms of representation. Although the population has grown more diverse and more women are working in science and technology, Buolamwini found that many of the benchmark facial recognition data sets remained outdated and skewed. ****

Because AI pulls from data sets that can be outdated and therefore unrepresentative of today’s population, the legal advice it generates may be equally outdated. For example, many of today’s families are blended, including stepchildren, adopted children, and relatives who are not related by blood or marriage. As attorneys, we have a responsibility to stay current on changes in the law and to advise clients both on timely planning and on potential issues they could face. For example, what may happen if legal protections for same-sex marriage are repealed? What happens if a stepparent chooses to disinherit stepchildren after the passing of his or her spouse? An experienced attorney is the only one qualified to answer these questions and help you plan ahead.

II. Misinformation

One of the most concerning problems with AI may be the “human in the loop.” The Caltech Science Exchange discussed this issue with researchers, some of whom compared AI systems to self-driving cars. Self-driving cars must navigate roads full of obstacles and human drivers whose behaviors are difficult to predict. The car’s decisions are only as accurate as the instructions its programmers initially provided until it learns from its mistakes. While most researchers and professionals in the field are working to improve and advance AI, there is evidence that some people have deliberately fed false data into AI systems. For example, AI systems working with social media recommend content to users based on the users’ perceived interests. False data fed to such a system can cause it to recommend brand-specific content with the goal of increasing sales and profits. *****

Various tests have found that AI can and will distribute false information and even fake evidence. When asked to support claims that fact-checking later proved false, AI systems have invented case law that does not exist. Presently, the Florida Bar cautions attorneys to use AI responsibly in their practice, given its unreliability and the lack of current laws governing it. Although AI will likely assist the legal profession, how AI and the legal field will evolve in response to one another is still largely unknown.

Estate planning is a personal experience. AI cannot offer you the compassionate human experience that will leave you feeling comfortable and confident in your legal planning. As attorneys, we take the time to listen and get to know you so that we can create a plan that protects you and your future. We explain your options and discuss the pros and cons so that you can make informed decisions. There are often multiple ways to accomplish your goals and save you and your loved ones time and expense in the future. Only an attorney can accurately advise you and ensure that your documents are legally sound. We hope you will contact our office for a consultation to learn more about our services.

No AI systems or chat bots were used in the writing of this article.

May 8, 2023
