
Indiana lawyers discuss future use of AI in the courtroom



Attorney Deana Martin recalls the blatant bias against her client, Carlos Starks, 14 years ago. Starks, an Indianapolis man, was arrested at a bus stop and spent 11 months in jail while awaiting trial for a murder he did not commit.

He was arrested by the Indianapolis Metropolitan Police because witnesses described the suspect as a black man in his twenties with curly hair.

Starks was actually a black man in his twenties with curly hair, but he was not the killer.

“The city compensated him for that mistake,” Martin said. “But imagine how many people are falsely accused then and even now. It’s more than just a mistake. You spend time away from your family, your children. You lose your job, which affects your livelihood. If you’re already on a reduced income, you can’t save yourself. He had to sit there and wait.”

Now Martin worries about new forms of bias in the criminal justice system as it plans for a future involving artificial intelligence (AI).


AI is changing the criminal justice system

Police departments across the country are shaping the future of policing with the implementation of AI. Traffic safety systems, crime trend prediction tools, crime analysis software, DNA and digital analytics, and gunshot detection are some of the ways AI is being incorporated into crime investigation.

While artificial intelligence is growing rapidly, it is relatively new to industry and government functions, including criminal justice. That leaves room for error, which is why many Indiana agencies are slower to jump on the bandwagon.

For example, lawyers and judges often wonder what impact AI will have on cases heard in court.


“AI offers some useful capabilities,” Martin said. “But there just aren’t many controls for it yet.”

Diane Black, director of training at the Indiana Public Defender’s Council, expressed concern about the various ways in which bias can creep into the criminal justice system through the use of artificial intelligence.

Citing studies showing that facial recognition systems misidentify people with dark skin at higher rates, and that license plate readers have a high error rate, Black is wary of how AI is being developed.

“It’s input and output data,” Black said. “So what happens if we put an expert on the stand who says under oath that he has absolute confidence in the AI’s findings? He can just say that the evidence is based on proprietary ‘source code,’ and there’s no objection to that.”

The Indiana State Bar Association has declared this year the “Year of AI” and is discussing the ethics and reliability behind it. The association has held training sessions throughout the year, with recent ones focusing on ChatGPT, protecting intellectual property in the age of AI and the latest trends, with a final event in the series taking place in September.

Ann Sutton, chief attorney for the Marion County Public Defender Agency, said her team uses AI only for spelling and grammar checks.

“We’re moving very slowly in that world,” Sutton said. “We don’t want it to affect the way we write and think about cases, because right now a computer isn’t able to grasp those nuances of the law. AI will only be as good as the person who created the program, and bias is always our biggest concern.”

In 2023, a New York lawyer used ChatGPT to prepare a man’s lawsuit against an airline in a routine personal injury case. The chatbot cited fabricated cases, which the lawyer presented in court, prompting a judge to consider sanctions. It was one of the first instances of AI “hallucinations” in the legal community.


Artificial intelligence or “artificial ignorance”

Jing Gao, an associate professor at the Purdue School of Engineering, researches AI trustworthiness and integration. She said researchers have been working on “hallucination detection” in ChatGPT to determine whether the model is reporting fact or fiction.

She also said that the results of AI models should be relied upon with caution in the criminal justice field.

“Let’s say you have a judge with so many cases,” Gao said. “If he wants to use AI to make faster judgments, he shouldn’t rely on it exclusively. There is a fairness issue because the AI model is trained on historical data, so if the historical data contains some bias, the AI model will reflect that. If you use that for people’s cases, it could lead to unfair decisions.”

As artificial intelligence grows rapidly, Gao said real intelligence should also grow so that people do not suffer from “artificial ignorance.” She said it is important to use AI only as a tool and not as a substitute for real work.

Sutton said if and when they encounter questionable AI issues more frequently in criminal cases, they will “absolutely” take it to court.

“AI should just fill in some of the gaps,” Sutton said. “Because what we don’t want is a jury that relies more on AI than on a human with actual experience.”

The future of artificial intelligence

The National Association of Criminal Defense Lawyers launched a task force in 2023 to study the impact of artificial intelligence and related new technologies on the criminal justice system and criminal defense.

Indiana has also established an AI task force that will meet later this summer to examine how the government’s future use of AI will impact policy.

Zachary Stock of the Indiana Public Defenders Council said AI is not yet being treated as more powerful than it actually is, but that day could come.

“Perfection should not be the enemy of good,” Stock said. “Even if a tool works perfectly, police or anyone else should not simply view it as a magic solution to a case. You have to ask yourself whether this tool maintains due process.”

Jade Jackson is a public safety reporter for the Indianapolis Star. You can email her at [email protected] and follow her on X, formerly Twitter @IAMJADEJACKSON.
