
While optimistic, this CU professor warns lawyers to be careful when it comes to AI


For years, University of Colorado Law School professor Harry Surden has had a finger on the pulse of how artificial intelligence systems are used in legal offices and courtrooms.

Surden and others in Boulder County said that in the past two years, the use of artificial intelligence systems has grown immensely in both legal and non-legal arenas, leading legal professionals to consider how the technology could complement their work.

“Lawyers are using AI for various activities, from very basic things, you know, categorizing different legal documents, to more advanced things like legal analysis, helping them solve legal problems, and that’s sort of a promising area,” Surden said.

Chief Deputy District Attorney Christian Gardner-Wood said the Boulder District Attorney’s Office is not using OpenAI’s ChatGPT or Microsoft’s Copilot, but uses online programs such as Westlaw and Evidence.com that either have new AI features or plan to implement them soon.

“The information (artificial intelligence systems) may be relying on isn’t necessarily valid information,” Gardner-Wood said. “A good example of that, is, there have been lawyers in the United States that have been sanctioned by court for using AI to help them draft motions or draft briefs, and it’s actually resulted in them inadvertently citing fake case law.”

Along with his work at CU Boulder, Surden is also the associate director of CodeX, a Stanford University center focused on researching legal informatics and bridging the gap between computer science and law. While Surden doesn’t have expertise in criminal law, he said Boulder County prosecutors are right to hold off on using OpenAI systems such as ChatGPT.

Gardner-Wood said prosecutors have access to an artificial intelligence function on Westlaw, a legal research application used by lawyers and legal professionals.

“The nice thing about that AI, is, it’s using a known database,” Gardner-Wood said. “It’s not bringing in confidential information, it’s not extraneous information. It’s still using large language models to understand the language and to answer questions, but it’s going directly into the Westlaw database.”

Gardner-Wood said Microsoft is set to release a version of Copilot specifically made for government officials, which, according to Microsoft, “streamlines legal research processes, enabling lawyers to access vast repositories of case law.”

“The benefit of that, is, it’s going to use the large language model that has been for AI, but instead of directing it to just open source data and the entire internet if you will, you can use that AI within your own internal sources,” Gardner-Wood said.

Gardner-Wood said that artificial intelligence could be used as a virtual assistant on the district attorney’s website, or used by lawyers to search for files or information using common language.

Surden said he is optimistic about people using artificial intelligence more often, but cautious about whether it will be used well and correctly.

“I’m still optimistic, but now it’s here and accessible, we have to be very careful to use it well and be literate in its limits,” Surden said.

AI in policing can be controversial

Surden said artificial intelligence use in policing can be controversial, since some artificial intelligence experts have said they doubt police have the technical background needed to use it properly.

“As far as I’m aware, police are mainly using it in facial recognition for potential suspects or using it for license plate recognition,” Surden said. “There’s some use of it for predictive policing, trying to predict where crimes are likely to happen.”

Surden continued, “They’re often making mistakes and often these have some sort of bias against underrepresented groups. That can be really problematic.”

Boulder Police Department spokeswoman Dionne Waugh said the department is currently involved in a pilot program of Draft One, an artificial intelligence system designed to help officers limit the time they spend writing reports. Waugh said it’s not “traditional AI” and that the department doesn’t use artificial intelligence in any other aspects of its policing.

According to the Draft One website, Axon body cameras are equipped with artificial intelligence that transcribes body camera audio into text. Several other police departments around Boulder County and on the Front Range, including Lafayette and Longmont, also use Axon body cameras.

Boulder County Sheriff’s Office spokeswoman Carrie Haverfield said the office currently uses artificial intelligence technology to search for possible wildfires.

Surden said that people shouldn’t assume that legal work solely based on human judgment is necessarily better than work supported by artificial intelligence systems.

“The critiques of AI, and they are very valid, is that they have biases and make mistakes, but the status quo is that we use humans who also have biases and make mistakes,” Surden said. “So we shouldn’t assume that AI is inherently worse or better than humans. Ultimately, people like me want a safer, more just legal system that treats people fairly. I think if done carefully, AI can do this, with the caveat being, carefully and with proper literacy and resources.”

While Surden said not to trust anyone who claims to be able to predict the future of artificial intelligence systems, he said he believes such systems will steadily get better.

“In a year or two, I don’t think we’ll be in the same situation where judges and lawyers can get it wrong as easily,” Surden said. “There will be more safeguards to nudge people to use it correctly.”

But for now, Surden said, “It’s kind of like the early days of the internet. There’s a lot of unreliability. Law is very cautious about change. People’s liberty and property depends on getting things right.”
