Big yikes.

Hallucinating artificial intelligence can tank a court case by creating fake case citations that leave the lawyers open to sanctions or the proceeding itself vulnerable to being overturned, a former litigator said.

Last month, a judge handed down a $5,000 penalty on a law firm that had sued Colombian airline Avianca Inc. on behalf of a client; the firm used ChatGPT to write its legal brief, but the AI included fabricated judicial decisions.

A similar case happened in South Africa, and the judge and magistrate overseeing the cases ripped the law firms in their decisions.

“There is potential harm to the reputation of judges and courts whose names are falsely invoked as authors of the bogus opinions and to the reputation of a party attributed with fictional conduct,” the judge presiding over the Avianca case wrote. “It promotes cynicism about the legal profession and the American judicial system.”

Fox News

When I was in high school, Internet-powered translation services were just starting to come about. Write up a paper in English, drop it in, select French, and bang goes the donkey. Of course, anyone who knew French could tell that the paper wasn’t originally written in French because of all the obvious mishaps. If you know, you know.

It’s much the same now. Relying on AI to do your job requires you to spend an equal amount of time checking over everything the AI did, because it’s really super confident that it did the job right. Sometimes you have to be a level or two above it in skill just to catch the mistakes it made.

This is just one of many reasons I don’t use AI tools for coding yet. They aren’t ready, and I’m not sure when they will be.
