
Word of the Week: Hallucination

Ken W. Alger
Cisco Employee

What comes to mind when you hear the word hallucination? There are different answers, but one medical definition is a false perception of objects or events involving your senses: something that seems real but isn't. In the human world, these happen to us occasionally.

Now, in the Word of the Week context, I'm thinking of hallucination as it relates to Artificial Intelligence. Like the medical definition, an AI hallucination might seem real, but it isn't: it's an incorrect response from an AI system, output that seems factual but isn't.

Sometimes these hallucinations are obvious and harmless, but they can also be problematic. For example, in June of 2023, a U.S. federal judge found that two attorneys had used ChatGPT to produce part, if not most, of a legal brief. The problem was that ChatGPT hallucinated some of the legal history cited in the brief. That mistake resulted in a $5,000 fine for the attorneys.

AI models are improving daily, and most of us may never rely on AI to draft our legal documents. But I think fact-checking AI output will remain necessary for a while; cross-checking information against reliable sources is important to ensure its accuracy.

1 Reply

liviu.gheorghe
Spotlight

That's a nice one, Ken. Network engineers use ChatGPT more and more every day... why not attorneys...

Regards, LG
*** Please Rate All Helpful Responses ***