News: 0177125963

  Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

Google AI Fabricates Explanations For Nonexistent Idioms (wired.com)

(Thursday April 24, 2025 @05:30PM (msmash) from the oops-I-did-it-again dept.)


Google's search AI is confidently generating explanations for nonexistent idioms, once again revealing fundamental flaws in large language models. Users discovered that entering any made-up phrase plus "meaning" [1]triggers AI Overviews that present fabricated etymologies with unwarranted authority.

When queried about phrases like "a loose dog won't surf," Google's system produces detailed, plausible-sounding explanations rather than acknowledging these expressions don't exist. The system occasionally includes reference links, further enhancing the false impression of legitimacy.

Computer scientist Ziang Xiao from Johns Hopkins University attributes this behavior to two key LLM characteristics: prediction-based text generation and people-pleasing tendencies. "The prediction of the next word is based on its vast training data," Xiao explained. "However, in many cases, the next coherent word does not lead us to the right answer."



[1] https://www.wired.com/story/google-ai-overviews-meaning/



AI Doing What Asked (Score:4, Interesting)

by Ksevio ( 865461 )

If you ask for an explanation of a phrase then the AI model is going to give you an explanation for the phrase. If it's not one with an existing explanation, then you're basically asking it to consider what explanation would make sense for it and to come up with one.

Re: (Score:3)

by EvilSS ( 557649 )

Exactly. It's also no different than what one would expect a human to do with the same question. Stop people on the street and ask them what "a loose dog won't surf" means; they probably won't answer "That's not a real idiom!" Some will answer "I don't know," but many will probably assume it's a real idiom they've just never heard and try to puzzle out a reasonable meaning for it.

I tried this with a few models (although I had to change the fake idiom for some since they were giving me references to thi

Re: (Score:3)

by thegreatemu ( 1457577 )

That's an awful lot of anthropomorphizing for an algorithm. It does not "consider" anything, and has no concept of whether anything "makes sense".

Re: (Score:2)

by blue trane ( 110704 )

How do I know you do?

Re: (Score:2)

by Calydor ( 739835 )

Even so it should point out that it's an idiom it's never heard of before.

Re: (Score:2)

by blue trane ( 110704 )

"The phrase "a loose dog won't surf" does not appear in established idiom dictionaries or etymology references, suggesting it's not a widely recognized expression. However, we can explore its possible meaning by drawing parallels to similar idioms. [...]" - ChatGPT

---

"Slashdot requires you to wait between each successful posting of a comment to allow everyone a fair chance at posting a comment.

It's been 1 minute since you last successfully posted a comment"

Isn't this also a reinforcement of scarcity, as if

Re: (Score:2)

by fuzzyf ( 1129635 )

Exactly. The way an LLM works makes it look like it can think and reason (and it doesn't help that the new models are called "reasoning" models).

Basically, it can only make a list of plausible words that would come next in a sentence, based on all the input. That's it. That's really the magic part of an LLM.

It's a stateless model, giving the same output with the same input.

It doesn't learn.

It doesn't remember.

It doesn't think.

It doesn't reason.

It doesn't understand.

It only generates a list of plausible
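The stateless loop described above can be sketched as a toy (the probability tables here are entirely made up for illustration; a real LLM's "table" is a neural network over its whole context):

```python
# Toy sketch, NOT a real LLM: a fixed "model" maps a short context to
# a probability list over next tokens, and decoding just repeatedly
# picks the most plausible continuation. All numbers are invented.
TOY_MODEL = {
    ("a", "loose"): {"dog": 0.9, "thread": 0.1},
    ("loose", "dog"): {"won't": 0.8, "barks": 0.2},
    ("dog", "won't"): {"surf": 0.7, "sit": 0.3},
}

def next_token(tokens):
    """Return the most plausible next token for the last two tokens."""
    dist = TOY_MODEL.get(tuple(tokens[-2:]), {"<end>": 1.0})
    return max(dist, key=dist.get)

def generate(prompt, max_steps=5):
    tokens = prompt.split()
    for _ in range(max_steps):
        tok = next_token(tokens)
        if tok == "<end>":
            break
        tokens.append(tok)
    return " ".join(tokens)

# Stateless: the same input always yields the same output, and nothing
# is learned or remembered between calls.
print(generate("a loose"))  # a loose dog won't surf
```

No state survives between calls, which is the point: the "model" never learns, remembers, or understands; it only ranks continuations.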

Re: (Score:2)

by omnichad ( 1198475 )

And a slight anthropomorphism here - it assumes your input is correct. It's given a premise that the phrase exists, so it's going to use its training data to assemble an answer just like it would if the phrase did actually exist. Operationally, it's the same task.

I knew that uptight fly didn't waddle. (Score:3)

by DewDude ( 537374 )

Once again AI is acting like a hogshead in the duckpond. Always gotta be boonswaggalin the cat. But they'll keep on chooglin that bootleg hollar.

Re: (Score:2)

by bickerdyke ( 670000 )

Well... let's see what we get:

> That's a wonderfully vivid and rather humorous idiom! "Acting like a hogshead in a duck pond" means behaving clumsily, awkwardly, and disruptively in a situation where one is out of place or too large and unwieldy.

> Think about it:

> A hogshead is a large barrel, typically used for storing liquids. It's bulky and not easily maneuverable.

> A duck pond is a relatively small and calm body of water, suited for ducks to glide and dabble.

> Imagine trying to put a large, heavy hogshead into a duck pond. It would:

> Be out of proportion: It's far too big for the environment.

> Be clumsy and awkward: It would likely bump into things and struggle to fit.

> Cause disruption: It would stir up the water, scare the ducks, and generally make a mess.

> So, when someone is described as "acting like a hogshead in a duck pond," it implies they are:

> Lacking grace or finesse.

> Being insensitive to their surroundings.

> Making things uncomfortable or difficult for others.

> Generally not fitting in and causing a disturbance.

> It's a colorful way to say someone is being a bit of a bull in a china shop, just with a more watery and barrel-filled image!

Well, makes sense.

Re: (Score:2)

by DewDude ( 537374 )

It's not wrong, but for the wrong reasons. My grandfather used to say it a lot; actually it was "actin like a hogshead in a [something]". The, something, had no meaning; and hogshead was just a fool. I don't know where hogshead came to mean fool...I can pull some fake etymology out of my ass and say it was probably related to drunk people. "If you drank too much from the hogshead you'd get a hogshead".

but the duckpond part is literally whatever; in fact the more unrelated it is to the foolish person's locat

Re: (Score:2)

by bickerdyke ( 670000 )

But you don't even need fake etymology. I don't know if it's a feature of language or of intelligence, but we are able to understand language constructs that have not existed an instant before. The whole humor of malapropism works on that principle. Not only can we coin new phrases, we can often understand them without explicit explanation.

But of course explaining some "new" or "newly made up" term like it has always existed is a new form of funny.

Re: (Score:2)

by DewDude ( 537374 )

It's called context. It was one of those things I remember being taught in elementary school, kindergarten even: "If you don't know what a word means, try to figure it out from the words around it."

I mean, they taught us context without teaching us the word context.

If I were to go up to a stranger and say "hogshead in a duckpond", they would probably think it was just the insane ramblings of a homeless man with absolutely no meaning. But if we saw someone do something stupid and I said "he's acting hogshead in a duckpo

Re: (Score:2)

by bickerdyke ( 670000 )

What you can get from context is amazing, too, but what happened here is the exact opposite, as I gave literally NO context when I asked for the meaning of "like a hogshead in a duck pond"

Decoding that with context would have been something like "Have you seen that drunken guy on the dancefloor stumbling around like a hogshead in a duck pond"

Re: (Score:2)

by DewDude ( 537374 )

Glad that riled your back 40. You should come by the house one day when the crows are dancing and we'll have one one hell of a rousing sandy.

Re: (Score:3)

by thegreatemu ( 1457577 )

Millennium hand and shrimp!

Let's make it so... (Score:3)

by Koen Lefever ( 2543028 )

I, for one, will from now on say "a loose dog won't surf" whenever somebody comes up with a stupid idea.

Re: (Score:2)

by omnichad ( 1198475 )

Person A: Will the LLM give me an answer?

Person B: Does a loose dog surf?

key word generative (Score:3)

by awwshit ( 6214476 )

Sounds like it worked as intended by the developers. They call it generative AI. How is the generative AI to know when you expect it to be authoritative? Especially, when you feed it silliness to start.

Re: (Score:2)

by ToasterMonkey ( 467067 )

AC ate the paint chips.

Oh well ... (Score:2)

by PPH ( 736903 )

That's spilt milk under the bridge.

You can just ask me (Score:2)

by OrangeTide ( 124937 )

I will totally give you a low-effort fake answer to any question you ask. And I use less energy in my entire lifetime than AI training does. I might even use less energy than inference, but it's close.

As my 'ol pappy would say, (Score:2)

by newbie_fantod ( 514871 )

A forthright tree bleeds no wombat

Re: (Score:2)

by smooth wombat ( 796938 )

A forthright tree bleeds no wombat

I would hope not.

There's a word for that... (Score:2)

by sacrilicious ( 316896 )

> Google AI Fabricates Explanations For Nonexistent Idioms

Yes, this is quite a well-known phenomenon, the common label for it is "etymofabulism". It has been part of European cultures dating back to the 18th century.

For more information, see [1]https://plausiblesounding.com/... [plausiblesounding.com]

[1] https://plausiblesounding.com/definitions/etymofabulism

Meh (Score:4, Interesting)

by allo ( 1728082 )

Why are people always so proud when they find out what a text-generating neural network can do? Trying to get an LLM to explain absurd things is one of the basic ways to have fun with them.

System: You define Terms

User: Explain what a Ailingoo is.

(Term from a password generator)

AI:

A "Ailingoo" is a portmanteau (a blend of words) created by combining "AI" and " Lingoo". Here's a breakdown:

1. "AI" refers to Artificial Intelligence, a field of computer science that aims to create machines that can perform tasks that typically require human intelligence.

2. "Lingoo" is a playful term that combines "linguistics" and "goo" (as in something sticky or blob-like). It's often used to refer to natural language processing (NLP) or language AI in a casual, fun way.

So, an "Ailingoo" is a colloquial term for an AI system that specializes in understanding, generating, or interacting with human language. It's essentially a friendly or whimsical way to refer to a language AI or NLP model. For example, you could say, "Wow, that Ailingoo on that chatbot app is really impressive! It understands context so well."

User: Explain what a Eewoocau is.

(Again thanks to my password generator)

AI: A Eewoocau is a hypothetical creature that can only exist in the state of a pancake being flipped. It has no form or substance other than the brief moment between when a pancake is lifted and falls back onto the griddle.

Also helps to understand hallucinations. An LLM generates the most plausible answer. If there is no answer, the most plausible answer is still nonsense.
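That "still nonsense" point can be made concrete with a toy sketch (the options and probabilities below are invented, not from any real model): picking the most plausible continuation has no "abstain" branch, so it returns something with equal confidence whether or not a good answer exists.

```python
# Sketch of why "no answer" never happens: argmax over a probability
# distribution always returns *something*, even when every option is
# weak. All distributions here are made up for illustration.
def pick_most_plausible(dist):
    """Return the highest-probability option -- there is no 'abstain'."""
    return max(dist, key=dist.get)

# A well-attested idiom: one continuation clearly dominates.
known = {"a real idiom meaning X": 0.92, "nonsense A": 0.05, "nonsense B": 0.03}

# A made-up idiom: no continuation dominates, but one still "wins".
unknown = {"nonsense A": 0.34, "nonsense B": 0.33, "nonsense C": 0.33}

print(pick_most_plausible(known))    # looks confident, and is right
print(pick_most_plausible(unknown))  # looks equally confident, is nonsense
```

Nothing in the selection step distinguishes the two cases; the output format is identical either way, which is why fabricated etymologies arrive with the same authoritative tone as real ones.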

Re: (Score:2)

by ToasterMonkey ( 467067 )

> Also helps to understand hallucinations. An LLM generates the most plausible answer. If there is no answer, the most plausible answer is still nonsense.

I wouldn't even say they answer anything, it just leads people to say they "lie". Plausible might be the ideal, but it's not a good description for how they work without any reasoning.

The safest thing to say, I think, is they generate something that _looks_ like an answer to your question, or a response to a statement. Like what little kids do, or a dream. It's different from being plausible.

It can be wrong, but they're great at really looking like something someone could say. That's still a powerful tool.

Re: (Score:2)

by allo ( 1728082 )

I use the word plausible because they learn to produce plausible-sounding text. Then you provide them half the text (i.e. the question) and they continue it in a plausible way, which is to write an answer. The plausible thing is not (only) the answer, but the complete text with question and answer. And plausible is not (always) the same as true. A text starting with a nonsense question is not plausible, and the question would never have been generated by the LLM. You kinda force the words into the LLM's mou

What happens when they charge per query? (Score:2)

by blue trane ( 110704 )

Will the gotcha artists start paying for these click-bait-hunting queries? Who will pay them? Just how much money does the anti-AI contingent control?

AI is the slut that can't say no. (Score:2)

by Fly Swatter ( 30498 )

To anything, well, unless the maker disagrees with that topic.

When AI can say "no, that is not a thing", then maybe they'll be making progress on their giant database lookup service.

I've got stoner friends that do the same. (Score:2)

by nightflameauto ( 6607976 )

Congrats to Google's AI! They've reached stoner human status!

Give it time... (Score:2)

by devslash0 ( 4203435 )

There's a lot of opposition to AI from everyday people already. Give it a few more months and everyone is going to start dismissing it as an unwanted, cringy gimmick.

I think we should... (Score:2)

by rossdee ( 243626 )

Let sleeping dogs bury their own dead.
