I just tested it on Bing too, for shits and giggles
Searched: "you can't butter the whole world's bread meaning"
It replied: The phrase "you can't butter the whole world's bread" means that one cannot have everything.
Didn't work for me. A lot of these 'gotcha' AI moments seem to work only for a small percentage of users before being noticed and fixed. That's not counting the more frequent examples that are just outright lies but get upvoted anyway because 'AI bad'.
It looks like using incognito mode and ending the query with "meaning AI" gets it to work just about every time for me.
However, "the lost dog can't lay shingles meaning" didn't work with or without "AI", and "the lost dog can't lay tiles meaning" only worked when adding "AI" to the end.
So I guess it's a gamble how gibberish you can make it.
I found that searching "some-nonsense-phrase meaning" won't always trigger the idiom interpretation, but you can often reword it to something more saying-like and it will.
I also found that trying it in incognito mode gave better results, so perhaps it's affected by your settings. Maybe it's regional as well, or depends on your search results. And since the AI's output is non-deterministic, you can't expect it to work every time.
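To put a toy model on that last point: generation samples from a probability distribution rather than looking anything up, so the same prompt can go down different paths on different runs. This sketch is pure illustration; the tokens and probabilities are invented, and a real model scores a huge vocabulary with a neural network:

```python
import random

# Invented next-token distribution for a made-up idiom query.
# These numbers are illustrative only, not from any real model.
next_token_probs = {
    "The": 0.55,      # opener for a confident explanation
    "This": 0.25,     # another explanation opener
    "I": 0.15,        # a hedge ("I couldn't find this phrase...")
    "Unknown": 0.05,  # rare: admitting the phrase isn't real
}

def sample(probs, temperature=1.0):
    """Sample a token; temperature reshapes the distribution."""
    weights = [p ** (1.0 / temperature) for p in probs.values()]
    return random.choices(list(probs.keys()), weights=weights)[0]

# The same "query" five times: the first token (and so the whole
# answer that grows from it) can differ run to run.
for _ in range(5):
    print(sample(next_token_probs, temperature=0.9))
```

Greedy decoding (always taking the most likely token) would be deterministic; any sampling randomness at all is enough to make a gotcha reproduce for some users and not others.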
Now I'll never know what people mean when they say "those cupcakes won't fill a sauna"!
That is a fascinating take on the general reaction to LLMs. Thanks for posting this!
Tried it. Afraid this didn't happen for me; the AI was very clear that the phrase is unknown. Maybe I did it wrong or something?
It didn't work for me. Why not?
Worked for me, but I couldn’t include any names or swearing.
Honestly, I’m kind of impressed it’s able to analyze seemingly random phrases like that. It means it’s thinking and not just regurgitating facts. Someday a phrase like that could actually exist, and the AI wouldn’t need to wait for it to become mainstream.
It's not thinking. It's just spicy autocomplete: having ingested most of the web, it "knows" that what follows a question about the meaning of a phrase is usually the definition and etymology of that phrase; there aren't many examples online of someone asking for the definition of a phrase and being told "that doesn't exist, it's not a real thing." So it does some frequency analysis (actually it's probably more correct to say that it is frequency analysis) and decides which words are most likely to come after your question, based on everything it's been trained on.
But it doesn't actually know or think anything. It just keeps giving you the next expected word until it hits a stopping condition.
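You can see the same failure mode in a toy word-level Markov chain. This is obviously not how a real LLM works internally (those use neural networks over subword tokens), but it makes the point: if the training text always follows a "meaning" question with a confident definition, the model produces a confident definition whether or not the phrase exists. The three-sentence corpus here is invented for illustration:

```python
from collections import Counter, defaultdict
import random

# A tiny invented "web" to ingest: a definition always follows the
# words "phrase means", and nobody in this corpus ever writes
# "that phrase doesn't exist".
corpus = (
    "the phrase means that one cannot have everything . "
    "the phrase means that you should be careful . "
    "the phrase means that hard work pays off ."
).split()

# The "frequency analysis": count which word follows which.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def autocomplete(word, n=8):
    """Keep emitting the next expected word, up to n words."""
    out = [word]
    for _ in range(n):
        options = follows.get(out[-1])
        if not options:
            break  # nothing ever followed this word; stop
        # Pick the next word in proportion to training frequency.
        # At no point do we check whether the phrase is real.
        words, counts = zip(*options.items())
        out.append(random.choices(words, weights=counts)[0])
    return " ".join(out)

print(autocomplete("phrase"))
# e.g. "phrase means that one cannot have everything ."
```

Swap the bigram counts for a transformer and the three sentences for a trillion tokens, and you get fluent, confident definitions of idioms nobody has ever said.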
I mean, are you asking it whether there's a history of the idiom existing, or just what the idiom could mean?
One arm hair in the hand is better than two in the bush