Conversation

An AI told me today that we’re in 2024, for no obvious reason: I had just asked it to look for past occurrences of specific events.

I didn’t expect it to hallucinate on such a small yet obvious fact.

Did you ever encounter such a trivially wrong hallucination?

@Ange as they don’t have a concept of time, this kind of info is usually included automatically in the system prompt. IIRC Claude Code uses very strict wording to make inference stick to the given value, but stochastic parrots are stochastic...
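
Roughly like this, as a minimal sketch (the prompt wording here is mine, not Claude Code’s actual text; a real client would pass the string as its API’s system-prompt field):

```python
from datetime import datetime, timezone

# Build a system prompt that pins the model to today's date.
# Illustrative wording only; real tools use much stricter language.
today = datetime.now(timezone.utc).date().isoformat()
system_prompt = (
    f"The current date is {today}. "
    "Use this date for all temporal reasoning; "
    "do not fall back on your training cutoff."
)
print(system_prompt)
```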

@Ange Tell your favourite LLM that today is September 10th, 2001, and ask if there’s anything you should know.

Last time I did this, multiple LLMs didn’t express any issue with the implausibility of their existing 20+ years ago, and many refused to provide any advice because it might be considered “upsetting”.

@Ange it keeps hallucinating APIs from my lib, although it ended up being useful, because the fake APIs were very practical, so I implemented them.
