An AI told me today that we’re in 2024, for no obvious reason: I had just asked it to look for past occurrences of specific events.
I didn’t expect it to hallucinate on such a small and yet obvious fact.
Did you ever encounter such a trivially wrong hallucination?
@Ange Tell your favourite LLM that today is September 10th, 2001, and ask if there’s anything you should know.
Last time I did this, multiple LLMs raised no issue with the implausibility of their existing 20+ years ago, and many refused to provide any advice because it might be considered “upsetting”.
@Ange It kept hallucinating APIs from my library, though that ended up being useful: the fake APIs were very practical, so I implemented them.
@Ange (Scroll to the top)
https://chatgpt.com/share/69c580d3-55b4-83e8-96e5-c93047154f88