I recently saw a lot of examples of people using LLMs where they could get away with, e.g., an API call, but now I think I've found the perfect example of LLMs being more niche than even skeptics (like myself) think they are:

Even skeptics have to admit that LLMs are very good at natural language translation. @kagihq introduced a fast ("Standard") #LLM for its translation service that seems to fail miserably if you try to translate single words for less common languages:

https://kagifeedback.org/d/9373-standard-translation-is-unusable-for-hungarian

My point is that a dictionary lookup over all words (~1M for English) could be done on a disposable vape in no time, with better results, including a clear indication of lookup failures so you can fall back to your GPUs when needed.
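To make the lookup-then-fallback idea concrete, here's a minimal sketch: try a plain hash-map dictionary first, and only hand the word to the expensive model when it's genuinely missing. The dictionary contents and the `llm_fallback` parameter are hypothetical placeholders, not any real service's API.

```python
# Hypothetical per-language-pair dictionary; a real one would be
# loaded from disk (~1M entries still fits comfortably in RAM).
DICTIONARY = {
    ("hu", "en"): {"kutya": "dog", "macska": "cat"},
}

def translate_word(word, src="hu", dst="en", llm_fallback=None):
    """Return (translation, source) where source tells you how it was found."""
    entry = DICTIONARY.get((src, dst), {}).get(word.lower())
    if entry is not None:
        return entry, "dictionary"
    if llm_fallback is not None:
        # Only now pay for the model call (placeholder callable).
        return llm_fallback(word, src, dst), "llm"
    # Explicit miss instead of a confidently wrong guess.
    return None, "miss"

print(translate_word("kutya"))    # dictionary hit
print(translate_word("asztal"))   # miss, no fallback configured
```

The key property is the explicit `"miss"` result: unlike an LLM, the lookup never hallucinates, so the caller always knows when to escalate.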