

Because giving answers is not an LLM’s job. An LLM’s job is to generate text that looks like an answer. And we then try to coax that framework into generating correct answers as often as possible, with mixed results.
I remember talking to someone about where LLMs are and aren’t useful. I pointed out that LLMs would be absolutely worthless for me as my work mostly consists of interacting with company-internal APIs, which the LLM obviously hasn’t been trained on.
The other person insisted that this was exactly what LLMs are great at. They wouldn’t explain how exactly the LLM was supposed to know how my company’s internal software, which is a trade secret, is structured.
But hey, I figured I’d give it a go. So I fired up a local Llama 3.1 instance and asked it how to set up a local copy of ASDIS, one such internal system (name and details changed to protect the innocent). And Llama did give me instructions… on how to write the American States Data Information System, a Python frontend for a single MySQL table containing basic information about the member states of the USA.
Oddly enough, that’s not what my company’s ASDIS is. It’s almost as if the LLM had no idea what I was talking about. Words fail to express my surprise at this turn of events.
Run a fairly large LLM on your CPU so you can get the finest of questionable problem solving at a speed fast enough to be workable but slow enough to be highly annoying.
This has the added benefit of filling dozens of gigabytes of storage that you probably didn’t know what to do with anyway.
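For the curious, this is roughly what that looks like. A minimal sketch, assuming the llama-cpp-python bindings are installed and you’ve already sacrificed the disk space for a quantized GGUF file (the model filename below is made up):

```python
from llama_cpp import Llama

# Load a quantized model entirely on the CPU (n_gpu_layers=0).
llm = Llama(
    model_path="./llama-3.1-70b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,       # context window
    n_threads=8,      # more CPU threads, marginally less annoyance
    n_gpu_layers=0,   # keep everything on the CPU
)

# Ask it something and wait. And wait.
out = llm(
    "How do I set up a local copy of ASDIS?",
    max_tokens=256,
    stop=["\n\n"],    # a stop sequence, configured correctly for once
)
print(out["choices"][0]["text"])
```

Whether the text that comes out is an answer or merely looks like one is, as discussed above, not the model’s problem.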
I know I sorted my feed by Top 6 Hours but that doesn’t mean I expect six hours’ worth of text in a single image. Did they copy and paste three different job postings together? Did they use an LLM that had its stop token configured incorrectly? Is it an attempt at weeding out people who object to having their time wasted by corporate bullshit?
We may never know. What we do know is that this wall of text has more red flags than a Chinese military parade.