Why would the steps be literal when everything else is bullshit? Obviously the reasoning steps are AI slop too.
no no it’s LYING
The paperclipping is nigh! Repent, Harlequins!
It’s bullshitting. That’s the word. Bullshitting is saying things without a care for how true they are.
The word “bullshitting” implies a clarity of purpose I don’t want to attribute to AI.
It’s kind of a distinction without much discriminatory power: LLMs are a tool created to ease the task of bullshitting, used by bullshitters to produce bullshit.
Yeah, that’s why people called it confabulating, not bullshitting.
It re-consumes its own bullshit, and the bullshit it prints is the same bullshit it fed back to itself, so it’s not lying about that. Of course, it’s also always re-consuming the initial prompt, so the end bullshit isn’t necessarily as far removed from the question as the length would indicate.
Where it gets deceptive is when it already knows an answer to the problem but constructs some bullshit for the purpose of making you believe it solved the problem on its own. The only way to tell the difference is to ask it something simpler that it doesn’t know the answer to, and watch it bullshit in circles or toward an incorrect answer.