It takes a lot of skill and knowledge to recognise a wrong answer that is phrased like a correct answer. Humans are absolutely terrible at this; it's why con artists are so successful.
And that skill and knowledge are not formed by using LLMs.
But you have the tech literacy to know that. Most non-technical people who use it do not, and just blindly trust it, because the world is not used to the idea that the computer might be deceiving them.
That's the thing. It's a tool like any other. People who just give it a five-word prompt and then use the raw output are doing it wrong.
Absolutely.
And you can’t learn to build a fence by looking at a hammer.
My point exactly, really. Tools and skills develop together and need to be seen in context.
People, whether for or against, who describe AI or any other tool in isolation, ignoring detail and nuance, are not helpful or informative.