Small rant: basically, the title. If it said it doesn't know the answer instead of answering every question regardless, it would be far more trustworthy.

  • Puttaneska@lemmy.world · 3 months ago

    It seems that ChatGPT sometimes knows that what it has offered is wrong, and actually knows a better answer when challenged.

    I’ve often asked it for code help that hasn’t worked. Then I’ve gone to other sources and found that ChatGPT was wrong about something and that an alternative approach exists. When this is put back to ChatGPT, it agrees that I’m correct (x can’t do y) and offers a perfect solution.

    So it looks like it does sometimes know what it appears not to know, but inexplicably doesn’t give the correct information the first time.