• circuscritic@lemmy.ca · edited · 2 months ago

      Yes and no. I have self-hosted models on one of my Linux boxes, but even with a relatively modern 70-series Nvidia GPU, it’s still faster to use free non-local services like ChatGPT or DDG.

      My rule of thumb for SaaS LLMs is to never enter any data that I wouldn’t also be willing to upload in cleartext to Google Drive or OneDrive.

      Sometimes that means modifying text before submitting it, and other times it means relying entirely on self-hosted tools.
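
      For the "modifying text before submitting it" step, here's a minimal sketch of what that scrubbing can look like in Python. The patterns and placeholder tokens are my own illustration of the idea, not anything specific the comment describes, and real redaction needs far more patterns than this:

      ```python
      import re

      # Illustrative patterns only: scrub a few obvious identifiers from
      # text before pasting it into a hosted LLM. Not exhaustive.
      PATTERNS = {
          "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
          "[PHONE]": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
          "[IP]":    re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
      }

      def redact(text: str) -> str:
          """Replace each matched identifier with its placeholder token."""
          for placeholder, pattern in PATTERNS.items():
              text = pattern.sub(placeholder, text)
          return text

      print(redact("Mail admin@example.com from 10.0.0.12 or call 555-867-5309."))
      # → Mail [EMAIL] from [IP] or call [PHONE].
      ```

      The point isn't the specific regexes; it's that anything you'd refuse to put on Google Drive gets replaced before the prompt ever leaves your machine.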