• vivendi@programming.dev · 7 hours ago

    You need to run the model yourself and heavily tune the inference, which is why you haven’t heard about it: most people think using shitGPT is all there is to LLMs. How many people even have the hardware to do that anyway?

    I run my own local models with my own inference setup, which really helps. There are online communities you can join (won’t link because Reddit) where you can learn how to do it too; no need to take my word for it.
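    If you want a concrete picture of what “heavily tune the inference” means, here’s a minimal sketch using llama-cpp-python; the model path and sampling values are placeholders, not my actual setup, so treat it as a starting point rather than a recipe:

        # Minimal sketch: load a local GGUF model and tune the sampler.
        # Requires: pip install llama-cpp-python. Path and values are placeholders.
        from llama_cpp import Llama

        llm = Llama(
            model_path="models/your-model.Q4_K_M.gguf",  # any local GGUF file
            n_ctx=4096,        # context window
            n_gpu_layers=-1,   # offload all layers to the GPU if you have the VRAM
        )

        out = llm(
            "Explain the trade-offs of running LLMs locally.",
            max_tokens=256,
            temperature=0.7,     # lower = more deterministic
            top_p=0.9,           # nucleus sampling cutoff
            repeat_penalty=1.1,  # discourages the model from looping
        )
        print(out["choices"][0]["text"])

    Most of the “tuning” is iterating on those sampler settings (and the prompt template) for your specific model and hardware.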

    • self@awful.systems · 5 hours ago

      ah yes, the problem with ~~crypto~~LLMs is all the ~~shitcoins~~GPTs

      did it sting when the crypto bubble popped? is that what made you like this?