I’m looking to get my first paid subscription to an AI model. I’ve been using Poe for a while, but I’m not sure whether paying for it would be better than paying for a ChatGPT subscription. I almost never use them to generate images; it’s mostly for help with my business and some programming.
I also want my wife to be able to use the same account when I start paying for it.
I’m not sure what the benefits of each are, or which one would win out.
I’ve run some local LLMs (3060, 12 GB VRAM), and I generate images locally daily (wouldn’t pay for that), but I do pay for a ChatGPT subscription. I think it’s worth it for my purposes: responses are way faster and higher quality than any local model I’ve tried, and I regularly use the web search integration, image recognition, and the seamless mobile app. Unfortunately I’ve never used Poe, so I can’t compare, sorry.
It depends on what you’re looking for in an AI tool. Here’s a comparison to help you decide:

GPT (OpenAI-based tools like ChatGPT Plus):

- Customization: Depending on the version, GPT allows deeper customization and flexibility across use cases, whether it’s writing, coding, brainstorming, etc.
- Advanced features: Models like GPT-4 are generally more powerful for tasks that require deeper reasoning, understanding of complex topics, or handling large sets of data.
- Updates and reliability: OpenAI’s tools are updated and managed directly by the team behind GPT, so you get the latest and most optimized model.

Poe (Quora’s AI platform):

- Multi-model access: Poe provides access to multiple AI models, so you can use different engines for different tasks (e.g., GPT, Claude).
- Convenience: Poe is designed to be user-friendly and lightweight for quick questions or simple tasks, but it may not offer the same depth of customization and control as a GPT subscription.
- Community integration: If you’re active on Quora, Poe’s integration might be beneficial, since it connects well with the question-answering format.

Which to choose?

- If you need power and customization for more in-depth tasks (like teaching, coding, or research), a GPT subscription might be the better choice.
- If you want quick, convenient access to various models without too much complexity, Poe could be more suitable.
Let me know if you want more detailed features on either!
Don’t get ChatGPT Plus; just get an API token and use one of the desktop apps/CLIs. It’s pay-as-you-go and way cheaper, unless you’re using GPT-4 all day, every day or something.
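To put rough numbers on the "way cheaper unless you use it all day" point, here’s a back-of-envelope sketch. The subscription fee and per-token rates below are hypothetical placeholders, not current prices; check OpenAI’s pricing page for real figures.

```python
# Back-of-envelope: flat subscription vs. pay-as-you-go API pricing.
# All dollar figures are assumed placeholders, not actual OpenAI prices.

SUBSCRIPTION_PER_MONTH = 20.00      # assumed flat monthly fee (USD)
PRICE_PER_1K_INPUT = 0.01           # assumed API rate per 1k input tokens
PRICE_PER_1K_OUTPUT = 0.03          # assumed API rate per 1k output tokens

def monthly_api_cost(chats_per_day, input_tokens=500, output_tokens=700, days=30):
    """Estimate monthly API spend for a given daily usage pattern."""
    per_chat = (input_tokens / 1000) * PRICE_PER_1K_INPUT \
             + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT
    return chats_per_day * per_chat * days

light = monthly_api_cost(chats_per_day=5)     # casual use
heavy = monthly_api_cost(chats_per_day=100)   # "all day every day"

print(f"light use: ${light:.2f}/mo")
print(f"heavy use: ${heavy:.2f}/mo")
print(f"flat plan: ${SUBSCRIPTION_PER_MONTH:.2f}/mo")
```

Under these assumed rates, a few chats a day comes out well under the flat fee, while heavy daily use blows past it, which is the break-even logic behind the advice above.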
Do you have an example for a desktop app that would use these tokens?
I don’t, you’d have to have a Google
I use gpt-cli which is pretty good if you’re ok with using a terminal https://github.com/kharvd/gpt-cli
Maybe check out Kagi’s Ultimate tier. They let you swap between some of the different options to see which you might find useful. As a bonus, you also get Kagi Search, which can be useful.
Claude.ai is quite a bit superior to GPT in my experience. That one, I pay for, and it seems like it’s worth it.
Thanks, but why would you say it’s superior to GPT o1?
I haven’t played around with GPT o1; I just checked, and I don’t have access. I’m not saying it’s necessarily bad without having experienced it. But OpenAI has been getting steadily worse for a while, so I’m assuming that the stuff I’ve interacted with is indicative of the quality of the new stuff. It’s all of a piece.
FWIW I only ever used those services if they accepted a prepaid credit card. OpenAI didn’t accept prepaid cards when I tried, not sure about Poe. Just something to think about.
I canceled my ChatGPT subscription a month or two ago. It just got completely unreliable. Like someone else said, Claude is way better but they’re both disappointing at this point. I only subscribed to Claude like last week to help solve an incredibly last minute thing. Not sure I’m going to stay subscribed.
Try Ollama. No payment required.
Honestly, all of the generative AI subscriptions are pretty fucking steep at this point compared to just running a model locally.
I agree with this. I’m using a 1070 Ti for image gen, and it would be more than capable of handling some LLM stuff. I’ve found an AMD 7700 XT does well with 7B models on my main rig, but I’m sure you could get away with something cheaper or less powerful.
That said, the amount of text you can generate and the context length of its answers will depend on the model you use, and the larger the model, the more power it takes.
If you’re just messing around with it or want it to review things or answer small questions, I’d say a 1070 Ti like I’m using would be just fine. Some folks use even more budget-friendly options. If you’ve got a gaming machine with any semi-recent GPU, I’d say go for it. Worst case, you can pay for a subscription later if you really want.
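A quick way to sanity-check whether a model fits your card: weights take roughly (parameter count × bytes per weight), plus some overhead for the KV cache and runtime buffers. The 20% overhead factor below is an assumption for illustration, not a measured figure.

```python
# Rough VRAM estimate for running a quantized LLM locally.
# The 20% overhead factor (KV cache, runtime buffers) is an assumption.

def vram_needed_gb(params_billions, bits_per_weight, overhead=0.20):
    """Approximate VRAM (decimal GB) to hold a model's weights plus overhead."""
    weight_bytes = params_billions * 1e9 * (bits_per_weight / 8)
    return weight_bytes * (1 + overhead) / 1e9

# A 7B model at 4-bit quantization fits comfortably in 8 GB:
print(f"7B @ 4-bit:  ~{vram_needed_gb(7, 4):.1f} GB")
# The same model at full 16-bit precision would not:
print(f"7B @ 16-bit: ~{vram_needed_gb(7, 16):.1f} GB")
```

This is why quantized 7B models are the sweet spot for 8–12 GB cards like the ones mentioned above, while larger or unquantized models quickly outgrow them.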
Download GPT4All and you can get an open-source model that performs basically as well as any of the paid ones.
Thanks, I’ve done just that and installed it too! What’s the best model for GPT4All, or which one would you recommend?
Llama is a solid choice. Or mistral. I use moistral which was made for porn but it’s pretty uncensored in general. Doesn’t have qualms about ethics or illegalities.
Does that mean Llama does have those? And how does that affect performance? I mean the bit about “no qualms about ethics.”