I’m looking to get my first subscription to a machine learning service. I’ve been using Poe for a while, but I’m not sure whether paying for it would be better than paying for a GPT subscription. I almost never use them to generate images; mostly I use them for help with my business and some programming.

I also want my wife to be able to use the same account when I start paying for it.

I’m not sure what the benefits of each are, or which one would come out ahead.

  • xmunk · 1 month ago

    Honestly, all of the generative AI subscriptions are pretty fucking steep at this point compared to just running a model locally.

    • Bluefruit@lemmy.world · 1 month ago

      I agree with this. I’m using a 1070 Ti for image gen and it would be more than capable of handling some LLM stuff. An AMD 7700 XT, I’ve found, does well with 7B models on my main rig, but I’m sure you could get away with something cheaper or less powerful.

      That said, the amount of text you can generate and the context length of its answers will depend on the model you use, and the larger the model, the more power it takes.

      If you’re just messing around with it or want it to review or answer small questions, I’d say a 1070 Ti like I’m using would be just fine. Some folks use even more budget-friendly options. If you’ve got a gaming machine with any semi-recent GPU, I’d say go for it. Worst case, you can pay for a subscription later if you really want.
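To put rough numbers on “the larger the model, the more power it takes”: a quantized model’s weights take about params × bits/8 bytes, plus some overhead for the KV cache and buffers. The 20% overhead figure below is a back-of-envelope assumption, not a benchmark:

```python
def estimate_vram_gb(params_billions: float, quant_bits: int = 4,
                     overhead: float = 0.2) -> float:
    """Rough VRAM needed to run a quantized LLM: weights plus ~20% overhead."""
    weight_bytes = params_billions * 1e9 * quant_bits / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 7B model at 4-bit quantization needs roughly 4.2 GB,
# which fits comfortably on an 8 GB card like a 1070 Ti.
print(f"7B @ 4-bit: ~{estimate_vram_gb(7):.1f} GB")
print(f"7B @ 8-bit: ~{estimate_vram_gb(7, 8):.1f} GB")
```

By this estimate a 7B model at 4-bit fits an 8 GB card with room to spare, while 8-bit is already pushing it — which matches the experience above.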

  • PerogiBoi@lemmy.ca · 1 month ago

    Download GPT4All and you can get an open-source model that performs basically as well as any of the paid ones.
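If you go the GPT4All route, it also ships Python bindings alongside the desktop app. A minimal sketch — the model filename is one example entry from its catalog and may change, and the weights download on first run:

```python
MODEL_FILE = "Meta-Llama-3-8B-Instruct.Q4_0.gguf"  # example catalog entry, several GB

def main() -> None:
    # pip install gpt4all
    from gpt4all import GPT4All

    model = GPT4All(MODEL_FILE)  # fetches the weights on first use
    with model.chat_session():
        reply = model.generate("Draft a polite follow-up email to a supplier.",
                               max_tokens=256)
        print(reply)

if __name__ == "__main__":
    main()
```

The `chat_session()` context keeps conversation history between `generate()` calls, which is what you want for back-and-forth business questions rather than one-off prompts.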

    • SurpriZe@lemm.ee (OP) · 1 month ago

      Thanks, I’ve done just that and installed it too! What’s the best GPT4All model, or which one would you recommend?

      • PerogiBoi@lemmy.ca · 1 month ago

        Llama is a solid choice. Or Mistral. I use Moistral, which was made for porn, but it’s pretty uncensored in general. It doesn’t have qualms about ethics or illegalities.

        • SurpriZe@lemm.ee (OP) · 1 month ago

          Does that mean Llama does have them? And how does that affect the performance? I mean the part about “no qualms about ethics”.

          • PerogiBoi@lemmy.ca · 1 month ago

            I’m sure there’s an uncensored Llama somewhere, but the ones I’ve tried weren’t truly uncensored.

            In terms of performance, it just means that if I ask it something mildly sexual or inappropriate, it will answer without giving an “as an AI language model, I can’t do…” speech.

  • flashgnash@lemm.ee · 1 month ago

    Don’t get ChatGPT Plus; just get an API token and use one of the desktop apps/CLIs. It’s pay-as-you-go and way cheaper, unless you’re using GPT-4 all day every day or something.
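The break-even point is easy to sketch. The per-token prices below are placeholder assumptions (check the provider’s current pricing page before relying on them), but the shape of the math holds:

```python
# Back-of-envelope: API pay-as-you-go vs. a flat monthly subscription.
# Per-token prices are illustrative assumptions, not current rates.

PLUS_MONTHLY = 20.00            # assumed subscription price, USD/month
PRICE_IN = 2.50 / 1_000_000     # assumed USD per input token
PRICE_OUT = 10.00 / 1_000_000   # assumed USD per output token

def api_cost(chats_per_day: int, in_tokens: int = 1_000,
             out_tokens: int = 500, days: int = 30) -> float:
    """Monthly API cost for a given daily chat volume."""
    per_chat = in_tokens * PRICE_IN + out_tokens * PRICE_OUT
    return chats_per_day * days * per_chat

for n in (5, 20, 90):
    cost = api_cost(n)
    verdict = "cheaper than a subscription" if cost < PLUS_MONTHLY else "subscription wins"
    print(f"{n:3d} chats/day -> ${cost:6.2f}/month ({verdict})")
```

At light-to-moderate use the API comes out far cheaper; only at very heavy daily volume does a flat subscription start to win, which is the “all day every day” caveat above.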

      • PhilipTheBucket@ponder.cat · 1 month ago

        I haven’t played around with GPT o1; I just checked, and I don’t have access. I’m not saying it’s necessarily bad without having experienced it. But OpenAI has been getting steadily worse for a while, so I’m assuming that the stuff I’ve interacted with is indicative of the quality of the new stuff. It’s all of a piece.

  • tyler@programming.dev · 1 month ago

    I canceled my ChatGPT subscription a month or two ago. It just got completely unreliable. Like someone else said, Claude is way better but they’re both disappointing at this point. I only subscribed to Claude like last week to help solve an incredibly last minute thing. Not sure I’m going to stay subscribed.

  • GeorgeGR@lemmy.world · 1 month ago

    I’ve run some local LLMs (3060, 12 GB VRAM), and I generate images locally daily (wouldn’t pay for that), but I do pay for a ChatGPT subscription. I think it’s worth it for my purposes: responses are way faster and higher quality than any local model I’ve tried, plus web search integration, image recognition, and a seamless mobile app — I use all of those features regularly. Unfortunately I’ve never used Poe, so I can’t compare, sorry.

  • Rolando@lemmy.world · 1 month ago

    FWIW I only ever used those services if they accepted a prepaid credit card. OpenAI didn’t accept prepaid cards when I tried, not sure about Poe. Just something to think about.