Wanted to share a resource I stumbled on that I can’t wait to try and integrate into my projects.

A GPT4All model is a 3GB–8GB file that you can download and plug into the GPT4All open-source ecosystem software. Nomic AI supports and maintains this software ecosystem to enforce quality and security, and spearheads the effort to let any person or enterprise easily train and deploy their own on-edge large language models.
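
For anyone who wants to try it from code rather than the desktop app, here's a minimal sketch using the gpt4all Python bindings. The model filename is just one example from the download list, and the prompt is arbitrary:

```python
# Minimal local-inference sketch with the gpt4all Python bindings
# (pip install gpt4all). The model file is downloaded on first run,
# then everything executes locally on CPU.
from gpt4all import GPT4All

# Example model from the GPT4All download list (an assumption; any of
# the 3GB-8GB model files should work the same way).
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

with model.chat_session():
    reply = model.generate("Explain what GPT4All is in one sentence.",
                           max_tokens=100)
    print(reply)
```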

  • @[email protected]
    link
    fedilink
    English
    271 year ago

    I’ve got this running. And it’s fun!

    But it’s also bad compared to ChatGPT, or even Bing.

    • /home/pineapplelover · 1 year ago

      Is there a free and better ChatGPT alternative out there right now? I’ve gone through multiple options, and none are as good as ChatGPT.

      • @[email protected]
        link
        fedilink
        English
        231 year ago

        I believe Claude 2 is the best LLM option currently, if you live in the US or UK or have a VPN.

        • Phoenix · 1 year ago

          Claude 2 isn’t free though, is it?

          Either way, it does depend on what you want to use it for. Claude 2 is very biased towards positivity, and it can be like pulling teeth if you’re asking it to generate anything it even remotely disapproves of. In that sense, Claude 1 is the superior option.

        • theharber · 1 year ago

          I’m using it from Canada without a VPN, just using the email login method (with an iCloud account if it matters).

    • @[email protected]
      link
      fedilink
      English
      11 year ago

      Yeah, if you use OpenAI’s API key, it’s cheaper and a bit more private than their website. I think it’s something like $0.10/day for ~100 queries.
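
      For illustration, hitting the API directly looks roughly like this (a sketch assuming the openai Python package v1+, with the key in the OPENAI_API_KEY environment variable; the model and prompt are just examples):

      ```python
      # Pay-per-use call against the API instead of the ChatGPT website.
      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      resp = client.chat.completions.create(
          model="gpt-3.5-turbo",  # example; swap in gpt-4 if you want
          messages=[{"role": "user", "content": "Give me three dinner ideas."}],
      )
      print(resp.choices[0].message.content)
      ```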

      • @[email protected]
        link
        fedilink
        English
        61 year ago

        I plugged GPT-4 into my Discord bot.

        It’s $0.03 per 1,000 tokens, which translates to about 3 or 4 messages.

        gpt-3.5-turbo is almost as good and way cheaper at $0.0015 per 1k tokens.
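
        Roughly what that wiring looks like, as a sketch (assumes discord.py 2.x and the openai package v1+; the "!ask" command, environment variable names, and model string are just placeholders):

        ```python
        # Bare-bones Discord bot that forwards "!ask ..." messages to GPT-4.
        import os
        import discord
        from openai import OpenAI

        intents = discord.Intents.default()
        intents.message_content = True  # required to read message text

        bot = discord.Client(intents=intents)
        llm = OpenAI()  # reads OPENAI_API_KEY from the environment

        @bot.event
        async def on_message(message: discord.Message):
            if message.author.bot or not message.content.startswith("!ask "):
                return
            resp = llm.chat.completions.create(  # blocking call; fine for a sketch
                model="gpt-4",
                messages=[{"role": "user", "content": message.content[5:]}],
            )
            await message.channel.send(resp.choices[0].message.content)

        bot.run(os.environ["DISCORD_BOT_TOKEN"])
        ```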

        • @[email protected]
          link
          fedilink
          English
          11 year ago

          Depends on how you use context. If you don’t need it to have memory, it’s much, much cheaper.
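
          Rough illustration (turn count and tokens per turn are assumed numbers): with memory you resend the whole history every turn, so billed prompt tokens grow with the square of the conversation length; stateless calls only pay for the latest message.

          ```python
          # Prompt tokens billed over a 10-turn chat at ~300 tokens per turn.
          tokens_per_turn, turns = 300, 10

          with_memory = sum(tokens_per_turn * t for t in range(1, turns + 1))
          stateless = tokens_per_turn * turns

          print(with_memory, stateless)  # 16500 vs 3000 prompt tokens billed
          ```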

        • Phoenix · 1 year ago

          Three cents for every 1k prompt tokens. You pay another six cents per 1k generated tokens in addition to that.

          At 8k context size, this adds up quickly. Depending on what you send, you can easily be out ~thirty cents per generation.
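
          As a rough worked example at those rates (the token counts are assumptions):

          ```python
          # GPT-4 8k pricing quoted above: $0.03/1k prompt, $0.06/1k completion.
          prompt_tokens = 7000      # nearly full 8k context (assumed)
          completion_tokens = 1000  # generated reply (assumed)

          cost = prompt_tokens / 1000 * 0.03 + completion_tokens / 1000 * 0.06
          print(f"${cost:.2f} per generation")  # -> $0.27, i.e. ~thirty cents
          ```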