They support Claude, ChatGPT, Gemini, HuggingChat, and Mistral.

  • Lojcs@lemm.ee
    3 hours ago

    Last time I tried using a local LLM (about a year ago), it generated only a couple of words per second and the answers were barely relevant. I also don’t see how a local LLM can fill the glorified-search-engine role that people actually use LLMs for.

    • TheDorkfromYork@lemm.ee
      3 hours ago

      They’re fast and high quality now. ChatGPT is still the best, but local LLMs are great, even with 10 GB of VRAM.
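
      For anyone who wants to try, here’s a rough sketch using the llama-cpp-python bindings. The model filename and parameter values below are just placeholders; any quantized GGUF model of roughly 7B parameters fits in 10 GB of VRAM once most layers are offloaded to the GPU:

      ```python
      # Minimal sketch: run a quantized local model with llama-cpp-python.
      # The model path is hypothetical -- point it at whatever GGUF file
      # you've downloaded.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # placeholder file
          n_gpu_layers=-1,  # offload all layers to the GPU; lower if VRAM runs out
          n_ctx=4096,       # context window size
      )

      output = llm.create_chat_completion(
          messages=[{"role": "user", "content": "Summarize what a GGUF file is."}]
      )
      print(output["choices"][0]["message"]["content"])
      ```

      On a card with 10 GB of VRAM, dropping n_gpu_layers to a partial offload trades some speed for headroom, which is usually the main knob worth tuning.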