• Ookami38
    11 months ago

    Depends on what AI you’re looking for. I don’t know of an LLM (a language model, think ChatGPT) that works decently on personal hardware, but I also haven’t really looked. For art generation, though, look up the AUTOMATIC1111 installation instructions for Stable Diffusion. If you have a decent GPU (I was running it on a 1060, slowly, until I upgraded), it’s a simple enough process to get started, there’s tons of info online about it, and it all runs on local hardware.
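    As a rough sanity check on whether a mid-range card like a 1060 can hold Stable Diffusion, here is a hedged back-of-the-envelope sketch. The parameter counts (~860M for the SD 1.x UNet, roughly 250M more for the text encoder and VAE) and the flat overhead factor are assumptions, not exact numbers; real activation memory varies with image resolution.

```python
# Ballpark VRAM needed to run Stable Diffusion 1.x in fp16.
# Assumptions (approximate): SD 1.x UNet ~860M params, text encoder +
# VAE ~250M more, 2 bytes per fp16 param, and a flat 1.5x overhead
# factor standing in for activations and runtime buffers.

def sd_vram_gb(n_params: float, bytes_per_param: float = 2.0,
               overhead: float = 1.5) -> float:
    """Rough VRAM estimate in GB: weights * bytes/param * overhead."""
    return n_params * bytes_per_param * overhead / 1e9

sd_params = 860e6 + 250e6          # UNet + text encoder + VAE (approx.)
need = sd_vram_gb(sd_params)
print(f"~{need:.1f} GB VRAM")      # comfortably under a GTX 1060's 6 GB
```

    Under these assumptions the model lands around 3–4 GB, which is why a 6 GB 1060 works, just slowly.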

    • CeeBee@lemmy.world
      11 months ago

      I don’t know of an LLM that works decently on personal hardware

      Ollama with ollama-webui. Models like solar-10.7b and mistral-7b run nicely on local hardware; solar-10.7b should work well on a card with 8 GB of VRAM.
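      The 8 GB figure checks out with some quick arithmetic, sketched below. It assumes Ollama's default model builds are 4-bit quantized (about 0.5 bytes per parameter) and that KV cache plus runtime overhead adds on the order of 1–2 GB depending on context length; both are assumptions, not measured numbers.

```python
# Back-of-the-envelope weight sizes for 4-bit quantized LLMs.
# Assumption: default Ollama builds use ~4 bits per parameter.

def quantized_weights_gb(n_params: float, bits: int = 4) -> float:
    """Size of quantized weights in GB: params * (bits / 8) bytes each."""
    return n_params * bits / 8 / 1e9

solar = quantized_weights_gb(10.7e9)    # ~5.35 GB of weights
mistral = quantized_weights_gb(7e9)     # ~3.5 GB of weights
print(f"solar-10.7b: ~{solar:.2f} GB, mistral-7b: ~{mistral:.2f} GB")
```

      With ~5.35 GB of weights plus a gigabyte or two of cache, solar-10.7b squeezes into 8 GB, and mistral-7b has room to spare.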