• Lurker · 10 days ago

    Deepseek is good locally.

    • Mora@pawb.social · 9 days ago

      As someone who is rather new to the topic: I have a GPU with 16 GB VRAM and only recently installed Ollama. Which size should I use for DeepSeek R1? 🤔

      • Lurker · 9 days ago

        You can try them from smallest to biggest. You can probably run the biggest one too, but it will be slow.
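        A back-of-the-envelope way to guess which size fits before downloading: Ollama's default DeepSeek R1 tags are roughly 4-bit quantized, so each parameter costs about half a byte, plus overhead for the KV cache and activations. The numbers below (4.5 bits per parameter, 20% overhead) are my own rough assumptions, not official figures, and models that don't fit entirely in VRAM can still run with partial CPU offload, just slowly.

```python
# Rough sketch: estimate VRAM needed for a ~4-bit quantized model.
# Assumptions (not from Ollama docs): ~4.5 bits/param effective,
# ~20% extra for KV cache and activations. Ballpark only.

def vram_gb(params_billion: float,
            bits_per_param: float = 4.5,
            overhead: float = 1.2) -> float:
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total * overhead / 1e9

# Common DeepSeek R1 distill sizes, checked against a 16 GB card.
for size in (1.5, 7, 8, 14, 32, 70):
    verdict = "fits" if vram_gb(size) <= 16 else "needs offload"
    print(f"{size:>4}b ≈ {vram_gb(size):5.1f} GB -> {verdict}")
```

        By this estimate the 14b tag fits comfortably in 16 GB, while 32b and up spill into system RAM, which matches the "runs but slowly" experience below.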

      • kyoji@lemmy.world · 8 days ago

        I also have 16 GB VRAM, and the 32b version runs OK. Anything larger would take too long, I think.