• Mora@pawb.social · 2 days ago

    As someone who is rather new to the topic: I have a GPU with 16 GB of VRAM and only recently installed Ollama. Which size should I use for DeepSeek R1? 🤔

    • kyoji@lemmy.world · 23 hours ago

      I also have 16 GB of VRAM and the 32b version runs OK. Anything larger would take too long, I think.

    • Lurker · 1 day ago

      You can try from the smallest size and work your way up. You can probably run the biggest one too, but it will be slow.
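
      If it helps, a rough way to do that is to step through the size tags from smallest to largest and time a short reply. This is just a sketch assuming the ollama Python client (pip install ollama), a running Ollama server, and the deepseek-r1 size tags from the Ollama model library (check which ones are current):

```python
# Rough sketch: try DeepSeek R1 sizes from smallest to largest on one GPU
# and see which one still answers in a reasonable time.
import time
import ollama

# Size tags as listed on the Ollama model library; adjust to taste.
for tag in ["deepseek-r1:7b", "deepseek-r1:14b", "deepseek-r1:32b"]:
    ollama.pull(tag)  # downloads the model if it is not already local
    start = time.time()
    reply = ollama.chat(
        model=tag,
        messages=[{"role": "user", "content": "Explain VRAM in one sentence."}],
    )
    elapsed = time.time() - start
    print(f"{tag}: {elapsed:.1f}s -> {reply['message']['content'][:80]}")
```

      Whichever size answers fast enough for you is the one to keep; anything that spills out of VRAM into system RAM will slow down a lot.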