• Sunoc

    Wait, just the client? I thought the madlad ran the model on their phone!

    • Smorty [she/her]@lemmy.blahaj.zone

      apparently not. it seems they are referring to the official bs deepseek ui for ur phone. running it on your phone fr is super cool! Imma try that out now - with the smol 1.5B model

      • Sunoc

        Good luck! You’ll need it to run it without a GPU…

        • Smorty [she/her]@lemmy.blahaj.zone

          i kno! i’m already running a smol llama model on the phone, and yeaaaa that’s a 2 token per second speed and it makes the phone lag like crazy… but it works!

          currently i’m doing this with termux and ollama, but if there’s some better foss way to run it, i’d be totally happy to use that instead <3
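          for anyone who wants to try the same setup, here’s a rough sketch of the termux + ollama route. the `pkg install ollama` step and the `deepseek-r1:1.5b` model tag are assumptions (Termux package availability and Ollama library tags change), so double-check before running:

          ```shell
          # Inside Termux on Android -- no root needed.
          # Assumption: ollama is packaged for Termux; if not, it can be
          # built from source with the Go toolchain instead.
          pkg update && pkg install ollama

          # Start the ollama server in the background.
          ollama serve &

          # Pull and chat with the 1.5B DeepSeek-R1 distill (assumed tag
          # from the Ollama model library). Small enough to fit in phone
          # RAM, but expect only a few tokens per second on CPU.
          ollama run deepseek-r1:1.5b
          ```

          running the server in the foreground in a second Termux session also works, and makes it easier to watch the logs.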