• Tar_Alcaran
    2 days ago

    The demo looks pretty impressive, but it’s a prerecorded demo we know nothing about. So many AI companies have been lying about their benchmarks.

  • Karkitoo@lemmy.ml
    2 days ago

    Looks impressive and it’s truly open-source.

    However, I see it requires CUDA. Could it still run:

    1. Without CUDA at all?
    2. On AMD hardware?
    3. On mobile (since the model is only 1B)?
    • thickertoofan@lemm.eeOP
      2 days ago

      I think the bigger bottleneck is SLAM. Running that is compute-intensive, and it won’t run directly on video, so SLAM is the tough part, I guess. Reading the repo gives no clues that it supports CPU inference.
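
      One quick way to probe the CUDA question yourself: whether a PyTorch-based repo can fall back to CPU usually depends on whether it hard-codes `.cuda()` calls or picks a device at runtime (custom CUDA kernels, as SLAM pipelines often use, won’t fall back at all). A minimal sketch of the usual runtime check, with all names hypothetical and nothing assumed about this particular repo:

```python
# Sketch: pick a device the way device-agnostic PyTorch code does.
# If the repo instead calls .cuda() unconditionally or ships custom
# CUDA kernels, this fallback won't help and CPU inference is out.
import importlib.util


def pick_device() -> str:
    """Return "cuda" when a GPU-capable torch install is present, else "cpu"."""
    if importlib.util.find_spec("torch") is None:
        return "cpu"  # torch not installed at all; nothing CUDA-specific can run
    import torch
    return "cuda" if torch.cuda.is_available() else "cpu"


if __name__ == "__main__":
    print(pick_device())
```

      Note that on AMD hardware, the ROCm build of PyTorch reports its devices through the same `torch.cuda` API, so a repo written this way can sometimes run on AMD GPUs unmodified; custom CUDA kernels would still need porting.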