• hotdoge42@feddit.de · 1 year ago (edited)

    That’s wrong. You can do it on your home PC with Stable Diffusion.

    • ᗪᗩᗰᑎ@lemmy.ml · 1 year ago

      And a lot of those require models that are multiple gigabytes in size, which then need to be loaded into memory and processed on a high-end video card that would generate enough heat to ruin your phone’s battery, if it could somehow be shrunk to fit inside a phone at all. This just isn’t feasible on phones yet. Is it technically possible today? Yes, absolutely. Are the tradeoffs worth it? Not for the average person.
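      A rough back-of-envelope sketch of the memory math backs this up (parameter counts below are approximate, commonly cited figures for Stable Diffusion v1.x, not exact measurements):

```python
# Back-of-envelope VRAM estimate for running a diffusion model locally.
# Parameter counts are approximate public figures for Stable Diffusion v1.x.

def model_size_gb(params: float, bytes_per_param: int) -> float:
    """Memory needed just to hold the weights, in gigabytes."""
    return params * bytes_per_param / 1024**3

# ~860M (UNet) + ~123M (CLIP text encoder) + ~84M (VAE)
sd_params = 860e6 + 123e6 + 84e6

print(f"fp32 weights: {model_size_gb(sd_params, 4):.1f} GB")  # ~4.0 GB
print(f"fp16 weights: {model_size_gb(sd_params, 2):.1f} GB")  # ~2.0 GB
# Actual usage at inference time is higher still: activations, attention
# buffers, and the sampler's intermediate latents all add on top of this.
```

      So even in half precision you need a couple of gigabytes resident on the GPU just for the weights, which is exactly the kind of sustained load phone hardware isn’t built for.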

      • diomnep@lemmynsfw.com · 1 year ago (edited)

        “He’s off by multiple orders of magnitude, and he doesn’t even mention the resource that GenAI models require in large amounts (GPU compute), but he’s not wrong”