• Naz
    3 hours ago

    You need a lot of VRAM and a large visual model to handle higher-complexity prompts.

    With lower VRAM you're limited to models that only do one thing consistently well.

    See: FLUX

    • bradd@lemmy.world
      2 hours ago

      I have 2x 24 GB 3090s, but IIRC ComfyUI doesn't support multiple GPUs. Does that seem too low?