Ziggurat@fedia.io to AI Generated Images · 1 month ago
Ladies and gentleman, this is your catpain
Naz · 1 month ago
You need a lot of VRAM and a large visual model for higher complexity. Lower VRAM means you run models that only do one thing consistently/well. See: FLUX
bradd@lemmy.world · 1 month ago
I have 2x 24G 3090s, but IIRC ComfyUI doesn't support multiple GPUs. Does that seem too low?
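For context on the VRAM numbers above, here is a back-of-envelope sketch of how much memory a model's weights alone take. It assumes FLUX.1's roughly 12B-parameter transformer and fp16 (2 bytes per parameter); actual usage is higher once activations, the text encoders, and the VAE are loaded, so treat this as a floor, not a measurement.

```python
def weight_vram_gib(params_billion: float, bytes_per_param: int) -> float:
    """Estimate VRAM (in GiB) needed just to hold the model weights."""
    return params_billion * 1e9 * bytes_per_param / 2**30

# Assumed figure: FLUX.1 has ~12B parameters; fp16 stores 2 bytes each.
fp16 = weight_vram_gib(12, 2)   # ~22.4 GiB — tight on a single 24G 3090
fp8 = weight_vram_gib(12, 1)    # ~11.2 GiB — quantized, fits comfortably
print(f"fp16: {fp16:.1f} GiB, fp8: {fp8:.1f} GiB")
```

This is why a single 24 GB card is borderline for the full-precision model: the second 3090 doesn't help if the frontend can't split one model across GPUs, but quantized variants bring the footprint down to where one card is enough.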