Ziggurat@fedia.io to AI Generated Images · 1 day ago
Ladies and gentleman, this is your catpain
Naz (English) · 3 hours ago
You need a lot of VRAM and a large visual model for higher complexity. Lower VRAM means you run models that only do one thing consistently/well. See: FLUX
bradd@lemmy.world (English) · 2 hours ago
I have 2× 24 GB 3090s, but IIRC ComfyUI doesn't support multiple GPUs. Does that seem too low?
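A rough way to sanity-check the VRAM numbers in this thread is to estimate how much memory the model weights alone occupy. This is only a back-of-the-envelope sketch: the ~12B parameter count is the commonly cited size for FLUX.1, and it ignores activations, the text encoders, and framework overhead, which all add on top.

```python
def weights_gib(n_params: float, bytes_per_param: int) -> float:
    """GiB needed just to hold the model weights in memory.

    Ignores activations, text encoders, and framework overhead.
    """
    return n_params * bytes_per_param / 2**30

# Illustrative figures for a ~12B-parameter model (FLUX.1-class):
fp16 = weights_gib(12e9, 2)  # 16-bit weights
int8 = weights_gib(12e9, 1)  # 8-bit quantized weights
print(f"fp16: {fp16:.1f} GiB, int8: {int8:.1f} GiB")
```

By this estimate, fp16 weights for a 12B model come to roughly 22 GiB, so a single 24 GB 3090 is tight but workable, and quantized variants leave more headroom; the second GPU would not help if the frontend cannot split the model across devices.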