TheBigBrother@lemmy.world to Selfhosted@lemmy.world · edited 5 months ago
What's the bang-for-the-buck, go-to setup for AI image generation and LLM models?
kata1yst · 4 points · edited 5 months ago
KoboldCpp or LocalAI will probably be the easiest out-of-the-box option that offers both image generation and LLMs.
I personally use vLLM and HuggingChat, mostly because of vLLM's efficiency and speed.
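A minimal sketch of how either suggestion could be queried once it's running: both vLLM and LocalAI expose an OpenAI-compatible HTTP API, so a small client like the one below works against either. The base URL, port, and model name are assumptions here, so adjust them to whatever you actually serve.

```python
# Query a locally hosted LLM through the OpenAI-compatible
# /v1/chat/completions endpoint that vLLM and LocalAI both expose.
import requests

BASE_URL = "http://localhost:8000/v1"  # vLLM's default port; LocalAI typically listens on 8080
MODEL = "mistralai/Mistral-7B-Instruct-v0.2"  # example only -- use the model you actually loaded

def ask(prompt: str) -> str:
    # Send a single-turn chat completion request to the local server.
    resp = requests.post(
        f"{BASE_URL}/chat/completions",
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 256,
        },
        timeout=120,
    )
    resp.raise_for_status()
    # The response follows the OpenAI schema: choices[0].message.content holds the text.
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Suggest a budget GPU for local image generation."))
```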