Kinocci@alien.top to AMD@hardware.watch · English · 11 months ago
Ditching CUDA for AMD ROCm for more accessible LLM training and inference. (medium.com)
thatrandomnpc@alien.top · 11 months ago
Can you pass AMD GPUs through to containers? Like Docker, maybe?
LoafyLemon@alien.top · 11 months ago
Yep. I just did that the other day to run Stable Diffusion, as I could not get it to install the required drivers any other way.
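For anyone wanting to try the passthrough described above, here is a minimal sketch. It assumes the host already has the amdgpu kernel driver loaded and uses AMD's `rocm/pytorch` Docker Hub image, which ships the ROCm userspace, so nothing driver-related needs installing inside the container. The key pieces are the two device nodes and the group membership:

```shell
# --device=/dev/kfd   exposes the ROCm compute interface to the container
# --device=/dev/dri   exposes the GPU render nodes
# --group-add video   grants the container user access to those device files
docker run -it --rm \
  --device=/dev/kfd \
  --device=/dev/dri \
  --group-add video \
  rocm/pytorch:latest \
  python3 -c 'import torch; print(torch.cuda.is_available())'
```

On ROCm builds of PyTorch, the HIP backend answers through the familiar `torch.cuda` API, so the final check prints `True` when the GPU made it into the container. Exact group names and image tags vary by distro and ROCm version, so treat this as a starting point rather than a recipe.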