Kinocci@alien.top to AMD@hardware.watch · 1 year ago
Ditching CUDA for AMD ROCm for more accessible LLM training and inference. (medium.com)
11 comments
thatrandomnpc@alien.top · 1 year ago
Can you pass through AMD GPUs to containers? Like Docker, maybe?
LoafyLemon@alien.top · 1 year ago
Yep, I just did that the other day to run Stable Diffusion, as I could not get it to install the required drivers any other way.
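For anyone finding this later: the usual approach is to pass the ROCm device nodes into the container rather than installing GPU drivers inside it. A rough sketch of the invocation (the rocm/pytorch image and tag here are just an example, use whatever ROCm-enabled image fits your workload):

docker run -it --device=/dev/kfd --device=/dev/dri --group-add video --security-opt seccomp=unconfined rocm/pytorch:latest

/dev/kfd is the ROCm compute interface and /dev/dri exposes the GPU render nodes; the amdgpu kernel driver still has to be present on the host, only the user-space ROCm stack lives in the container.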