- cross-posted to:
- localllama
- [email protected]
Ollama supports more AMD cards than AMD’s ROCm does. https://rocm.docs.amd.com/projects/install-on-linux/en/latest/reference/system-requirements.html#supported-gpus
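One common way people get Ollama's ROCm backend running on cards missing from AMD's support list is to override the GPU architecture ROCm reports. A hedged sketch, assuming an RDNA2 card that is close enough to a supported gfx1030 target (the exact version string depends on your GPU, so treat the value below as an example, not a guarantee):

```shell
# Tell ROCm to treat the GPU as a supported gfx1030-class part (version 10.3.0).
# This is an unofficial workaround; behavior varies by card and ROCm release.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Start Ollama with the override in effect.
ollama serve
```

This only papers over the support gap at the user level; the underlying complaint about ROCm's official compatibility list still stands.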
I’d love to buy an AMD card, but it just feels like they’re not even trying to meet us halfway on this. ROCm needs to be better.
CUDA is the one Nvidia attack on interoperability that AMD never answered, and consequently it's the only one Nvidia has kept at.