SDB:AMD GPGPU - openSUSE Wiki
[rocm] No GPU support after rebuild with ROCm 6.0 (#6) · Issues · Arch Linux / Packaging / Packages / python-pytorch · GitLab
[https://gitlab.archlinux.org/archlinux/packaging/packages/python-pytorch/-/issues/6] - - public:mzimmerm
This issue suggests there is a known fix and that someone is already working on it.
Installing PyTorch for ROCm — ROCm installation (Linux)
[https://rocm.docs.amd.com/projects/install-on-linux/en/develop/how-to/3rd-party/pytorch-install.html#testing-the-pytorch-installation] - - public:mzimmerm
Installing PyTorch for ROCm - this document claims gfx900 compatibility
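A quick sanity check in the spirit of the guide's "Testing the PyTorch installation" section; this assumes a ROCm build of PyTorch is already installed (the exact wheel/index URL comes from the guide above):

  import torch

  # ROCm builds of PyTorch expose the GPU through the CUDA API surface.
  print(torch.cuda.is_available())            # True if the ROCm runtime sees the GPU
  if torch.cuda.is_available():
      print(torch.cuda.get_device_name(0))    # e.g. a gfx900-class Vega device
      x = torch.rand(1024, 1024, device="cuda")
      print((x @ x).sum().item())             # small matmul to exercise the GPU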
Need help with getting results on Ryzen 5600G with RoCm 5.5 - PyTorch Forums
[https://discuss.pytorch.org/t/need-help-with-getting-results-on-ryzen-5600g-with-rocm-5-5/184408] - - public:mzimmerm
ROCm 5.xx ever planning to include gfx90c GPUs? · Issue #1743 · ROCm/ROCm
[https://github.com/ROCm/ROCm/issues/1743] - - public:mzimmerm
The suggested git build of PyTorch for gfx90c FAILED for me.
Doesn't ROCm support AMD's integrated GPU (APU)? · Issue #2216 · ROCm/ROCm
[https://github.com/ROCm/ROCm/issues/2216] - - public:mzimmerm
A commenter describes a successful installation of ROCm on Ubuntu; this looks workable for Tumbleweed as well. See the comment by nav9 from Jul 16, 2023.
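The workaround that keeps coming up in these issue threads for Vega-based APUs (gfx90c and similar) is to override the reported GFX version so the runtime treats the iGPU as a supported gfx900 part. A minimal sketch, assuming a Vega-class APU and a ROCm build of PyTorch; whether 9.0.0 is the right override for a particular chip should be checked against the thread:

  import os

  # Must be set before the ROCm runtime initializes, i.e. before importing torch.
  # Assumption: the iGPU is Vega-class, so gfx900 (9.0.0) is the version to report.
  os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "9.0.0")

  import torch
  if torch.cuda.is_available():
      print(torch.cuda.get_device_name(0))
  else:
      print("ROCm runtime still does not see the iGPU")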
Ditching CUDA for AMD ROCm for more accessible LLM training and inference. | by Rafael Manzano Masapanta | Medium
[https://medium.com/@rafaelmanzanom/ditching-cuda-for-amd-rocm-for-more-accessible-llm-inference-ryzen-apus-edition-92c3649f8f7d] - - public:mzimmerm
Train LLM on AMD APU. In this scenario, we’ll use an APU because most laptops with a Ryzen CPU include an iGPU; specifically, this post should work with iGPUs based on the “GCN 5.0” architecture, or “Vega” for friends. We’ll use an AMD Ryzen 2200G in this post, an entry-level processor equipped with 4C/4T and an integrated GPU.
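For context, a minimal inference sketch along the lines of the article, assuming a ROCm build of PyTorch plus the transformers library on such an APU; the model name "gpt2" is only a stand-in for whatever model the post actually uses:

  import torch
  from transformers import AutoModelForCausalLM, AutoTokenizer

  model_name = "gpt2"  # placeholder; the article picks its own LLM
  device = "cuda" if torch.cuda.is_available() else "cpu"  # ROCm GPU appears as "cuda"

  tokenizer = AutoTokenizer.from_pretrained(model_name)
  model = AutoModelForCausalLM.from_pretrained(model_name).to(device)

  inputs = tokenizer("Running an LLM on an AMD APU:", return_tensors="pt").to(device)
  output = model.generate(**inputs, max_new_tokens=20)
  print(tokenizer.decode(output[0], skip_special_tokens=True))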