AMD unwraps Ryzen AI 300 series ‘Strix Point’ processors — 50 TOPS of AI performance, Zen 5c density cores come to Ryzen 9 for the first time | Tom's Hardware
[https://www.tomshardware.com/pc-components/cpus/amd-unwraps-ryzen-ai-300-series-strix-point-processors-50-tops-of-ai-performance-zen-5c-density-cores-come-to-ryzen-9-for-the-first-time] - - public:mzimmerm
AMD Ryzen AI CPUs & Radeon 7000 GPUs Can Run Localized Chatbots Using LLMs Just Like NVIDIA's Chat With RTX
[https://wccftech.com/amd-ryzen-ai-cpus-radeon-7000-gpus-localized-chatbot-llms-like-nvidia-chat-with-rtx/] - - public:mzimmerm
LM Studio can be installed on Linux on a machine with an APU or a discrete GPU to run LLMs locally (though it looks like it may need the Ryzen AI CPU??). Install it on the laptop and test whether it works.
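A quick way to test that: LM Studio's built-in local server speaks the OpenAI chat-completions protocol (port 1234 by default), so once a model has been downloaded and loaded in the UI, a minimal Python sketch can confirm inference works. The model name below is a placeholder; LM Studio answers with whichever model is currently loaded.

    import requests

    # LM Studio's local server exposes an OpenAI-compatible endpoint;
    # 1234 is its default port, adjust if configured differently.
    resp = requests.post(
        "http://localhost:1234/v1/chat/completions",
        json={
            "model": "local-model",  # placeholder; the loaded model is used
            "messages": [{"role": "user", "content": "Say hello in one sentence."}],
            "temperature": 0.7,
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["choices"][0]["message"]["content"])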
AMD Unveils Ryzen 8000G Series Processors: Zen 4 APUs For Desktop with Ryzen AI
[https://www.anandtech.com/show/21208/amd-unveils-ryzen-8000g-series-processors-zen-4-apus-for-desktop-with-ryzen-ai] - - public:mzimmerm
The Ryzen 8000G is the desktop APU series with Ryzen AI.
Interesting cheap GPU option: Instinct Mi50 : LocalLLaMA
[https://www.reddit.com/r/LocalLLaMA/comments/1b5ie1t/interesting_cheap_gpu_option_instinct_mi50/] - - public:mzimmerm
AMD sells these Instinct accelerators, which are server compute cards similar to video cards but without display outputs.
Ditching CUDA for AMD ROCm for more accessible LLM training and inference. | by Rafael Manzano Masapanta | Medium
[https://medium.com/@rafaelmanzanom/ditching-cuda-for-amd-rocm-for-more-accessible-llm-inference-ryzen-apus-edition-92c3649f8f7d] - - public:mzimmerm
Train an LLM on an AMD APU. The post uses an APU because most laptops with a Ryzen CPU include an iGPU; specifically, it targets iGPUs based on the "GCN 5.0" architecture, better known as "Vega". The example machine is an AMD Ryzen 3 2200G, an entry-level 4C/4T processor with an integrated Vega GPU.
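The usual hurdle with these Vega iGPUs is that ROCm does not officially support them; the widely used workaround is to override the GFX version the runtime sees. A minimal smoke test, assuming a ROCm build of PyTorch is installed; the override value 9.0.0 is the commonly reported one for Vega/Raven Ridge parts and should be verified against the article for a given chip.

    import os

    # Unsupported Vega iGPUs (e.g. Raven Ridge in the 2200G) are commonly made
    # visible to ROCm by reporting a supported gfx900 part. The variable must be
    # set before the HIP runtime initializes, so it is usually exported in the
    # shell; setting it before importing torch is a best-effort equivalent.
    os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "9.0.0")

    import torch  # ROCm builds of PyTorch expose AMD GPUs through the torch.cuda API

    print("GPU visible:", torch.cuda.is_available())
    if torch.cuda.is_available():
        print("Device:", torch.cuda.get_device_name(0))
        x = torch.randn(1024, 1024, device="cuda")  # tiny matmul smoke test on the iGPU
        print("Matmul mean:", (x @ x).mean().item())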
I turned a $95 AMD APU into a 16GB VRAM GPU and it can run stable diffusion! The chip is 4600G. 5600G or 5700G also works. : Amd
[https://old.reddit.com/r/Amd/comments/15t0lsm/i_turned_a_95_amd_apu_into_a_16gb_vram_gpu_and_it/] - - public:mzimmerm
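The trick in that post is raising the BIOS UMA frame buffer so the iGPU gets 16GB of system RAM as dedicated VRAM; once a ROCm build of PyTorch sees the device, a short diffusers sketch is enough to try Stable Diffusion on it. The checkpoint name and fp16 setting below are illustrative assumptions, not taken from the post.

    import torch
    from diffusers import StableDiffusionPipeline

    # Assumes a ROCm build of PyTorch that exposes the APU's iGPU as a "cuda"
    # device and a BIOS UMA frame buffer large enough (~16 GB) to hold the model.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # example SD 1.x checkpoint
        torch_dtype=torch.float16,         # halves memory use; drop if it misbehaves on Vega
    )
    pipe = pipe.to("cuda")

    image = pipe("a photo of an astronaut riding a horse").images[0]
    image.save("out.png")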