AMD Instinct MI300X
Overview
A datacenter accelerator with 192 GB of HBM3 memory, used in cloud deployments by Microsoft, Oracle, and Meta.
Specs
| Spec | Value |
| --- | --- |
| VRAM | 192 GB HBM3 |
| Power draw | 750 W |
| Released | 2023 |
| MSRP | $15,000 |
| Backends | ROCm |
Models that fit
Open-weight models small enough to run on the AMD Instinct MI300X with usable context.
Frequently asked
What models can AMD Instinct MI300X run?
With 192 GB of VRAM, the AMD Instinct MI300X runs 70B models at full 16-bit precision (roughly 140 GB of weights), fits larger models in 4-bit quantization, and handles everything smaller. See the model list below for tested combinations.
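A quick way to sanity-check whether a model fits: weights take `parameters × bits / 8` bytes, plus headroom for the KV cache and activations. The 20% overhead factor below is a rough assumption, not a measured figure; real usage depends on context length and batch size.

```python
def vram_gb(params_billions: float, bits: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GB: weight size plus ~20% for KV cache/activations.

    The overhead factor is an assumption for illustration; long contexts or
    large batches need more.
    """
    return params_billions * bits / 8 * overhead

# 70B at FP16: 70 * 2 bytes * 1.2 = 168 GB -> fits in the MI300X's 192 GB
print(vram_gb(70, 16))  # 168.0
# 70B at 4-bit: 70 * 0.5 bytes * 1.2 = 42 GB
print(vram_gb(70, 4))   # 42.0
```

By this estimate a 70B model fits unquantized with room to spare for context, which is why the MI300X is unusual among single cards.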
Does AMD Instinct MI300X support CUDA?
No — AMD Instinct MI300X is an AMD card. Use ROCm (Linux) or the Vulkan backend in llama.cpp instead. CUDA-only tools won't work.
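For llama.cpp, the ROCm path means building with the HIP backend. The sketch below follows recent llama.cpp build documentation; the CMake flag names and the repository URL are assumptions that may change between versions, and `gfx942` is the MI300X's GPU architecture target.

```shell
# Sketch: build llama.cpp with the ROCm (HIP) backend on Linux.
# Flag names follow recent llama.cpp docs and may differ in older releases.
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_HIP=ON -DAMDGPU_TARGETS=gfx942 -DCMAKE_BUILD_TYPE=Release
cmake --build build --config Release -j

# Alternative: the Vulkan backend avoids installing the ROCm stack entirely.
# cmake -B build -DGGML_VULKAN=ON
```

Vulkan is slower than ROCm on this class of hardware but is a useful fallback when the ROCm driver stack is unavailable.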
Reviewed by RunLocalAI Editorial. See our editorial policy for how we research and verify hardware specifications.