nvidia
GPU
80GB VRAM
datacenter
NVIDIA A100 80GB SXM
Ampere datacenter flagship. 80GB HBM2e at 2 TB/s. Still common at cloud providers.
Released 2020
Overview
The A100 80GB SXM is NVIDIA's Ampere-generation datacenter flagship, pairing 80 GB of HBM2e with roughly 2 TB/s of memory bandwidth. It draws 400 W in the SXM form factor, launched in 2020 at a $17,000 list price, and remains widely available from cloud providers, which keeps it a common target for serving large open-weight models.
Specs
| VRAM | 80 GB |
| Power draw | 400 W |
| Released | 2020 |
| MSRP | $17,000 |
| Backends | CUDA |
Models that fit
Open-weight models small enough to run on the NVIDIA A100 80GB SXM with usable context.
Frequently asked
What models can NVIDIA A100 80GB SXM run?
With 80 GB of VRAM, the NVIDIA A100 80GB SXM runs 70B-class models at 4-bit quantization (roughly 35 GB of weights, leaving headroom for KV cache and activations), plus everything smaller. See the model list below for tested combinations.
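The back-of-the-envelope check behind that answer can be sketched in a few lines: quantized weight size is roughly parameter count times bits per weight, and the rest of the card holds KV cache and activations. The 12 GB overhead allowance below is an illustrative assumption, not a measured figure.

```python
def fits_in_vram(params_b: float, bits: int, vram_gb: float = 80.0,
                 overhead_gb: float = 12.0) -> bool:
    """Rough fit check: quantized weights plus a fixed allowance for
    KV cache and activations versus available VRAM.

    params_b: parameter count in billions; bits: quantization width.
    The overhead_gb default is an assumed allowance, not a benchmark.
    """
    # billions of params * (bits / 8) bytes per param ~= gigabytes
    weight_gb = params_b * bits / 8
    return weight_gb + overhead_gb <= vram_gb

# A 70B model at 4-bit: ~35 GB of weights, fits on an 80 GB card
print(fits_in_vram(70, 4))    # True
# The same model at 16-bit is ~140 GB of weights and does not fit
print(fits_in_vram(70, 16))   # False
```

This is why 70B is the practical ceiling for single-card 4-bit inference here, while 8-bit 70B (~70 GB of weights) is already marginal once cache is accounted for.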
Does NVIDIA A100 80GB SXM support CUDA?
Yes — NVIDIA A100 80GB SXM is an NVIDIA card with full CUDA support, the most mature local-AI backend. llama.cpp, Ollama, vLLM, and ExLlamaV2 all run natively.
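Before installing any of those backends, it is worth confirming the driver actually sees the card. A minimal sketch using `nvidia-smi` (which ships with the NVIDIA driver; the query flags used here are part of its standard CLI) and falling back gracefully on machines without one:

```python
import shutil
import subprocess

def gpu_summary() -> str:
    """Return 'name, total VRAM' per visible NVIDIA GPU, or a
    fallback message when no driver is installed."""
    if shutil.which("nvidia-smi") is None:
        return "no NVIDIA driver found"
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

print(gpu_summary())
```

On an A100 80GB SXM host this prints a line naming the card and its ~81920 MiB of memory; on a machine without the driver it reports the fallback string instead of raising.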
Reviewed by RunLocalAI Editorial. See our editorial policy for how we research and verify hardware specifications.