
NVIDIA H100 SXM

Hopper SXM5 — 80 GB of HBM3 at 3.35 TB/s. The workhorse GPU of frontier-model training. Cloud-rentable.

Released 2022

Overview

The H100 SXM pairs NVIDIA's Hopper architecture with 80 GB of HBM3 delivering 3.35 TB/s of memory bandwidth in the SXM5 form factor. At 700 W it is a datacenter part rather than a desktop card, but it is widely available to rent by the hour from cloud GPU providers.

Specs

VRAM: 80 GB
Power draw: 700 W
Released: 2022
MSRP: $30,000
Backends: CUDA

Models that fit

Open-weight models small enough to run on the NVIDIA H100 SXM with usable context.

Frequently asked

What models can NVIDIA H100 SXM run?

With 80 GB of VRAM, the NVIDIA H100 SXM runs 70B-class models at 4-bit quantization, plus everything smaller. See the model list below for tested combinations.
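The arithmetic behind that claim is simple: weights take roughly `params × bits / 8` bytes, plus headroom for the KV cache and activations. A minimal sketch (the 20% overhead factor is an assumption that varies with context length and backend):

```python
def fits_in_vram(params_b: float, bits: float, vram_gb: float = 80.0,
                 overhead: float = 1.2) -> bool:
    """Rough fit check: weight bytes = params (billions) * bits / 8,
    scaled by ~20% for KV cache and activations (assumed, not exact)."""
    weights_gb = params_b * bits / 8  # billions of params -> GB of weights
    return weights_gb * overhead <= vram_gb

# 70B at 4-bit: 35 GB of weights, ~42 GB with overhead -> fits in 80 GB
print(fits_in_vram(70, 4))
# 70B at FP16: 140 GB of weights alone -> does not fit
print(fits_in_vram(70, 16))
```

By this estimate a 70B model needs roughly 35 GB at 4-bit, leaving ample room for long contexts, while full FP16 would require a multi-GPU setup.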

Does NVIDIA H100 SXM support CUDA?

Yes — the NVIDIA H100 SXM is an NVIDIA card with full CUDA support, the most mature local-AI backend. llama.cpp, Ollama, vLLM, and ExLlamaV2 all run on it natively.
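Before pointing any of those backends at the card, it is worth confirming the driver sees it and reports the full 80 GB. A minimal sketch using `nvidia-smi`'s CSV query mode (the parsing helper is split out so it can be exercised without a GPU present):

```python
import subprocess

def gpu_vram_mib(query_output: str) -> list[int]:
    """Parse `nvidia-smi --query-gpu=memory.total --format=csv,noheader,nounits`
    output: one integer (MiB) per line, one line per visible GPU."""
    return [int(line.strip()) for line in query_output.splitlines() if line.strip()]

def detect_vram() -> list[int]:
    """Query the driver for total memory of each visible GPU, in MiB."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return gpu_vram_mib(out)
```

An H100 SXM should report a value in the ballpark of 81,000 MiB; a number far below that usually means another process is pinning memory or the wrong device is selected via `CUDA_VISIBLE_DEVICES`.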

Reviewed by RunLocalAI Editorial. See our editorial policy for how we research and verify hardware specifications.