1 parent 10611d8 commit fb43dee
docs/source/getting_started/installation/gpu/rocm.inc.md
@@ -13,6 +13,14 @@ vLLM supports AMD GPUs with ROCm 6.3.
Currently, there are no pre-built ROCm wheels.
+However, the [AMD Infinity hub for vLLM](https://hub.docker.com/r/rocm/vllm/tags) offers a prebuilt, optimized
+docker image designed for validating inference performance on the AMD Instinct™ MI300X accelerator.
+
+```{tip}
+Please check [LLM inference performance validation on AMD Instinct MI300X](https://rocm.docs.amd.com/en/latest/how-to/performance-validation/mi300x/vllm-benchmark.html)
+for instructions on how to use this prebuilt docker image.
+```
### Build wheel from source
0. Install prerequisites (skip if you are already in an environment/docker with the following installed):
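For the prebuilt image described in the diff above, the linked ROCm guide has the full instructions; as a rough sketch only (the `<tag>` placeholder and the device/permission flags below are assumptions based on common ROCm container usage, not taken from this commit), pulling and running the image could look like this:

```bash
# Sketch only: replace <tag> with an actual tag from
# https://hub.docker.com/r/rocm/vllm/tags
docker pull rocm/vllm:<tag>

# Typical ROCm container flags: expose the GPU device nodes (/dev/kfd, /dev/dri),
# join the video group, and give the container enough shared memory for vLLM.
docker run -it --rm \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video \
  --ipc=host --shm-size 16G \
  --security-opt seccomp=unconfined \
  rocm/vllm:<tag>
```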