Commit 001a9c7

[Doc] Update PaliGemma note to a warning (#14565)
Signed-off-by: DarkLight1337 <[email protected]>
1 parent 89cdaa8 commit 001a9c7

1 file changed: +7, -5 lines

docs/source/models/supported_models.md

Lines changed: 7 additions & 5 deletions
@@ -847,7 +847,7 @@ See [this page](#generative-models) for more information on how to use generativ
   * ✅︎
   * ✅︎
 - * `PaliGemmaForConditionalGeneration`
-  * PaliGemma (see note), PaliGemma 2 (see note)
+  * PaliGemma ⚠️, PaliGemma 2 ⚠️
   * T + I<sup>E</sup>
   * `google/paligemma-3b-pt-224`, `google/paligemma-3b-mix-224`, `google/paligemma2-3b-ft-docci-448`, etc.
   *
@@ -917,6 +917,12 @@ See [this page](#generative-models) for more information on how to use generativ
 <sup>E</sup> Pre-computed embeddings can be inputted for this modality.
 <sup>+</sup> Multiple items can be inputted per text prompt for this modality.

+:::{warning}
+vLLM does not currently support PrefixLM attention mask, so our PaliGemma implementation uses regular causal attention, which causes the model output to be unstable.
+
+We may deprecate this model series in a future release.
+:::
+
 :::{note}
 `h2oai/h2ovl-mississippi-2b` will be available in V1 once we support backends other than FlashAttention.
 :::
@@ -930,10 +936,6 @@ The official `openbmb/MiniCPM-V-2` doesn't work yet, so we need to use a fork (`
 For more details, please see: <gh-pr:4087#issuecomment-2250397630>
 :::

-:::{note}
-Currently the PaliGemma model series is implemented without PrefixLM attention mask. This model series may be deprecated in a future release.
-:::
-
 :::{note}
 To use Qwen2.5-VL series models, you have to install Hugging Face Transformers library from source via `pip install git+https://github.com/huggingface/transformers`.
 :::
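
For readers unfamiliar with the distinction the new warning refers to: PaliGemma is trained with a PrefixLM attention mask, in which the image and prompt tokens attend to each other bidirectionally and only the generated tokens are causal. The sketch below is not part of the commit; `prefix_len` and `total_len` are illustrative values, and it merely contrasts such a mask with the plain causal mask vLLM currently applies.

```python
# Minimal sketch (assumed, not from the commit) of the two mask styles.
# prefix_len = number of image + prompt tokens; total_len also counts generated tokens.
import torch

def causal_mask(total_len: int) -> torch.Tensor:
    # Standard causal mask: each position attends only to itself and earlier positions.
    return torch.tril(torch.ones(total_len, total_len, dtype=torch.bool))

def prefix_lm_mask(prefix_len: int, total_len: int) -> torch.Tensor:
    # PrefixLM mask: start from the causal mask, then allow full bidirectional
    # attention within the prefix (image + prompt); generated tokens stay causal.
    mask = causal_mask(total_len)
    mask[:prefix_len, :prefix_len] = True
    return mask

print(causal_mask(4).int())
print(prefix_lm_mask(prefix_len=2, total_len=4).int())
```

Running the snippet shows the prefix block of the PrefixLM mask filled in above the diagonal; that is the attention pattern a causal-only implementation loses, which is why outputs can drift from the reference model.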
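
For completeness, a hypothetical offline-inference sketch against one of the checkpoints listed in the table. The prompt string and image path are placeholders, not taken from the commit; the input format follows vLLM's standard multi-modal `LLM.generate` dictionary interface.

```python
# Hypothetical usage sketch for a listed PaliGemma checkpoint; the warning above
# still applies, so outputs may be unstable without PrefixLM attention.
from PIL import Image
from vllm import LLM

llm = LLM(model="google/paligemma-3b-mix-224")
image = Image.open("example.jpg")  # placeholder path to any local RGB image

outputs = llm.generate({
    "prompt": "caption en",                # PaliGemma-style task prompt (assumed)
    "multi_modal_data": {"image": image},  # vLLM multi-modal input dictionary
})
print(outputs[0].outputs[0].text)
```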
