[Bugfix][V1] Fix flashinfer sampling #14815

Merged: 4 commits into vllm-project:main, Mar 15, 2025

Conversation

@DefTruth (Contributor) commented Mar 14, 2025

Small fix based on #14788.
Currently, we get errors when using FlashInfer >= v0.2.3 for top-p & top-k sampling. As a workaround, we disable FlashInfer for top-p & top-k sampling by default when FlashInfer >= v0.2.3 is installed. FlashInfer v0.2.3 removed the `success` return value from all sampling APIs, which is not compatible with the earlier design. Reference: https://github.com/flashinfer-ai/flashinfer/releases/tag/v0.2.3
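The gating logic described above can be sketched as a simple version check (a minimal illustration, not vLLM's actual code; the function name is hypothetical):

```python
def _parse_version(v: str) -> tuple:
    """Parse a simple 'X.Y.Z' version string into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def flashinfer_topk_topp_usable(installed_version: str) -> bool:
    """Return True if FlashInfer top-p/top-k sampling can be enabled.

    Hypothetical helper: FlashInfer >= v0.2.3 dropped the `success`
    return value from its sampling APIs, so those versions are disabled
    by default until callers are updated for the new API.
    """
    return _parse_version(installed_version) < _parse_version("0.2.3")
```

With this gate, a 0.2.2 install would keep FlashInfer sampling enabled, while 0.2.3 and later would fall back to the default sampler.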


👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, they only run the fastcheck CI, which runs a small, essential subset of CI tests to quickly catch errors. You can run other CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@mergify mergify bot added the v1 label Mar 14, 2025
@WoosukKwon (Collaborator) left a comment


LGTM! Thanks for the PR!

@WoosukKwon WoosukKwon merged commit acaea3b into vllm-project:main Mar 15, 2025
13 of 14 checks passed
@DefTruth DefTruth deleted the fix-flashinfer-sampling branch March 15, 2025 11:05
@yzh119 commented Mar 25, 2025

Hi @WoosukKwon @DefTruth, it should be easy to switch to the new API. Maybe @xslingcn can help create a new PR for it?

@WoosukKwon (Collaborator) commented

@yzh119 Thanks for bringing this up. We do want to upgrade the version, but haven't had the bandwidth recently. We'd really appreciate it if @xslingcn or anyone else could take this!
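One way to migrate while supporting both conventions would be a small shim that normalizes the sampler's return value (a sketch with stand-in samplers, not the actual FlashInfer or vLLM symbols; pre-v0.2.3 APIs return `(samples, success)`, v0.2.3+ return only `samples`):

```python
def sample_compat(sampler, probs):
    """Call a FlashInfer-style sampling function and normalize its result.

    `sampler` is a stand-in for a top-k/top-p sampling op. Old-API
    samplers return a (samples, success) tuple; new-API samplers return
    samples directly.
    """
    out = sampler(probs)
    if isinstance(out, tuple):
        # Old API: unpack and check the per-request success flags.
        samples, success = out
        if not all(success):
            raise RuntimeError("FlashInfer sampling reported failure")
        return samples
    # New API (>= v0.2.3): samples only.
    return out

# Mock samplers illustrating the two return conventions.
old_api_sampler = lambda probs: ([2, 0], [True, True])
new_api_sampler = lambda probs: [2, 0]
```

Callers then use `sample_compat` uniformly, and the old-API branch can be dropped once the minimum supported FlashInfer version reaches v0.2.3.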

lulmer pushed a commit to lulmer/vllm that referenced this pull request Apr 7, 2025
nishith-fujitsu pushed a commit to nishith-fujitsu/vllm that referenced this pull request Apr 9, 2025
shreyankg pushed a commit to shreyankg/vllm that referenced this pull request May 3, 2025
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025