fix: use custom_llm_provider from kwargs if provided #9698

Open · wants to merge 1 commit into main
Conversation

@lwcobo commented Apr 2, 2025

Title

In litellm.main.aembedding, use custom_llm_provider from kwargs if provided

Relevant issues

For locally deployed embedding models, the provider cannot be inferred from the model name, so custom_llm_provider is passed explicitly via kwargs; aembedding should use that value rather than attempting inference.
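For context, a minimal sketch of the call pattern this targets (the model name, api_base URL, and input below are hypothetical; the point is that custom_llm_provider arrives via kwargs):

```python
import asyncio

import litellm


async def main() -> None:
    response = await litellm.aembedding(
        model="my-local-embedding-model",     # provider is not inferable from this name
        custom_llm_provider="openai",         # so the provider is passed explicitly
        api_base="http://localhost:8000/v1",  # hypothetical locally deployed server
        input=["hello world"],
    )
    # EmbeddingResponse.data is a list of per-input embedding entries
    print(len(response.data[0]["embedding"]))


asyncio.run(main())
```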

Pre-Submission checklist

N/A

Type

🐛 Bug Fix

Changes

```python
### EMBEDDING ENDPOINTS ####################
@client
async def aembedding(*args, **kwargs) -> EmbeddingResponse:
    ...
    _, custom_llm_provider, _, _ = get_llm_provider(
        model=model,
        custom_llm_provider=kwargs.get("custom_llm_provider", None),  # fix: forward caller-supplied provider
        api_base=kwargs.get("api_base", None),
    )
    ...
```
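Note that get_llm_provider already accepts a custom_llm_provider argument; the change simply forwards the value from kwargs, so a caller-supplied provider is honored instead of being re-inferred from the model string.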
