This issue was moved to a discussion.
You can continue the conversation there.
[Bug]: Unable to integrate self-hosted llm in LLMExtractionStrategy (provider - vllm) #933
crawl4ai version
0.5.0
Expected Behavior
Hi @unclecode, thanks for creating such a powerful open-source resource for us devs. I am using crawl4ai to extract and process data from some web URLs, but I am unable to integrate my own self-hosted LLM model. The LLMConfig should use the 'api_base' endpoint URL, and since the endpoint requires bearer authentication, the token should go in 'api_token'.
The LLM endpoint can be called directly with a Python request, along the lines of the sketch below.
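(The original snippet was not included in the issue. The following is a minimal sketch of such a call, assuming an OpenAI-compatible vLLM chat-completions endpoint behind bearer auth; the URL, token, and model name are placeholders, not values from the report.)

```python
import requests

# Placeholder endpoint and token standing in for the self-hosted vLLM deployment.
API_BASE = "https://my-vllm.example.com/v1"
API_TOKEN = "my-secret-token"

response = requests.post(
    f"{API_BASE}/chat/completions",
    headers={
        "Authorization": f"Bearer {API_TOKEN}",  # bearer auth required by the endpoint
        "Content-Type": "application/json",
    },
    json={
        "model": "my-model",  # model name served by vLLM
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(response.status_code, response.json())
```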
Current Behavior
With the current approach, it throws:
"litellm.APIError: APIError: Hosted_vllmException - <html>\r\n<head><title>301 Moved Permanently</title></head>\r\n<body>\r\n<center><h1>301 Moved Permanently</h1></center>\r\n<hr><center>cloudflare</center>\r\n</body>\r\n</html>"
Kindly help me integrate my self-hosted LLM into crawl4ai's LLMExtractionStrategy.
Is this reproducible?
Yes
Inputs Causing the Bug
Steps to Reproduce
Code snippets
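(No snippet was attached to the issue. Below is a minimal sketch of the kind of setup being attempted, reconstructed from the description above: the endpoint URL, token, model name, and instruction are placeholders, the `hosted_vllm/` provider prefix is inferred from the `Hosted_vllmException` in the error, and the `base_url` keyword is an assumption that may need to be adjusted to match the actual LLMConfig signature in 0.5.0, which the report refers to as 'api_base'.)

```python
import asyncio

from crawl4ai import AsyncWebCrawler, CrawlerRunConfig, LLMConfig
from crawl4ai.extraction_strategy import LLMExtractionStrategy

# Placeholder values for the self-hosted vLLM deployment.
API_BASE = "https://my-vllm.example.com/v1"
API_TOKEN = "my-secret-token"

llm_config = LLMConfig(
    provider="hosted_vllm/my-model",  # routed through LiteLLM's hosted_vllm provider
    api_token=API_TOKEN,              # bearer token for the endpoint
    base_url=API_BASE,                # endpoint URL (referred to as 'api_base' in the report)
)

strategy = LLMExtractionStrategy(
    llm_config=llm_config,
    instruction="Extract the main content of the page.",
)

async def main():
    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(
            url="https://example.com",
            config=CrawlerRunConfig(extraction_strategy=strategy),
        )
        print(result.extracted_content)

asyncio.run(main())
```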
OS
Windows
Python version
3.12.8
Browser
Chrome
Browser version
134.0.6998.178
Error logs & Screenshots (if applicable)