The LLMGraphTransformer class currently blocks the node_properties and relationship_properties parameters when the LLM in use lacks native function calling. This restriction (sketched in code below) forces users to choose between:

- using lightweight/open-source LLMs without structured property enforcement, or
- requiring expensive/closed models with function calling support.
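For reference, a minimal sketch of the failing combination. The model wrapper, property names, and exact error are illustrative, and the model is assumed here to have no usable tool calling:

```python
# Illustrative only: exact constructor arguments and error text may differ
# between langchain_experimental / langchain_neo4j versions.
from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_ollama import ChatOllama  # stand-in for a lightweight/OSS model

llm = ChatOllama(model="llama3.1")  # assumed: no reliable tool/function calling

# Currently rejected: node_properties / relationship_properties require an LLM
# with native function calling support.
transformer = LLMGraphTransformer(
    llm=llm,
    node_properties=["description"],
    relationship_properties=["evidence"],
)
```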
Issue #39 is indeed useful for models that support function calling, since specifying the method for with_structured_output can fix some formatting errors. However, not all models support function calling, due to provider constraints. For example, when using Together AI with the model meta-llama/Llama-3.3-70B-Instruct-Turbo-Free, you might encounter the following error:
BadRequestError: Error code: 400 - {'id': '', 'error': {'message': 'meta-llama/Llama-3.3-70B-Instruct-Turbo-Free is not supported for JSON mode/function calling', 'type': 'invalid_request_error', 'param': None, 'code': 'constraints_model'}}
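The error above can be reproduced outside the transformer as well. A sketch of the call pattern, assuming Together's OpenAI-compatible endpoint and a TOGETHER_API_KEY environment variable:

```python
import os
from langchain_openai import ChatOpenAI
from pydantic import BaseModel

class Node(BaseModel):
    id: str
    type: str

llm = ChatOpenAI(
    model="meta-llama/Llama-3.3-70B-Instruct-Turbo-Free",
    base_url="https://api.together.xyz/v1",   # OpenAI-compatible endpoint (assumption)
    api_key=os.environ["TOGETHER_API_KEY"],
)

# with_structured_output requests JSON mode / tool calling from the provider,
# which this model rejects with the 400 'constraints_model' error shown above.
llm.with_structured_output(Node).invoke(
    "Extract one entity from: Marie Curie won two Nobel Prizes."
)
```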
Additionally, setting the ignore_tool_usage parameter to True forces the transformer to skip function calling entirely and fall back to prompt-based extraction.
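For context, a sketch of how that flag is passed today; the behavior described in the comments reflects my reading of the current implementation and should be treated as an assumption:

```python
from langchain_experimental.graph_transformers import LLMGraphTransformer

# Skips with_structured_output entirely and falls back to plain prompting,
# but the property parameters remain unavailable on this code path.
transformer = LLMGraphTransformer(
    llm=llm,                  # e.g. the Together-hosted model above
    ignore_tool_usage=True,
    # node_properties=["description"],   # still blocked today
)
```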
Given these constraints, there is a clear need for a new prompt or mechanism that gracefully handles models without function calling support. It should produce structured output without relying on function calling, ensuring compatibility with a broader range of models, and it must also cover extraction of both node_properties and relationship_properties, so that users can still obtain detailed, structured data from models that do not natively support function calling.
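A rough sketch of what such a fallback could look like: a plain prompt that asks the model to emit JSON containing node and relationship properties, parsed with a JSON output parser and no tool calling. The prompt text, schema shape, and helper name are illustrative, not the library's actual implementation:

```python
# Sketch of a prompt-only extraction path that still returns properties.
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import JsonOutputParser

UNSTRUCTURED_PROPERTIES_PROMPT = ChatPromptTemplate.from_messages([
    ("system",
     "Extract a knowledge graph from the text. Respond ONLY with JSON of the form:\n"
     '{{"nodes": [{{"id": "...", "type": "...", "properties": {{"key": "value"}}}}],\n'
     ' "relationships": [{{"source": "...", "target": "...", "type": "...",\n'
     '                    "properties": {{"key": "value"}}}}]}}\n'
     "Only use these node property keys: {node_properties}. "
     "Only use these relationship property keys: {relationship_properties}."),
    ("human", "{input}"),
])

def extract_graph(llm, text, node_properties, relationship_properties):
    """Prompt-based extraction (hypothetical helper), no function calling needed."""
    chain = UNSTRUCTURED_PROPERTIES_PROMPT | llm | JsonOutputParser()
    return chain.invoke({
        "input": text,
        "node_properties": ", ".join(node_properties),
        "relationship_properties": ", ".join(relationship_properties),
    })

# Example:
# graph = extract_graph(llm, "Marie Curie, born in Warsaw, won the Nobel Prize.",
#                       node_properties=["birth_place"],
#                       relationship_properties=["year"])
```

The returned dict would still need validation against allowed_nodes/allowed_relationships and conversion into GraphDocument objects, but it shows that property extraction does not inherently require function calling.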