Summary:
I encountered a TypeError while using the LLMPredictor class with a custom FlanLLM class. It appears to be related to the _identifying_params method.
Details:
When trying to create an instance of LLMPredictor with llm=FlanLLM(), I received the following error:
TypeError: 'method' object is not iterable
Expected Behavior
I expected to create an instance of LLMPredictor successfully using my custom FlanLLM class without encountering any errors.
Actual Behavior
I received a TypeError when attempting to create an instance of LLMPredictor. The error message indicates that there's an issue with the _identifying_params method in the FlanLLM class.
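For context, this class of TypeError is what Python raises when a bound method object (rather than its return value) is passed to something that tries to iterate it, e.g. `dict()`. A minimal, hypothetical example (not the actual library code) that reproduces the same message:

```python
# Hypothetical stand-in class, only to demonstrate the error message.
class Example:
    def _identifying_params(self):
        return {"name_of_model": "google/flan-t5-large"}

e = Example()
try:
    # Note the missing (): this passes the method object itself to dict()
    dict(e._identifying_params)
except TypeError as err:
    print(err)  # 'method' object is not iterable
```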
Steps to Reproduce
Create a custom FlanLLM class as follows:

```python
class FlanLLM(LLM):
    model_name = "google/flan-t5-large"
    pipeline = pipeline("text2text-generation", model=model_name, device=0,
                        model_kwargs={"torch_dtype": torch.bfloat16})

    def _call(self, prompt, stop=None):
        return self.pipeline(prompt, max_length=9999)[0]["generated_text"]

    def _identifying_params(self):
        return {"name_of_model": self.model_name}

    def _llm_type(self):
        return "custom"
```

Then attempt to create an instance of LLMPredictor:

```python
llm_predictor = LLMPredictor(llm=FlanLLM())
```
I have ensured that the FlanLLM class correctly inherits from the LLM class.
The error occurs at the line where LLMPredictor is instantiated.
I am using the appropriate versions of the required libraries and packages.
Screenshots or Log Output
[If applicable, include screenshots or log output that may help diagnose the issue.]
Possible Solutions
I'm not sure what is causing this TypeError. It appears to be related to the _identifying_params method, but I'm unsure how to resolve it. Any guidance or suggestions would be greatly appreciated.
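One likely cause: in LangChain's `LLM` base class, `_identifying_params` and `_llm_type` are declared as properties. Overriding them as plain methods means callers that expect a mapping receive a bound method instead, which raises exactly this TypeError when iterated. A sketch of the probable fix, with a stub base class so the snippet is self-contained (in real code, `LLM` comes from langchain):

```python
from typing import Any, Mapping

# Minimal stand-in for the LangChain LLM base class, just to make this
# sketch runnable on its own.
class LLM:
    pass

class FlanLLM(LLM):
    model_name = "google/flan-t5-large"

    @property  # declared as a @property to match the LLM base class
    def _identifying_params(self) -> Mapping[str, Any]:
        return {"name_of_model": self.model_name}

    @property
    def _llm_type(self) -> str:
        return "custom"

llm = FlanLLM()
print(dict(llm._identifying_params))  # now a real dict, iterable as expected
```

If this is the cause, adding the `@property` decorator to both methods in the original class should resolve the error.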
Steps Taken to Resolve
I have reviewed my code, checked for typos, and ensured that the method names and parameters match the expected format. However, I have not been able to resolve this issue on my own.
Note: Please let me know if you need any additional information or if there are specific steps I should take to troubleshoot this issue further.