
PandasAI not working with LLM from HuggingFace via HuggingFaceTextGen method. Not able to Chat #1575

Open
SimranAnand1 opened this issue Jan 31, 2025 · 20 comments
Labels
bug Something isn't working

Comments

@SimranAnand1

Discussed in #1574

Originally posted by SimranAnand1 January 31, 2025
Hi, I am facing this issue: AttributeError: 'NoneType' object has no attribute 'type'

when trying to use an LLM other than BambooLLM. My LLM is a pandasai_huggingface.huggingface_text_gen.HuggingFaceTextGen object.

I have followed the PandasAI docs:
https://docs.getpanda.ai/v3/large-language-models#huggingface-models

The repository's code has changed significantly, and I see that no LLMs other than BambooLLM are present in the llm folder now, although earlier they were compatible. Please help me with code that would work for a custom LLM (a HuggingFaceTextGen model). I have tried a lot to make it work, but the LLM is always identified as a None object.

Thank you in advance! :)

@SimranAnand1
Author

I always get this error as the response:

raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

My code was:

import pandasai as pai
from pandasai_huggingface import HuggingFaceTextGen

llm = HuggingFaceTextGen(inference_server_url="http://127.0.0.1:8080")

pai.config.set({"llm": llm})

Please look into it, and let me know whether there is a way to make HuggingFaceTextGen models compatible with PandasAI.

@gdcsinaptik
Collaborator

Hey @SimranAnand1, with 3.0, each LLM lives in its own package, which keeps the library more lightweight. Here is the documentation, which I see you already found: https://docs.getpanda.ai/v3/large-language-models. The LLMs other than Bamboo are now in the extensions folder. How did you find the installation package?
Thanks also for opening the ticket, we will look into it

@SimranAnand1
Author

Hey @gdcsinaptik, thanks for the response. I have been trying hard to follow the docs and implement this, but with the HuggingFaceTextGen model I keep getting this error: raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

Since HuggingFaceTextGen.call() returns a string (as seen in https://github.com/sinaptik-ai/pandas-ai/blob/f6673670ddde7c0e2531f0a103e5dd7d73330b7d/extensions/llms/huggingface/pandasai_huggingface/huggingface_text_gen.py), I think we should not attempt res.json() inside the library.

The library should remove the res.json() line inside the chat function of pandasai. Please let me know whether HuggingFaceTextGen models work with the PandasAI library or not.
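The decode failure can be reproduced with the standard library alone: requests' res.json() ultimately calls json.loads() on the response body, and a plain-text completion is not valid JSON, so decoding fails at the very first character, producing exactly the message in the traceback above. A minimal sketch (the sample completion string is made up):

```python
import json

# res.json() in requests ultimately calls json.loads() on the body.
# A text-generation server returning plain text is not valid JSON,
# so decoding fails immediately at character 0.
plain_text_completion = "The result is 42."

try:
    json.loads(plain_text_completion)
except json.JSONDecodeError as exc:
    print(exc)  # Expecting value: line 1 column 1 (char 0)
```

This is why the error appears regardless of what the model generates: any non-JSON body triggers it.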

I found the installation package from this page: https://github.com/sinaptik-ai/pandas-ai/tree/f6673670ddde7c0e2531f0a103e5dd7d73330b7d/extensions/llms/huggingface

Looking forward to the solution soon! Thanks :)

@SimranAnand1
Author

> Hey @SimranAnand1 with the 3.0, LLMs have different packages to use different LLMs. In this way the library is more lightweight. Here is it the documentation, which I see you already found: https://docs.getpanda.ai/v3/large-language-models. The llms different from Bamboo are now in the extensions folder. How didi you find the installation package? Thanks also for opening the ticket, we will look into it

The call function inside the file https://github.com/sinaptik-ai/pandas-ai/blob/main/extensions/llms/huggingface/pandasai_huggingface/huggingface_text_gen.py returns resp.generated_text which is what is making pai.chat function throw a JSON Decode Error @gdcsinaptik

I would suggest the llm class of PandasAI needs modification to incorporate other LLM instances such as HuggingFaceTextGen.
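One possible direction for that modification, sketched here under loud assumptions: BaseLLM below is a minimal stand-in for PandasAI's real base class (the actual class name and method signatures differ across versions and are not taken from the library), and TextGenAdapter is a hypothetical wrapper. The point is the interface shape: a non-None `type` property and a `call()` returning a string, which is what the "'NoneType' object has no attribute 'type'" traceback suggests is missing.

```python
class BaseLLM:
    """Stand-in for PandasAI's LLM base class (illustrative only)."""

    @property
    def type(self) -> str:
        raise NotImplementedError

    def call(self, instruction: str, context=None) -> str:
        raise NotImplementedError


class TextGenAdapter(BaseLLM):
    """Hypothetical adapter around a text-generation inference server."""

    def __init__(self, inference_server_url: str):
        self.inference_server_url = inference_server_url

    @property
    def type(self) -> str:
        # Returning a real string here is what the failing
        # attribute lookup on a None LLM needs.
        return "huggingface-text-gen"

    def call(self, instruction: str, context=None) -> str:
        # A real implementation would POST `instruction` to the
        # inference server and return the generated text; stubbed here.
        return f"(response from {self.inference_server_url})"


llm = TextGenAdapter("http://127.0.0.1:8080")
print(llm.type)  # huggingface-text-gen
```

Whether PandasAI's internals accept such an adapter depends on the version; this only illustrates the contract being asked for.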

@SimranAnand1
Author


The old version of PandasAI was compatible with HuggingFace models, but the git repository has now removed all LLMs other than BambooLLM from the llm folder. And the extensions are not compatible with the library, because chat() and call() throw a JSON decode error when returning responses.

@gdcsinaptik
Collaborator

@SimranAnand1 you should be able to use the OpenAI LLM extension without issue. We are investigating the HuggingFace-related issue in 3.0 :)

@SimranAnand1
Author

@gdcsinaptik Thank you. Since I would like to go with HuggingFace models, I will try the 2.4 version of PandasAI and wait for the fix. I wanted to know whether the 2.4 version requires installing the pandasai-huggingface extension, because I see the pandasai.llm folder (that works with 2.4) has no HF models in it. I would be grateful if you could share docs or a guide on how to implement the chat method with the 2.4 version. Thanks a lot!

@SimranAnand1
Author

@gdcsinaptik The pandasai-huggingface library is needed to use HuggingFace LLMs; however, when I try using it with pandasai version 2.4 it says "pandasai-huggingface requires pandasai version >=3.0.0b4", due to which I am unable to go ahead with 2.4.

Please look into it and let me know the workaround. Thank you
@gventuri

@SimranAnand1
Author

As per v2, the docs say:

from pandasai.llm import HuggingFaceTextGen
from pandasai import SmartDataframe

llm = HuggingFaceTextGen(
    inference_server_url="http://127.0.0.1:8080"
)
df = SmartDataframe("data.csv", config={"llm": llm})

But it doesn't work, because pandasai.llm no longer has the HuggingFaceTextGen class

@gdcsinaptik
Collaborator

@SimranAnand1 have you reverted your local library to 2.4.2?
pip install "pandasai==2.4.2"

@SimranAnand1
Author

SimranAnand1 commented Feb 3, 2025

@gdcsinaptik I did that, but got this error: pandasai-huggingface 0.1.3 requires pandasai>=3.0.0b4, but you have pandasai 2.4.2 which is incompatible.
Because of this, I am not able to run the HuggingFace models with 2.4.2.

Also, after reverting I get:
lib/python3.10/site-packages/pandasai_huggingface/huggingface_text_gen.py", line 5, in
from pandasai.core.prompts.base import BasePrompt
ModuleNotFoundError: No module named 'pandasai.core'

@gdcsinaptik
Collaborator

gdcsinaptik commented Feb 3, 2025

Please @SimranAnand1 do not use the documentation from 3.0 while using 2.4. On the documentation main page there is a version picker at the top left: click 2 when using 2.x and 3 when using 3.x, so that the docs you read match the library version you are using.
pip install "pandasai==2.4.2"
How to use HuggingFace models in 2.4: https://docs.getpanda.ai/v2/llms#huggingface-via-text-generation
Please read the correct documentation carefully and make sure your code matches the pandasai version you are using.
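A quick way to confirm which docs to follow is to check the installed version programmatically; `installed_version` below is a hypothetical helper built only on the standard library:

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(package: str):
    """Return the installed version string for `package`,
    or None if it is not installed in this environment."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None


# Sanity check before following version-specific docs: the result for
# "pandasai" should start with "2." when reading the v2 docs and "3."
# when reading the v3 docs.
print(installed_version("pandasai"))
```

This catches the mismatch early, before any chat() call is made against the wrong API.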

@SimranAnand1
Author

@gdcsinaptik I have now followed the v2 docs you shared and got a max-retries error (my inference URL is on port 8099 and the proxy is correctly enabled too):

raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=8099): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7b7a7c6f11e0>: Failed to establish a new connection: [Errno 111] Connection refused'))
Exception in APILogger: HTTPSConnectionPool(host='api.domer.ai', port=443): Max retries exceeded with url: /api/log/add (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at 0x7b7a7c6f1a50>: Failed to resolve 'api.domer.ai' ([Errno -2] Name or service not known)"))
LLM Test Response: Unfortunately, I was not able to answer your question, because of the following error:

HTTPConnectionPool(host='127.0.0.1', port=8099): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7b7a7c6f11e0>: Failed to establish a new connection: [Errno 111] Connection refused'))
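The "[Errno 111] Connection refused" in the traceback above comes from the TCP layer rather than from PandasAI: nothing was accepting connections on 127.0.0.1:8099 at the time of the request. A stdlib-only probe (a sketch; the host and port are taken from the traceback, and `port_is_open` is a hypothetical helper) can confirm whether the inference server is actually listening before pointing PandasAI at it:

```python
import socket


def port_is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if something accepts TCP connections on host:port.

    A False result means the '[Errno 111] Connection refused' error
    comes from the server side (container down, crashed, or bound to
    a different port/interface), not from the PandasAI library.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Probe the inference endpoint from the traceback before calling chat().
print(port_is_open("127.0.0.1", 8099))
```

If this prints False, checking `docker ps` and the container's port mapping is the place to start, independently of any PandasAI configuration.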

@SimranAnand1
Author

SimranAnand1 commented Feb 3, 2025

@gdcsinaptik I have checked and tested again with the documentation for version 2.4.2, but even though the inference URL of the HF model is running (via a Docker container), the Agent does not generate a response due to MaxRetries exceeded and a connection error while connecting to api.domer.ai.

I had faced this issue before as well, before moving to the new version. How do I fix this Errno 111 error?
Does the 2.4.2 version no longer allow requests due to the broken connection?

@SimranAnand1
Author

After setting the config to a HuggingFace LLM model, the chat function does not work, because the only supported LLM is Bamboo. Could you please check whether you are able to get any response via any HuggingFace model (other than OpenAI) with version 2.4.2? I have been facing the api.domer.ai-related error with the 2.4.2 version.

@gdcsinaptik How about local LM Studio: is that compatible with the new version or the old version? Because the HF models are not working, I am planning to try local LM Studio now. Please let me know

@SimranAnand1
Author

SimranAnand1 commented Feb 3, 2025

After restarting the Docker container running the HuggingFace inference server, I get: ConnectionResetError: [Errno 104] Connection reset by peer. It seems the inference URL is not running, hence no response is returned.

@SimranAnand1
Author

@gdcsinaptik could you please let me know the fix for this: with version 2.4.2, even though the inference URL of the HF model is running (via a Docker container), the Agent does not generate a response due to MaxRetries exceeded and a connection error while connecting to api.domer.ai.

I had faced this issue before as well, before moving to the new version. How do I fix this Errno 111 error?
raise ConnectionError(e, request=request)
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=8099): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7b7a7c6f11e0>: Failed to establish a new connection: [Errno 111] Connection refused'))
Exception in APILogger: HTTPSConnectionPool(host='api.domer.ai', port=443): Max retries exceeded with url: /api/log/add (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at 0x7b7a7c6f1a50>: Failed to resolve 'api.domer.ai' ([Errno -2] Name or service not known)"))
LLM Test Response: Unfortunately, I was not able to answer your question, because of the following error:

HTTPConnectionPool(host='127.0.0.1', port=8099): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7b7a7c6f11e0>: Failed to establish a new connection: [Errno 111] Connection refused'))
Does the 2.4.2 version no longer allow requests due to the broken connection?

Otherwise, I shall wait for the new version to be compatible with HF models

@gventuri gventuri added the bug Something isn't working label Feb 4, 2025
@SimranAnand1
Author

Any update on the fix for this bug? It seems there is an incompatibility with the text-generation inference server connection. I would really like to know a workaround so I can use PandasAI seamlessly with HuggingFace models.

@gventuri
Collaborator

gventuri commented Feb 6, 2025

Hi @SimranAnand1, unfortunately no news yet. We are working on integrating a library that handles support for all LLMs for us, so that we can focus on the core value of PandasAI. Will let you know as soon as we have news!

@SimranAnand1
Author

SimranAnand1 commented Feb 6, 2025 via email

Development

No branches or pull requests

3 participants