PandasAI not working with LLM from HuggingFace via HuggingFaceTextGen method. Not able to Chat #1575
Comments
I always get this error as the response: raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
My code was:
import pandasai as pai
llm = HuggingFaceTextGen(inference_server_url="http://127.0.0.1:8080")
pai.config.set({"llm": llm})
Please look into it and let me know whether there is a way to make HuggingFaceTextGen models compatible with PandasAI.
Hey @SimranAnand1, with 3.0, different LLMs live in separate packages, which keeps the core library more lightweight. Here is the documentation, which I see you already found: https://docs.getpanda.ai/v3/large-language-models. The LLMs other than Bamboo are now in the extensions folder. How did you find the installation package?
Hey @gdcsinaptik, thanks for the response. I have been trying hard to follow the docs, but with the HuggingFaceTextGen model I keep getting this error: raise RequestsJSONDecodeError(e.msg, e.doc, e.pos). Since HuggingFaceTextGen.call() returns a string (as seen in https://github.com/sinaptik-ai/pandas-ai/blob/f6673670ddde7c0e2531f0a103e5dd7d73330b7d/extensions/llms/huggingface/pandasai_huggingface/huggingface_text_gen.py), I think the library should not attempt res.json() on that result inside the chat function; that line should be removed. Please let me know whether HuggingFaceTextGen models work with the PandasAI library or not. I found the installation package on this page: https://github.com/sinaptik-ai/pandas-ai/tree/f6673670ddde7c0e2531f0a103e5dd7d73330b7d/extensions/llms/huggingface. Looking forward to a solution soon! Thanks :)
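A minimal sketch of the failure mode being described (hypothetical values, not the actual PandasAI internals): if a caller tries to JSON-decode something that is already a plain generated string, it fails exactly the way the traceback shows.

```python
# Hypothetical illustration of the mismatch described above, not PandasAI source:
# an LLM wrapper whose call() returns a plain string cannot be fed to a JSON decoder.
import json

generated_text = "df['sales'].sum()"      # what a call() that returns a str might produce
try:
    json.loads(generated_text)             # not valid JSON, so this raises
except json.JSONDecodeError as exc:
    print(f"JSONDecodeError, as in the traceback above: {exc}")
```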
The call function in https://github.com/sinaptik-ai/pandas-ai/blob/main/extensions/llms/huggingface/pandasai_huggingface/huggingface_text_gen.py returns resp.generated_text, which is what makes the pai.chat function throw a JSON decode error. @gdcsinaptik I would suggest the LLM class of PandasAI needs modification to support other LLM instances such as HuggingFaceTextGen.
The old version of PandasAI was compatible with HuggingFace models, but the git repository has now removed all LLMs from the llm folder apart from BambooLLM. And the extensions are not compatible with the library, because chat() and call() throw a JSON decode error when returning responses.
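One possible workaround sketch for the 2.x line, not a confirmed PandasAI API: it assumes pandasai.llm.base.LLM still exposes a call() method and a type property (which may differ in your installed version), and that a text-generation-inference server answers /generate on the given URL.

```python
# Rough sketch of a custom LLM wrapper around a local TGI server.
# The base-class interface shown here is assumed, not verified against 2.4.2.
import requests
from pandasai.llm.base import LLM


class LocalTextGenLLM(LLM):
    """Sends the rendered prompt to a local text-generation-inference server."""

    def __init__(self, inference_server_url: str = "http://127.0.0.1:8080"):
        self.inference_server_url = inference_server_url

    @property
    def type(self) -> str:
        return "local-text-gen"

    def call(self, instruction, context=None) -> str:
        prompt = str(instruction)  # simplification; the real prompt object may need .to_string()
        resp = requests.post(
            f"{self.inference_server_url}/generate",
            json={"inputs": prompt, "parameters": {"max_new_tokens": 512}},
            timeout=120,
        )
        resp.raise_for_status()
        # TGI's /generate returns JSON like {"generated_text": "..."}
        return resp.json()["generated_text"]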
@SimranAnand1 you should be able to use the OpenAI LLM extension without issue. We are investigating the HuggingFace models issue in the 3.0 :)
@gdcsinaptik Thank you. Since I would like to stay with HuggingFace models, I will try the 2.4 version of PandasAI and wait for the fix. I wanted to know whether the 2.4 version requires installing the pandasai-huggingface extension, because I see the pandasai.llm folder (that works with 2.4) has no HF models in it. I would be grateful if you could share docs or a guide on how to implement the chat method with the 2.4 version. Thanks a lot!
@gdcsinaptik The pandasai-huggingface library is needed to use HuggingFace LLMs; however, when I try using it with pandasai version 2.4 it says "pandasai-huggingface requires pandasai version >=3.0.0b4", due to which I am unable to proceed with 2.4. Please look into it and let me know the workaround. Thank you
As per the v2 docs, the example says: llm = HuggingFaceTextGen( ... but it doesn't work, because pandasai.llm does not have the HuggingFaceTextGen class anymore.
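For reference, this is roughly the shape of the v2-era snippet the docs describe (a sketch assuming a 2.x install where HuggingFaceTextGen still ships inside pandasai.llm, and a text-generation-inference server running locally; file name and port are placeholders):

```python
# Approximate v2-style usage; only works if the installed pandasai 2.x
# still exposes HuggingFaceTextGen under pandasai.llm.
from pandasai import SmartDataframe
from pandasai.llm import HuggingFaceTextGen

llm = HuggingFaceTextGen(inference_server_url="http://127.0.0.1:8080")
df = SmartDataframe("data.csv", config={"llm": llm})
print(df.chat("How many rows are in the dataset?"))
```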
@SimranAnand1 have you reverted your local library to 2.4.2?
@gdcsinaptik I did that but got this error: pandasai-huggingface 0.1.3 requires pandasai>=3.0.0b4, but you have pandasai 2.4.2 which is incompatible. Also, after reverting I get:
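If helpful, a minimal way to perform and verify the revert (assumed cleanup steps based on the error message above; ideally run in a fresh virtual environment so the 3.x extension does not pin pandasai forward):

```python
# Assumed cleanup for the version conflict above (run in a fresh venv):
#   pip uninstall -y pandasai pandasai-huggingface
#   pip install "pandasai==2.4.2"
# Then confirm which version is actually installed:
from importlib.metadata import version

print(version("pandasai"))   # expect 2.4.2 after reverting
```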
Please @SimranAnand1 do not use the documentation for 3.0 while using 2.4. On the documentation main page there is a version picker at the top left: select 2 when using 2.x and 3 when using 3.x, so that the docs you read and the code you copy match the library version you are running.
@gdcsinaptik I have followed the v2 docs you shared and now get a max-retries error (my inference URL is on port 8099 and the proxy is correctly enabled too):
raise ConnectionError(e, request=request) HTTPConnectionPool(host='127.0.0.1', port=8099): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7b7a7c6f11e0>: Failed to establish a new connection: [Errno 111] Connection refused'))
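Errno 111 generally means nothing is listening on that host and port from where the client runs. A quick sanity check, under the assumption that the server is a text-generation-inference container exposing a /health route on the same port the client is configured with:

```python
import requests

base_url = "http://127.0.0.1:8099"   # must match the port published by the container
try:
    # TGI containers typically expose /health; adjust the path if your server differs
    r = requests.get(f"{base_url}/health", timeout=5)
    print("server reachable, status:", r.status_code)
except requests.exceptions.ConnectionError as exc:
    # if this fires, check the docker port mapping (e.g. -p 8099:80) and any proxy settings
    print("connection refused:", exc)
```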
@gdcsinaptik I have checked and tested again with the documentation for version 2.4.2, but although the inference URL of the HF model is running (via a Docker container), the Agent generates no response due to MaxRetries exceeded and a connection error while connecting to api.domer.ai. I had faced this issue before as well, before moving to the new version. How do I fix this Errno 111 error?
After setting the config to a HuggingFace LLM, the chat function does not work because only Bamboo is supported. Could you please check whether you can get any response from a HuggingFace model (other than OpenAI) with version 2.4.2? I have been facing the api.domer.ai-related error with 2.4.2. @gdcsinaptik What about local LM Studio: is that compatible with the new version or the old one? Because HF models are not working, I am planning to try LM Studio locally now. Please let me know.
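For LM Studio with the 2.x line, one hedged sketch: it assumes the installed 2.x release ships pandasai.llm.local_llm.LocalLLM (an OpenAI-compatible client) and that LM Studio's local server is running on its default port; the model name and file path are placeholders.

```python
# Sketch only: LocalLLM availability and its parameters are assumed, not verified for 2.4.2.
from pandasai import SmartDataframe
from pandasai.llm.local_llm import LocalLLM

llm = LocalLLM(
    api_base="http://localhost:1234/v1",   # LM Studio's default OpenAI-compatible endpoint
    model="local-model",                    # placeholder; use the model name LM Studio reports
)
df = SmartDataframe("data.csv", config={"llm": llm})
print(df.chat("How many rows are in the dataset?"))
```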
After restarting the Docker container running the HuggingFace inference server, I get: ConnectionResetError: [Errno 104] Connection reset by peer. It seems the inference URL is not running, hence no response is returned.
@gdcsinaptik could you please let me know the fix for this? For version 2.4.2, although the inference URL of the HF model is running (via a Docker container), the Agent generates no response due to MaxRetries exceeded and a connection error while connecting to api.domer.ai. I had faced this issue before as well, before moving to the new version. How do I fix this Errno 111 error?
HTTPConnectionPool(host='127.0.0.1', port=8099): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7b7a7c6f11e0>: Failed to establish a new connection: [Errno 111] Connection refused'))
Otherwise, I shall wait for a new version that is compatible with HF models.
Any update on the fix for this bug? There seems to be an incompatibility with the text-generation inference server connection. I would really like to know a workaround so I can use PandasAI seamlessly with HuggingFace models.
Hi @SimranAnand1, unfortunately no news yet. We are working on integrating a library that handles support for all LLMs for us, so that we can focus on the core value of PandaAI. Will let you know as soon as we have news!
Okay sure, thank you. Looking forward to it for seamless integration of models.
Discussed in #1574
Originally posted by SimranAnand1 January 31, 2025
Hi, I am facing this issue: AttributeError: 'NoneType' object has no attribute 'type'
when I try to use an LLM other than BambooLLM. My LLM is a pandasai_huggingface.huggingface_text_gen.HuggingFaceTextGen object.
I have followed as given in the docs of PandasAI library:
https://docs.getpanda.ai/v3/large-language-models#huggingface-models
The repository's code has changed significantly, and I see that no LLMs other than BambooLLM are present in the llm folder now, although earlier they were compatible. Please help me with code that works with a custom LLM (the HuggingFaceTextGen model). I have tried a lot to make it work, but the LLM is always identified as a None object.
Thank you in advance! :)