
Support Bailing LLM from ALIPAY for #3487 #3543

Open · wants to merge 6 commits into main

Conversation


cuauty commented Sep 30, 2024

Why are these changes needed?

This PR implements support for the Bailing LLM. Bailing provides an HTTP endpoint for inference, which can be accessed via HTTP POST.
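For reference, a minimal sketch of calling such an endpoint; the URL, environment variable names, headers, and payload/response fields below are illustrative assumptions, not the actual Bailing API contract.

```python
# Hypothetical example: BAILING_API_URL, BAILING_API_KEY, and all payload/response
# field names are assumptions for illustration only.
import os
import requests

def query_bailing(prompt: str) -> str:
    """Send one prompt to a Bailing-style HTTP endpoint via POST and return the reply."""
    url = os.environ.get("BAILING_API_URL", "https://example.com/v1/chat")
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ.get('BAILING_API_KEY', '')}",
    }
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
        "top_p": 0.9,
        "max_tokens": 512,
    }
    resp = requests.post(url, json=payload, headers=headers, timeout=60)
    resp.raise_for_status()
    # Response layout is assumed to follow the common chat-completion shape.
    return resp.json()["choices"][0]["message"]["content"]
```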

Related issue number (if applicable)

Closes #3487

Checks

  • I've run format.sh to lint the changes in this PR.
  • I've included any doc changes needed.
  • I've made sure the relevant tests are passing (if applicable).

cuauty changed the title from "Add the support of Bailing LLM" to "Add the support of Bailing LLM for #3487" on Oct 3, 2024

leywar commented Oct 4, 2024

Please take a look at this pull request. If you have any questions, please feel free to contact us. Thank you very much.

leywar left a comment


LGTM

cuauty changed the title from "Add the support of Bailing LLM for #3487" to "Support Bailing LLM from ALIPAY for #3487" on Oct 9, 2024
cuauty (Author) commented Oct 9, 2024

@infwinston

Could you help review this PR for the Bailing LLM from ALIPAY? If you have any comments, please let us know. Thank you.

cuauty (Author) commented Oct 11, 2024

@infwinston

I just changed the file "fastchat/serve/gradio_web_server.py" in this PR to use the new Context class in place of a List. With this change, the command 'python -m fastchat.serve.gradio_web_server --controller "" --share --register xxxx' works together with your code sync in PR #3546.
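For reviewers, a minimal sketch of the idea, assuming a simplified Context; the field names here are illustrative, and the actual Context class in fastchat/serve/gradio_web_server.py may differ.

```python
# Simplified sketch only; field names are hypothetical, not the real Context class.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Context:
    """Bundles the values that were previously passed around as a bare list."""
    models: List[str] = field(default_factory=list)
    all_models: List[str] = field(default_factory=list)

def build_demo(context: Context) -> None:
    # Handlers read named attributes instead of indexing into a list,
    # so adding a field does not silently break positional call sites.
    print(f"Serving {len(context.models)} of {len(context.all_models)} models")

# Placeholder model names, for illustration only.
build_demo(Context(models=["model-a"], all_models=["model-a", "model-b"]))
```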

Could you help review this PR? If you have any comments, please let me know.


fengmk2 commented Oct 17, 2024

LGTM

Define the conversation template for Bailing LLM and provide the iterator (a hedged sketch follows below), including the sampling parameters:
  • temperature
  • top_p
  • top_k
  • max_tokens
New code defines the Context class and uses it as the input parameter.
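A minimal sketch of the two pieces described above, assuming a line-delimited JSON streaming endpoint; the prompt layout, endpoint URL, and field names are illustrative, and the chunk shape ("text"/"error_code") is an assumption about what the web server consumes, not the code actually added in this PR.

```python
# Illustrative only: the prompt layout, endpoint URL, and JSON field names are
# assumptions, not the template/iterator actually added in this PR.
import json
from typing import Dict, Iterator, List
import requests

ROLES = ("HUMAN", "ASSISTANT")  # hypothetical role labels
SEP = "\n"

def build_prompt(messages: List[Dict[str, str]]) -> str:
    """Flatten chat messages into a single prompt string using the template."""
    role_map = {"user": ROLES[0], "assistant": ROLES[1]}
    lines = [f"{role_map.get(m['role'], m['role'])}: {m['content']}" for m in messages]
    return SEP.join(lines) + SEP + ROLES[1] + ":"

def bailing_stream_iter(
    messages: List[Dict[str, str]],
    temperature: float = 0.7,
    top_p: float = 0.9,
    top_k: int = 50,
    max_tokens: int = 1024,
) -> Iterator[Dict]:
    """Forward the sampling parameters and yield accumulated text chunks."""
    payload = {
        "prompt": build_prompt(messages),
        "temperature": temperature,
        "top_p": top_p,
        "top_k": top_k,
        "max_tokens": max_tokens,
        "stream": True,
    }
    # Hypothetical endpoint; in the real code the URL and auth come from config.
    resp = requests.post("https://example.com/v1/generate", json=payload,
                         stream=True, timeout=60)
    text = ""
    for line in resp.iter_lines(decode_unicode=True):
        if not line:
            continue
        chunk = json.loads(line)        # assumes one JSON object per line
        text += chunk.get("token", "")  # assumed field name for each token
        yield {"text": text, "error_code": 0}
```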