Add new model (upstage/solar-pro-preview-instruct) #3040

Open

remy-rec opened this issue Oct 7, 2024 · 1 comment

@remy-rec

remy-rec commented Oct 7, 2024

We have released a new model and would like to add it to the HELM leaderboard. We have confirmed that it works correctly on a local leaderboard run.
If a merge is possible, could you let us know when you expect it to be completed?
You can check the model's performance at https://www.upstage.ai/products/solar-pro-preview.
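
For reference, the local check can be reproduced with a standard helm-run invocation once the configuration entries below are on HELM's local config path (e.g. a prod_env/ directory). The sketch below wraps the command in Python; the run entry, suite name, and instance count are placeholders, and the flag names follow the current HELM quickstart, so they may need adjusting for your installed crfm-helm version.

import subprocess

# Minimal sketch of a local HELM run against the new model.
# Assumptions: the YAML entries below are already on HELM's local config path,
# and the helm-run flags match the current quickstart (--run-entries, --suite,
# --max-eval-instances); adjust for your crfm-helm version.
subprocess.run(
    [
        "helm-run",
        "--run-entries", "mmlu:subject=anatomy,model=upstage/solar-pro-preview-instruct",
        "--suite", "solar-pro-preview-local",
        "--max-eval-instances", "10",
    ],
    check=True,
)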

model_deployments.yaml

  - name: upstage/solar-pro-preview-instruct
    model_name: upstage/solar-pro-preview-instruct
    tokenizer_name: upstage/solar-pro-preview-instruct
    max_sequence_length: 4096
    client_spec:
      class_name: "helm.clients.huggingface_client.HuggingFaceClient"
      args:
        torch_dtype: auto
        trust_remote_code: true
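
For clarity, the client_spec args above correspond roughly to a plain transformers load like the sketch below (the HuggingFaceClient handles this internally; this is only an approximation to show what torch_dtype: auto and trust_remote_code: true request):

# Rough, hand-written equivalent of what the client args above request.
# torch_dtype="auto" defers to the dtype stored in the checkpoint, and
# trust_remote_code=True is needed because the model ships custom modeling code.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "upstage/solar-pro-preview-instruct",
    torch_dtype="auto",
    trust_remote_code=True,
)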

model_metadata.yaml

  - name: upstage/solar-pro-preview-instruct
    display_name: solar-pro-preview-instruct (22B)
    description: solar-pro-preview-instruct (22B) is the most intelligent LLM on a single GPU ([blog](https://www.upstage.ai/products/solar-pro-preview))
    creator_organization_name: Upstage
    access: open
    num_parameters: 22000000000
    release_date: 2024-09-11
    tags: [TEXT_MODEL_TAG, LIMITED_FUNCTIONALITY_TEXT_MODEL_TAG]

tokenizer_configs.yaml

  # Upstage
  - name: upstage/solar-pro-preview-instruct
    tokenizer_spec:
      class_name: "helm.tokenizers.huggingface_tokenizer.HuggingFaceTokenizer"
      args:
        trust_remote_code: true
    end_of_text_token: "<|im_end|>"
    prefix_token: "<|startoftext|>"
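
As a quick sanity check on the special tokens above, the tokenizer can be loaded directly with transformers; the expected values in the comments simply mirror the config entries and are not independently verified here:

# Load the tokenizer and compare its special tokens against the config above.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "upstage/solar-pro-preview-instruct", trust_remote_code=True
)
print(tokenizer.bos_token)  # expected per prefix_token above: <|startoftext|>
print(tokenizer.eos_token)  # expected per end_of_text_token above: <|im_end|>
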
@yifanmai
Collaborator

Thanks for the suggestion! I'll discuss with the team internally and figure out what we want to do here.
