If you choose an LLM like `openrouter/qwen/qwen-2.5-72b-instruct`, it fails with:

```
  File "/Users/enyst/repos/odie/openhands/llm/llm.py", line 87, in __init__
    self.model_info = litellm.get_model_info(self.config.model)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enyst/repos/odie/.venv/lib/python3.12/site-packages/litellm/utils.py", line 4983, in get_model_info
    raise Exception(
ERROR:root:<class 'Exception'>: This model isn't mapped yet. model=openrouter/qwen/qwen-2.5-72b-instruct, custom_llm_provider=openrouter. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json.
```

Normally this shouldn't crash; it should just proceed without `model_info`.
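A minimal sketch of the fallback being suggested: wrap the lookup so an unmapped model yields `None` instead of raising. The `lookup` parameter stands in for `litellm.get_model_info` (the stub below is hypothetical, only mimicking litellm's "isn't mapped yet" exception); this is not the actual OpenHands fix.

```python
def model_info_or_none(lookup, model):
    # Wrap a model-info lookup (e.g. litellm.get_model_info) so that an
    # unmapped model produces None instead of crashing __init__.
    try:
        return lookup(model)
    except Exception:
        return None

# Hypothetical stub standing in for litellm.get_model_info, which raises
# for models missing from its price/context-window map.
def fake_lookup(model):
    known = {"gpt-4": {"max_tokens": 8192}}
    if model not in known:
        raise Exception(f"This model isn't mapped yet. model={model}")
    return known[model]

print(model_info_or_none(fake_lookup, "gpt-4"))
print(model_info_or_none(fake_lookup, "openrouter/qwen/qwen-2.5-72b-instruct"))
```

Callers downstream would then need to tolerate `self.model_info` being `None`, which matches the "just not have model_info going forward" behavior described above.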
OpenHands Installation
Development workflow
OpenHands Version
No response
Operating System
None
Logs, Errors, Screenshots, and Additional Context
No response