
[Linux, Recommended install] play-rocm.sh immediately fatally crashes with huggingface_hub import error #445

Open
ProfessorDey opened this issue Jul 7, 2024 · 3 comments

Comments

@ProfessorDey

So, following the recommended install method of cloning the client repository and running ./play-rocm.sh, the install goes smoothly, but as soon as it starts to load Kobold itself it immediately crashes with the error below, which I can't find any reason for. huggingface_hub is installed, but looking online I can only find 'split_torch_state_dict_into_shards' on their GitHub under src/serialization/_torch.py, which might mean there is a missing submodule in the ROCm requirements. This is unfortunate because it means I can't use Kobold at all, as it won't even start.

```
Traceback (most recent call last):
  File "aiserver.py", line 58, in <module>
    from utils import debounce
  File "/home/dey/ai/koboldai-client/utils.py", line 12, in <module>
    from transformers import PreTrainedModel
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/home/dey/ai/koboldai-client/runtime/envs/koboldai-rocm/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1066, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/dey/ai/koboldai-client/runtime/envs/koboldai-rocm/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1078, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/home/dey/ai/koboldai-client/runtime/envs/koboldai-rocm/lib/python3.8/site-packages/huggingface_hub/__init__.py)
```
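For context: this failure pattern usually means the environment's pinned transformers build expects a newer huggingface_hub than the one installed, since `split_torch_state_dict_into_shards` only exists in more recent huggingface_hub releases (the exact 0.23.0 cutoff below is an assumption based on upstream release notes, so verify it against your environment). A minimal sketch of a pre-flight check, where `parse_version` and `hub_has_shard_helper` are hypothetical helpers, not part of either library:

```python
# Hypothetical pre-flight check: would the installed huggingface_hub version
# expose split_torch_state_dict_into_shards? The "0.23.0" cutoff is an
# assumption based on upstream release notes, not confirmed in this thread.

def parse_version(version: str) -> tuple:
    """Naive parse: '0.23.1' -> (0, 23, 1). Stops at pre-release suffixes."""
    parts = []
    for piece in version.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        if not digits:
            break
        parts.append(int(digits))
    return tuple(parts)

def hub_has_shard_helper(installed: str, required: str = "0.23.0") -> bool:
    """True if an installed huggingface_hub should provide the helper."""
    return parse_version(installed) >= parse_version(required)

print(hub_has_shard_helper("0.20.3"))  # older env pin: False
print(hub_has_shard_helper("0.23.2"))  # recent enough: True
```

If the check fails, upgrading huggingface_hub inside the koboldai-rocm environment is one possible workaround, though it may conflict with the runtime's other pinned packages.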

@henk717
Collaborator

henk717 commented Jul 7, 2024

This branch is not supported by the online installers; you can check the one in my branch to see if that works. You can also check https://koboldai.org/cpp for our more modern sister project that has wider GPU support.

@ProfessorDey
Author

Fair enough, I was just following the instructions in the Readme:
Installing KoboldAI on Linux using the KoboldAI Runtime (Easiest)
Clone the URL of this Github repository (For example git clone https://github.com/koboldai/koboldai-client )
AMD user? Make sure ROCm is installed if you want GPU support. Is yours not compatible with ROCm? Follow the usual instructions.
Run play-rocm.sh if you use an AMD GPU supported by ROCm

I didn't realise it had been forked and continued in a separate project; perhaps it would be worth making a note of that in the readme itself? Thank you for the redirection though, I'll give that a go.

@henk717
Collaborator

henk717 commented Jul 7, 2024

The idea is that mine gets upstreamed again, but the update has been such a big undertaking that it's been taking a while, especially since the AI space constantly changes and forces us to keep changing our backend to keep things functional. The newer developers all want to work on Koboldcpp instead, since it's currently the better of the two.
