Cannot install on Google Colab #1933
Comments
Same thing happened to me on RunPod. I pinned xformers==0.0.28 and it worked somehow, haha! Can you try it?

Yes, it works.

I also ran into the same issue in VS Code, and the same solution worked for me as well. Thank you.
Hey, sorry for this. We're trying to make things clearer. Colab's preinstalled version of torch may be incompatible with the specific version required by xformers. I plan to remove the dependency on xformers, as it's creating this version-locking issue.
Python 3.10.12:

```
pip install torch==2.4.1+cu124 torchvision==0.19.1+cu124 torchaudio==2.4.1+cu124 torchtext==0.18.0 torchdata==0.8.0 --extra-index-url https://download.pytorch.org/whl/cu124
```

Enjoy!
I updated the Colab example in the linked PR. It uses the instructions from the README, which install fine. I'd recommend anyone hitting this issue check it out. Sorry for any confusion.
Please check that this issue hasn't been reported before.
Expected Behavior
axolotl should be installed.
Current behaviour
When I ran the installation commands on Google Colab (L4 GPU):

```
!git clone https://github.com/axolotl-ai-cloud/axolotl.git
!cd axolotl && pip3 install packaging ninja && pip3 install -e '.[flash-attn,deepspeed]'
```

they yielded:
```
ERROR: Cannot install None and axolotl because these package versions have conflicting dependencies.

The conflict is caused by:
    axolotl 0.4.1 depends on torch==2.4.1+cu121
    accelerate 0.34.2 depends on torch>=1.10.0
    bitsandbytes 0.44.0 depends on torch
    liger-kernel 0.3.0 depends on torch>=2.1.2
    optimum 1.16.2 depends on torch>=1.11
    peft 0.13.0 depends on torch>=1.13.0
    trl 0.9.6 depends on torch>=1.4.0
    xformers 0.0.27 depends on torch==2.3.1

To fix this you could try to:

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
```
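The impossibility can be verified in miniature with the `packaging` library (the same package the install command pulls in): no single torch version can satisfy both axolotl's `==2.4.1+cu121` pin and xformers 0.0.27's `==2.3.1` pin at once. A minimal sketch, using only the specifiers copied from the error output above:

```python
# Sketch: reproduce pip's resolver conflict with the `packaging` library.
# The version specifiers below are copied from the error output above.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

pins = {
    "axolotl 0.4.1": SpecifierSet("==2.4.1+cu121"),
    "xformers 0.0.27": SpecifierSet("==2.3.1"),
}

def satisfies_all(version: str) -> bool:
    """True only if a single torch version meets every pin at once."""
    v = Version(version)
    return all(spec.contains(v) for spec in pins.values())

# Each candidate satisfies one pin but never both -> ResolutionImpossible.
for candidate in ("2.4.1+cu121", "2.3.1"):
    print(candidate, satisfies_all(candidate))
```

Since the two `==` pins name different torch versions, `satisfies_all` is false for every candidate, which is exactly the condition pip reports as `ResolutionImpossible`.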
Steps to reproduce
Go on Google Colab.
Select a GPU instance, e.g. L4
And run the installation commands
!git clone https://github.com/axolotl-ai-cloud/axolotl.git
!cd axolotl && pip3 install packaging ninja && pip3 install -e '.[flash-attn,deepspeed]'
Config yaml
No response
Possible solution
Update the requirements.
Which Operating Systems are you using?
Python Version
3.10
axolotl branch-commit
main
Acknowledgements