
Cannot install on Google Colab #1933

Open
benjamin-marie opened this issue Sep 27, 2024 · 6 comments · May be fixed by #1989
Labels
bug Something isn't working

Comments

@benjamin-marie

Please check that this issue hasn't been reported before.

  • I searched previous Bug Reports and didn't find any similar reports.

Expected Behavior

axolotl should be installed.

Current behaviour

When I ran the installation command on Google Colab (L4 GPU):
!git clone https://github.com/axolotl-ai-cloud/axolotl.git
!cd axolotl && pip3 install packaging ninja && pip3 install -e '.[flash-attn,deepspeed]'

It fails with:

ERROR: Cannot install None and axolotl because these package versions have conflicting dependencies.

The conflict is caused by:
axolotl 0.4.1 depends on torch==2.4.1+cu121
accelerate 0.34.2 depends on torch>=1.10.0
bitsandbytes 0.44.0 depends on torch
liger-kernel 0.3.0 depends on torch>=2.1.2
optimum 1.16.2 depends on torch>=1.11
peft 0.13.0 depends on torch>=1.13.0
trl 0.9.6 depends on torch>=1.4.0
xformers 0.0.27 depends on torch==2.3.1

To fix this you could try to:

  1. loosen the range of package versions you've specified
  2. remove package versions to allow pip to attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
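
The resolver error above boils down to two irreconcilable pins on torch: axolotl requires exactly 2.4.1 while xformers 0.0.27 requires exactly 2.3.1, so no single version can satisfy both. A minimal sketch of that check using the `packaging` library (which ships alongside pip); the dictionary keys are labels only, and local build tags like `+cu121` are dropped for simplicity:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# The two exact pins plus one representative range from the error above.
constraints = {
    "axolotl 0.4.1": SpecifierSet("==2.4.1"),
    "xformers 0.0.27": SpecifierSet("==2.3.1"),
    "liger-kernel 0.3.0": SpecifierSet(">=2.1.2"),
}

def torch_version_works(version: str) -> bool:
    """Return True if a torch version satisfies every pinned constraint."""
    v = Version(version)
    return all(v in spec for spec in constraints.values())

# No version can satisfy ==2.4.1 and ==2.3.1 at the same time:
print(torch_version_works("2.4.1"))  # False (xformers wants 2.3.1)
print(torch_version_works("2.3.1"))  # False (axolotl wants 2.4.1)
```

This is the same intersection test pip's resolver performs before reporting ResolutionImpossible.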

Steps to reproduce

Go to Google Colab.
Select a GPU instance, e.g., L4.
Run the installation commands:
!git clone https://github.com/axolotl-ai-cloud/axolotl.git
!cd axolotl && pip3 install packaging ninja && pip3 install -e '.[flash-attn,deepspeed]'

Config yaml

No response

Possible solution

Update the requirements.

Which Operating Systems are you using?

  • Linux
  • macOS
  • Windows

Python Version

3.10

axolotl branch-commit

main

Acknowledgements

  • My issue title is concise, descriptive, and in title casing.
  • I have searched the existing issues to make sure this bug has not been reported yet.
  • I am using the latest version of axolotl.
  • I have provided enough information for the maintainers to reproduce and diagnose the issue.
@benjamin-marie added the bug ("Something isn't working") label on Sep 27, 2024
@nicokim

nicokim commented Sep 27, 2024

Same thing happened to me on RunPod.
I updated the versions of transformers and xformers in requirements.txt:

xformers==0.0.28
transformers==4.45.1

and it worked somehow haha! Can you try it?
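
nicokim's fix is a two-line edit to axolotl's requirements.txt. Here is a demonstration of applying it mechanically with `sed`; note the requirements.txt created here is a two-line stand-in for illustration, not axolotl's real file:

```shell
# Demonstration only: apply the pin changes with sed on a stand-in file.
printf 'xformers==0.0.27\ntransformers==4.44.2\n' > requirements.txt

sed -i 's/^xformers==.*/xformers==0.0.28/' requirements.txt
sed -i 's/^transformers==.*/transformers==4.45.1/' requirements.txt

cat requirements.txt
# xformers==0.0.28
# transformers==4.45.1
```

After editing the real file inside the cloned repo, re-run `pip3 install -e '.[flash-attn,deepspeed]'`.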

@benjamin-marie
Author

Yes, it works.
The requirements must be updated.
I expect new installations of Axolotl to be impossible until it's fixed.

@Hasan-Demez

I also ran across the same issue in VS Code, and the same solution worked for me as well. Thank you.

@NanoCode012
Collaborator

NanoCode012 commented Oct 15, 2024

Hey, sorry for this. We're trying to make this clearer. Colab's preinstalled version of torch may be incompatible with the specific version required by xformers. Previously, you had to install torch==2.3.1, and now 2.4.1, for xformers to be happy. Colab currently ships 2.4.1, so this issue should no longer occur.

I plan to remove the dependency on xformers, as it's creating this version-locking issue.

@Amit30swgoh

Python 3.10.12

pip install torch==2.4.1+cu124 torchvision==0.19.1+cu124 torchaudio==2.4.1+cu124 torchtext==0.18.0 torchdata==0.8.0 --extra-index-url https://download.pytorch.org/whl/cu124
pip install xformers==0.0.28.post1

pip install torch==2.4.0+cu124 torchvision==0.19.0+cu124 torchaudio==2.4.0+cu124 torchtext==0.18.0 torchdata==0.8.0 --extra-index-url https://download.pytorch.org/whl/cu124
pip install xformers==0.0.27.post2

pip install torch==2.3.1+cu121 torchvision==0.18.1+cu121 torchaudio==2.3.1+cu121 torchtext==0.18.0 torchdata==0.8.0 --extra-index-url https://download.pytorch.org/whl/cu121
pip install xformers==0.0.27

Enjoy !

@NanoCode012 NanoCode012 linked a pull request Oct 22, 2024 that will close this issue
@NanoCode012
Collaborator

I updated the Colab example in the linked PR. It uses the instructions from the README, which install fine. I'd recommend anyone having this issue check it out. Sorry for any confusion.
