
VRAM requirements? #6

Open
DenisKochetov opened this issue May 13, 2024 · 2 comments

DenisKochetov (Contributor) commented May 13, 2024

[image attachment]
Can't fit into 24 GB of VRAM. This is SD 1.5; why is it so big?

Zzlongjuanfeng (Contributor)

Cmd190 commented May 31, 2024

Same problem here. Installing xformers and enabling it leads to problems with kaolin, because the torch version gets updated to 2.3, and I can't find an xformers version that supports PyTorch 1.12.

But setting the environment variable PYTORCH_CUDA_ALLOC_CONF to max_split_size_mb:512 (https://stackoverflow.com/questions/73747731/runtimeerror-cuda-out-of-memory-how-can-i-set-max-split-size-mb) did the job for me. I am now able to run both demo objects on a 24 GB GPU.
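For anyone finding this later, here is a minimal sketch of the same fix applied from Python instead of the shell, assuming it runs before the first CUDA allocation (the shell equivalent is `export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512` before launching the demo):

```python
import os

# The caching allocator reads PYTORCH_CUDA_ALLOC_CONF when CUDA is
# initialized, so this must run before torch's first CUDA allocation.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:512"

import torch

# max_split_size_mb:512 keeps the allocator from splitting cached
# blocks larger than 512 MB, which reduces fragmentation when large
# tensors are allocated and freed repeatedly. This trades a bit of
# allocation speed for a lower peak-memory footprint.
assert torch.cuda.is_available()  # sketch assumes a CUDA-capable GPU
```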
