
How much VRAM is needed to run, why can't I use Google Colab to run it? It doesn't report any errors, interrupts directly, and there are no issues with video memory or memory overflow #7

Open
libai-lab opened this issue Oct 24, 2024 · 3 comments

Comments

@libai-lab

How much VRAM is needed to run, why can't I use Google Colab to run it? It doesn't report any errors, interrupts directly, and there are no issues with video memory or memory overflow

@HpWang-whu
Contributor

HpWang-whu commented Oct 24, 2024

VistaDream needs about 22 GB of VRAM to run. In addition, it preloads multiple models, which demands extra memory. For more details, please refer to the discussion in section A.1.2 of our paper. I will try to release a Colab demo as soon as possible.
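Given the 22 GB figure above, a quick sanity check before launching is to ask PyTorch how much GPU memory is actually free. This is a sketch of my own (not from the thread); the helper name and the 22 GB default are just taken from the comment above:

```python
def enough_vram(required_gib: float = 22.0) -> bool:
    """Return True if the current CUDA device reports at least
    `required_gib` GiB of free memory; False if there is no GPU
    (or PyTorch is not installed)."""
    try:
        import torch
    except ImportError:
        return False
    if not torch.cuda.is_available():
        return False
    # mem_get_info() returns (free_bytes, total_bytes) for the current device
    free_bytes, _total_bytes = torch.cuda.mem_get_info()
    return free_bytes / (1024 ** 3) >= required_gib
```

On a Colab T4 (16 GB) this returns False for the default 22 GiB threshold, which matches the failures reported below.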

@hashnimo

hashnimo commented Oct 24, 2024

It doesn't report any errors, interrupts directly, and there are no issues with video memory or memory overflow

Same here:

Loading checkpoint shards: 100% 4/4 [00:08<00:00,  2.20s/it]
^C

I think I've found the cause: the Colab system seems to mistakenly detect the Fooocus files in this repo as a web UI, causing it to interrupt the session without printing any message, which is confusing.

I can also confirm that Colab's Nvidia T4 with 16 GB of VRAM is still not enough to run this even when it doesn't get interrupted (it requires about 17 GB). Therefore, some memory optimization will need to be added to the code.

Some suggestions (even though I don't know much about coding):

  • Find a smaller model than juggernautXL_v8Rundiffusion.safetensors
  • Try CPU offloading like pipe.enable_model_cpu_offload()
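The second suggestion refers to diffusers' model-level CPU offload. A minimal sketch of how it might look, assuming the SDXL checkpoint were loaded directly through diffusers (the function name is illustrative, and VistaDream actually loads the model through its bundled Fooocus code, so this is only an illustration of the idea):

```python
def load_sdxl_low_vram(checkpoint_path: str):
    """Load an SDXL .safetensors checkpoint with model-level CPU offload.

    Illustrative sketch only: VistaDream's real loading path goes through
    its Fooocus code, not directly through diffusers.
    """
    import torch
    from diffusers import StableDiffusionXLPipeline

    pipe = StableDiffusionXLPipeline.from_single_file(
        checkpoint_path,            # e.g. juggernautXL_v8Rundiffusion.safetensors
        torch_dtype=torch.float16,  # half precision roughly halves VRAM vs fp32
    )
    # Keep submodules (UNet, VAE, text encoders) on the CPU and move each
    # one to the GPU only while it is running, trading speed for peak VRAM.
    pipe.enable_model_cpu_offload()
    return pipe
```

With offloading enabled, peak GPU usage is bounded by the largest single submodule rather than the whole pipeline, which is why it can bring a ~17 GB workload within reach of a 16 GB T4 at the cost of slower inference.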

@HpWang-whu
Contributor

@hashnimo Thanks for your suggestion! I will have a try!
