
GPU memory usage during model training/testing #1

Open
1214635079 opened this issue Apr 21, 2022 · 1 comment

@1214635079

Hi, thank you very much for releasing the code. When training and testing the model, I found that the GPU memory usage is very high: with crop_size set to 128, the largest batch_size I can fit on a 3090 (24 GB) is 2. Under these settings, testing also runs out of memory, and even reducing the number of feature channels (nl_feas) does not resolve the huge memory footprint. I noticed the experiments in the paper were run on a 1080Ti, so I would like to ask whether there is something wrong with my settings, and how much GPU memory training and testing typically take for you. Thanks!
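For reference, a minimal sketch of how peak GPU memory per step can be measured to compare different crop_size / nl_feas settings, assuming a standard PyTorch training loop (the `model`, `optimizer`, and `batch` names below are hypothetical placeholders, and the model is assumed to return a scalar loss); running the test pass under `torch.no_grad()` also avoids storing activations for backprop:

```python
import torch

def train_step_report_memory(model, batch, optimizer, device="cuda"):
    # Reset the peak-memory counter so the report covers only this step.
    torch.cuda.reset_peak_memory_stats(device)

    inputs, targets = batch
    inputs, targets = inputs.to(device), targets.to(device)

    optimizer.zero_grad()
    loss = model(inputs, targets)  # assumption: the model returns a scalar loss
    loss.backward()
    optimizer.step()

    peak_gb = torch.cuda.max_memory_allocated(device) / 1024 ** 3
    print(f"train step peak GPU memory: {peak_gb:.2f} GB")
    return loss.item()

@torch.no_grad()  # at test time, disabling autograd avoids keeping activations
def test_step_report_memory(model, inputs, device="cuda"):
    torch.cuda.reset_peak_memory_stats(device)
    outputs = model(inputs.to(device))
    peak_gb = torch.cuda.max_memory_allocated(device) / 1024 ** 3
    print(f"test step peak GPU memory: {peak_gb:.2f} GB")
    return outputs
```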


Zlin0530 commented Dec 5, 2023

I also encountered the same problem, can you solve it?
