
Why not updating batch norm parameters? #50

Open
emrahyigit opened this issue Aug 4, 2020 · 1 comment

Comments

@emrahyigit

According to the following line, your code doesn't update the batch norm parameters. What is the real reason for that?

params_without_bn = [params for name, params in model.named_parameters() if not ('_bn' in name or '.bn' in name)]

@JiyueWang

JiyueWang commented Dec 1, 2020

This line only excludes the BN parameters from weight decay; it does not remove them from backpropagation, so they are still updated. That said, I tried the regular way (weight decay on all parameters) and got similar results on CIFAR-100. I think the link below explains part of it. Looking forward to the official explanation.
https://blog.janestreet.com/l2-regularization-and-batch-norm/
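For reference, a minimal sketch of the usual PyTorch pattern this line implements: put parameters into two optimizer groups so BN parameters keep being updated by the optimizer but receive no L2 penalty. The model and hyperparameters here are placeholders, not the repository's actual code.

```python
import torch
import torch.nn as nn

# Placeholder model, just to have some BN and non-BN parameters.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.BatchNorm2d(16),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(16 * 32 * 32, 100),
)

# Split parameters by module type (more robust than matching '_bn'/'.bn' in names).
decay, no_decay = [], []
for module in model.modules():
    for _, param in module.named_parameters(recurse=False):
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            no_decay.append(param)   # BN weight/bias: still trained, but no weight decay
        else:
            decay.append(param)      # everything else: regular weight decay

optimizer = torch.optim.SGD(
    [
        {"params": decay, "weight_decay": 5e-4},
        {"params": no_decay, "weight_decay": 0.0},
    ],
    lr=0.1,
    momentum=0.9,
)
```

Both groups go through the same backward pass; only the `weight_decay` applied to them differs.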
