
Aux loss heavily influences the training phase. #17

Open
YLongJin opened this issue Apr 11, 2022 · 2 comments

Comments
@YLongJin

Great work.
When I train on my own dataset, the IoU is satisfactory most of the time, but sometimes the aux loss becomes very large. I debugged the code and found that the SGN input x used to compute gama becomes abnormal. I would like to know why this happens, and whether you could give me some advice.
Thanks very much.

@samleoqh
Owner

It sounds like the diagonal entries of A (the adjacency matrix) are very small in your case, so their mean value is close to zero and gama becomes very large. A workaround is to clamp gama to a maximum value, e.g. by adding the following after line #42 in scg_gcn.py:

gama = torch.clamp(gama, min=1., max=5.0)
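To illustrate the failure mode described above, here is a minimal dependency-free sketch (not the repo's actual code; `gamma_from_diag` is a hypothetical stand-in for however scg_gcn.py derives gama from the diagonal of A). It assumes gama grows inversely with the mean diagonal value, so a near-zero diagonal blows it up, and shows how clamping bounds the result:

```python
def gamma_from_diag(diag_mean, eps=1e-8):
    # Hypothetical relationship: gamma ~ 1 / mean(diag(A)).
    # When the diagonal entries of A are tiny, this explodes.
    return 1.0 / (diag_mean + eps)

def clamp(x, lo=1.0, hi=5.0):
    # Pure-Python equivalent of torch.clamp(x, min=lo, max=hi).
    return max(lo, min(hi, x))

healthy = gamma_from_diag(0.8)    # moderate diagonal -> moderate gamma
degenerate = gamma_from_diag(1e-6)  # near-zero diagonal -> huge gamma

print(clamp(healthy))     # left in the [1, 5] range
print(clamp(degenerate))  # capped at 5.0 instead of exploding
```

The clamp does not remove the underlying cause (abnormal input to the SGN); it only keeps the loss scale bounded so training can proceed.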

@YLongJin
Author

Got it, and the problem is solved. Since the input of the SGN is just numerical, how can I compute appropriate values for min=1 and max=5?
