The implementation of ReLU in YuGCN #9
Most of the tutorials I found use AlexNet as the pretrained model, so I would consider refactoring the code to be closer to that:
Sorry, I am not sure I follow. Instead of having an independent hook function, you would like to integrate it directly as a custom ReLU layer?
I am actually not sure I am describing the problem correctly. I am not integrating the hook function myself. All the PyTorch-based code I can find looks for a ReLU module in the model and applies the hook function to it. With the current implementation of YuGCN, it cannot find the ReLU module because it is simply not there.
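To illustrate the problem: a minimal sketch (these tiny models are hypothetical stand-ins, not YuGCN itself) of why hook-based tools cannot find ReLU when it is applied as `F.relu` in `forward()` rather than registered as an `nn.ReLU` submodule:

```python
import torch.nn as nn
import torch.nn.functional as F

class FunctionalNet(nn.Module):
    """ReLU applied as a function inside forward(); it never
    appears in model.modules(), so hooks cannot be attached to it."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 4, 3)

    def forward(self, x):
        return F.relu(self.conv(x))

class ModuleNet(nn.Module):
    """ReLU registered as a submodule; tools iterating over
    model.modules() can find it and register hooks on it."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 4, 3)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.conv(x))

def count_relu_modules(model):
    # Count how many ReLU submodules a hook-based tool would discover
    return sum(isinstance(m, nn.ReLU) for m in model.modules())

print(count_relu_modules(FunctionalNet()))
print(count_relu_modules(ModuleNet()))
```

The two models compute the same thing, but only `ModuleNet` exposes a ReLU that guided back-propagation code can detect.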
I found this tutorial; I am going to try it out.
It is hard to find online documentation on this type of visualization method; I always hit content about digital marketing...
In the current implementation, ReLU is called as a function after each convolution layer.
The guided back-propagation tutorials I can find online apply the hook function when they detect ReLU implemented as a module.
I am not sure what the right way to modify YuGCN would be to make this process easier. cc @ltetrel
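For reference, here is a minimal sketch of what those tutorials do once ReLU modules are discoverable. It assumes a recent PyTorch where `register_full_backward_hook` is available; the tiny `nn.Sequential` model is a hypothetical stand-in, not YuGCN:

```python
import torch
import torch.nn as nn

def guided_relu_hook(module, grad_input, grad_output):
    # Guided back-propagation: keep only positive gradients flowing
    # back through ReLU. grad_input is already masked by the forward
    # activation (input > 0); clamping it additionally zeroes entries
    # where the incoming gradient was negative.
    return (torch.clamp(grad_input[0], min=0.0),)

# Stand-in model with ReLU registered as a module
model = nn.Sequential(nn.Linear(3, 3), nn.ReLU())

# Attach the hook to every ReLU module found in the model
handles = [
    m.register_full_backward_hook(guided_relu_hook)
    for m in model.modules()
    if isinstance(m, nn.ReLU)
]

x = torch.randn(1, 3, requires_grad=True)
model(x).sum().backward()
guided_grad = x.grad  # guided gradients w.r.t. the input

# Clean up the hooks when done
for h in handles:
    h.remove()
```

The loop over `model.modules()` is exactly the step that fails today: with `F.relu` called inside `forward()`, the `isinstance` filter matches nothing and no hook is ever registered.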