
Add NeuralTangentKernel Loss #506

Open
Parvfect wants to merge 1 commit into master

Conversation


Parvfect commented Apr 5, 2022

Adapted from the paper https://arxiv.org/pdf/2007.14527.pdf
Addresses Issue #501

Issues

  1. Still not completely sure whether the values used in the struct should be calculated inside the struct or somewhere else
  2. Relies on ForwardDiff.compute_jacobian
  3. Needs tests and a dimension check
  4. Not completely clear on how the reweighing takes place (a rough sketch of the paper's scheme follows below)
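
For reference, here is a minimal sketch of the reweighing scheme from Algorithm 1 of the paper, where the loss weights are ratios of NTK traces, λ_i = tr(K) / tr(K_ii). Everything below is illustrative: `model`, `ntk_weights`, and the sample points are placeholders rather than this PR's struct or NeuralPDE's API, and it calls `ForwardDiff.jacobian` (the standard ForwardDiff entry point) rather than a `compute_jacobian` helper.

```julia
using ForwardDiff, LinearAlgebra

# Toy scalar network u(x; θ) with a fixed 1-8-1 architecture (placeholder only).
function model(θ, x)
    W1 = reshape(θ[1:8], 8, 1); b1 = θ[9:16]
    W2 = reshape(θ[17:24], 1, 8); b2 = θ[25]
    (W2 * tanh.(W1 * [x] .+ b1))[1] + b2
end

# Each row of the Jacobian is ∂u(x_i)/∂θ, so J*J' is the empirical NTK block
# for that set of points.
jacobian_at(θ, xs) = ForwardDiff.jacobian(p -> [model(p, x) for x in xs], θ)

# Weights from the paper: λ_i = tr(K) / tr(K_ii), with K block-diagonal over
# boundary and interior points. For simplicity this sketch differentiates u at
# the interior points too; in the actual loss the PDE residual would be
# differentiated there instead.
function ntk_weights(θ, bc_points, pde_points)
    K_uu = let J = jacobian_at(θ, bc_points); J * J' end
    K_rr = let J = jacobian_at(θ, pde_points); J * J' end
    tr_K = tr(K_uu) + tr(K_rr)
    (λ_bc = tr_K / tr(K_uu), λ_pde = tr_K / tr(K_rr))
end

θ = randn(25)
λ = ntk_weights(θ, range(0, 1; length=4), range(0, 1; length=16))
# total_loss = λ.λ_bc * bc_loss + λ.λ_pde * pde_loss
```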

ChrisRackauckas requested a review from zoemcc on April 5, 2022, 12:18
@ChrisRackauckas
Member

Relies on ForwardDiff.compute_jacobian

That's fine for now. The literature doesn't seem to say how to optimize this more, even though it's quite obvious it can be improved. Seems like a good paper idea, but for now just do it by the book.
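
One plausible shortcut here, for whatever it's worth: the weights only need the trace of each NTK block, and tr(J*J') is just the squared Frobenius norm of the Jacobian, so the full N-by-N kernel never has to be formed. A quick check of that identity:

```julia
using LinearAlgebra

J = randn(16, 25)                  # stand-in Jacobian: 16 points × 25 parameters
@assert tr(J * J') ≈ sum(abs2, J)  # NTK block trace without materialising J*J'
```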

Not completely clear on how reweighing takes place

Feel free to ask questions on the Slack! Zoe should be around?

Tests and dimension check

This is essential. See the other adaptive reweighing tests.
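
As a rough idea of what such a test could assert (this reuses the illustrative `ntk_weights` sketch from the PR description above, not the package's actual API):

```julia
using Test

θ = randn(25)
λ = ntk_weights(θ, range(0, 1; length=4), range(0, 1; length=16))

@testset "NTK adaptive weights (sketch)" begin
    @test isfinite(λ.λ_bc) && isfinite(λ.λ_pde)
    @test λ.λ_bc ≥ 1 && λ.λ_pde ≥ 1    # tr(K)/tr(K_ii) ≥ 1 when the traces are positive
    @test 1/λ.λ_bc + 1/λ.λ_pde ≈ 1     # the block traces partition tr(K)
end
```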

@Parvfect
Author

Parvfect commented Apr 5, 2022

Haven't been able to get help from Zoe; I'll try Slack again later.

I'll go through the tests that are written already for other adaptive losses. Is it worth reproducing results from the paper?

@ChrisRackauckas
Member

I'll go through the tests that are written already for other adaptive losses. Is it worth reproducing results from the paper?

It would be good to try. While it might be too expensive to be a unit test, it would make a good tutorial.

@zoemcc
Contributor

zoemcc commented Apr 29, 2022

I'm working on a project deadline for this Tuesday. I'll go through and add more detailed comments after that.

@ChrisRackauckas
Member

This should get rebased due to changes in #553. That would hopefully make it much cleaner too.
