-
Hi Leon, note that there aren't second-order derivatives of expressions yet. But if/when MilesCranmer/SymbolicRegression.jl#254 is completed, second-order differentiation of expressions should become easier. Cheers!
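Until second-order derivatives of expressions are supported natively, one workaround is to evaluate the learned expression numerically and approximate its second derivative with a central finite difference. A minimal sketch (the function `f` below is a hypothetical stand-in for an SR output, not anything produced by the library):

```python
import numpy as np

# Hypothetical learned expression f(x); stands in for an SR output
def f(x):
    return np.sin(3 * x) + x**2

def second_derivative(f, x, h=1e-4):
    # Central finite difference: f''(x) ~ (f(x+h) - 2 f(x) + f(x-h)) / h^2
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

x = np.linspace(-1, 1, 5)
approx = second_derivative(f, x)
# For this f, the exact second derivative is -9 sin(3x) + 2
exact = -9 * np.sin(3 * x) + 2
```

The truncation error is O(h²), so for smooth expressions the approximation is accurate to several decimal places, which is usually enough for a consistency-check loss term.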
-
Hi,
I am a huge fan of symbolic regression and of your Python package. I have a question: would the following make symbolic regression better?
Say you have some data and train a neural network to learn the underlying function as a black-box model.
This gives us gradients of that function.
One could then compute the (zeroth, first, second, ...) derivatives and run SR on each of them.
Wouldn't this help validate the regression on the original data, since one artificially creates new target functions?
With multiple outputs from the SR algorithm, one could internally check whether the expressions are mathematically consistent and add another loss term.
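The proposed pipeline can be sketched end to end. This is a minimal illustration under assumptions, not PySR's actual workflow: the surrogate is a hand-rolled one-hidden-layer tanh network trained by plain gradient descent, and the PySR call at the end is hypothetical usage left as a comment (it requires a working PySR/Julia install):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data from a known function, y = sin(3x) + x^2
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(3 * X) + X**2

# One-hidden-layer tanh net as a differentiable black-box surrogate
W1 = rng.normal(size=(1, 32)); b1 = np.zeros(32)
W2 = rng.normal(size=(32, 1)) * 0.1; b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)           # hidden activations, (200, 32)
    pred = H @ W2 + b2                 # network output, (200, 1)
    err = pred - y
    # Backprop for the weights (mean-squared-error loss)
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H**2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Derivative of the surrogate w.r.t. the input, via the chain rule
# through tanh: d pred/dx = ((1 - H^2) * W1) @ W2
H = np.tanh(X @ W1 + b1)
dy_dx = ((1 - H**2) * W1) @ W2         # (200, 1): a new SR target
mse = float(np.mean((H @ W2 + b2 - y)**2))

# (X, y) and (X, dy_dx) are now two SR targets. One could fit each
# with PySR and check that d/dx of the first expression matches the
# second (hypothetical usage; needs a Julia install):
# from pysr import PySRRegressor
# PySRRegressor().fit(X, dy_dx)
```

The key point is that the surrogate turns a finite dataset into a family of derivative datasets, each of which constrains the same underlying expression; disagreement between the SR fits is then a usable loss signal.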
I would be happy to hear your thoughts on this.
Best,
Leon