
Strange behaviour when using custom loss function #342

Answered by MilesCranmer
TadeuNP asked this question in Q&A

A few quick comments:

  1. A simpler way to define the concat array in your code would be:
concat = fill(5f-4, 3, 1)

(which creates a (3, 1) array filled with the value $5 \times 10^{-4}$)

  2. The second output of eval_tree_array is important, as it tells you whether the evaluation encountered a NaN. So I would do:
p, completed = eval_tree_array(tree, concat, options)

penalty = (!completed || p[1] >= 50) ? 1.5 : 1.0

You should also check the completed flag and return a large penalty if it is false.

  3. These flags should trigger a constant large loss value, rather than a multiplicative penalty on the predictive loss, because if there is a NaN, your prediction loss itself might be NaN, which could be bad. So … (a minimal sketch combining these points follows below).
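
Putting these points together, here is a minimal sketch of such a custom loss, assuming the full-objective signature loss(tree, dataset, options) and the dataset.X / dataset.y / dataset.n fields from SymbolicRegression.jl; the name custom_loss, the penalty magnitude, and the threshold of 50 are placeholder assumptions, not the original poster's code:

```julia
using SymbolicRegression: eval_tree_array

# Hypothetical sketch: mean-squared-error loss plus a constraint on the
# prediction at a fixed input, returning a constant large value whenever
# an evaluation fails or the constraint is violated.
function custom_loss(tree, dataset, options)
    T = eltype(dataset.X)
    big_penalty = T(1e6)  # constant large value, not a multiplier

    prediction, completed = eval_tree_array(tree, dataset.X, options)
    !completed && return big_penalty  # evaluation produced NaN/Inf

    base_loss = sum(abs2, prediction .- dataset.y) / dataset.n

    # Constraint input from the discussion: a (3, 1) array of 5e-4
    concat = fill(T(5e-4), 3, 1)
    p, ok = eval_tree_array(tree, concat, options)
    (!ok || p[1] >= 50) && return big_penalty

    return base_loss
end
```

Returning a fixed constant on failure avoids the NaN-propagation issue described above, since the base loss is never touched when the evaluation is invalid.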

Replies: 1 comment 1 reply

Answer selected by TadeuNP
Category: Q&A
Labels: PySR, SymbolicRegression.jl
2 participants