Reduce the running-time with custom objective #369
Unanswered
unary-code asked this question in Q&A
Replies: 1 comment
Since you are doing a lot of editing of Julia code, it might even make sense to work directly in Julia, at least until you speed things up and can move back to Python. As it happens, I just released an improved API for the backend which makes it much easier to use from Julia via the sklearn equivalent, MLJ. More info here: https://github.com/MilesCranmer/SymbolicRegression.jl/#mlj-interface

Example:

```julia
import SymbolicRegression: SRRegressor
import MLJ: machine, fit!, predict, report

data = (x=randn(100), y=randn(100))
y = @. 2 * cos(data.x * 12) + data.y ^ 2 - 2
# This also works: data = randn(100, 2)

model = SRRegressor(
    niterations=100,
    binary_operators=[+, *, /, -],
    unary_operators=[exp, cos],
    parallelism=:multithreading,
)

mach = machine(model, data, y)

# Train model
fit!(mach)

# View expressions
report(mach)
```

Just note that the |
-
Hey,
Here is my custom objective function:
Here is how I defined my model:
First, I verified that this objective function works correctly by using the Julia helper object to evaluate it on two example trees, given one example dataset X.
Then, I ran model.fit(X, y):
However, it has been over 25 minutes and the only thing printed to my terminal is this (which appeared about 1 or 2 minutes after I ran model.fit(X, y)):
In my past runs of model.fit(X, y), it finished within 5 minutes or less.
I think the only thing I changed was replacing `integral = 1` with `global integral = 1`, which means that I am now actually updating the values of `integral` (and `norm_constant`) correctly.
Before, `integral` (and `norm_constant`) were always 1, because the assignment was rebinding a local variable rather than updating the global one.
So, my main question is: