Optimization with IPOPT, "memoization" technique seems to be overridden #169
Comments
I will look into it. But would it be easier to define a dummy variable?
Thanks for the reply! Sorry, I was not very clear. The same thing happens with equality constraint functions: the memoization "trick" fails for some reason when used with the "Custom..." function in Ipopt. I will put together a small example next week.
Added example in #168.
Hint:
It works now. I saw the hints here: jump-dev/Ipopt.jl#34. This is not an issue with Nonconvex or NLPModels, but with the magic of rebinding, overwriting, and the obtuse things we all do when writing a = b. Closing.
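For readers landing here later, a minimal sketch of the aliasing pitfall alluded to above (illustrative variable names, not code from this issue): when a solver reuses and mutates one input vector in place, a cache stored via `newx = x` aliases that very vector, so a change check like `if x != newx` never fires; storing a copy behaves as intended.

```julia
# Hypothetical illustration of the aliasing pitfall (not the original code).
x = [1.0, 2.0]          # pretend this is the one vector a solver mutates in place

newx_alias = x          # "a = b": same array object as x, not a snapshot
newx_copy  = copy(x)    # independent copy of x's current values

x[1] = 99.0             # solver updates the iterate in place

@show newx_alias != x   # false: the alias changed along with x, so a cache
                        # guarded by `if x != newx` would never refresh
@show newx_copy != x    # true: the copy kept the old values, so the cache
                        # would be refreshed as intended
```

Storing a copy (`newx = copy(x)`) or copying in place (`copyto!(newx, x)` / `newx .= x`) keeps the change check meaningful when the caller reuses a single buffer.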
Great to hear that this is resolved. Sorry I haven't been very responsive because I am in the middle of relocating. |
No worries. |
Like probably a lot of real-world applications, I have a single function that calculates the objective, the equality constraints, and the gradients of both.
To make this work with the framework, I want to use the "memoization trick", e.g. for the objective function, roughly as sketched below.
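(The original snippet did not survive extraction; the following is a hedged reconstruction of the pattern being described, with placeholder computations. Only the names `update`, `newx`, and `f` come from the prose; everything else is illustrative.)

```julia
# Hedged reconstruction of the memoization pattern (placeholder math).
const newx  = fill(NaN, 2)        # last input for which the cache is valid
const cache = Dict{Symbol,Any}()  # shared results of the expensive call

function update(x)
    if x != newx                  # the "if" referred to below
        copyto!(newx, x)          # remember the input (a copy, see note above)
        # one expensive call computing objective, constraint, and gradients
        cache[:f]  = sum(abs2, x)
        cache[:g]  = sum(x) - 1.0
        cache[:∇f] = 2 .* x
        cache[:∇g] = ones(length(x))
    end
    return nothing
end

f(x)  = (update(x); cache[:f])    # memoized objective
∇f(x) = (update(x); cache[:∇f])   # memoized objective gradient

f([1.0, 2.0])    # computes and caches everything
∇f([1.0, 2.0])   # same input: reuses the cache, no recomputation
```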
If I do not set the "if" in update(x) to always be true and call f(x) in Julia with different arguments, it correctly updates all cached values for newx. But if I use it with IPOPT like this, it will not update, and I have to set the "if" to always be true.
Before I go about defining a dummy problem and sharing this, does somebody already know what's going on? Maybe some internal optimization for function handles, or a specific behaviour of CustomHessianFunction?