
Optimization with IPOPT, "memoization" technique seems to be overridden #169

Closed
mranneberg opened this issue Mar 22, 2024 · 7 comments

@mranneberg

Like probably a lot of real-world applications, I have a single function that computes the objective ("target"), the equality constraints, and the gradients of both in one call.
To make this work with the framework I want to use the "memoization trick", which for the target function looks e.g. like this:

# Helper function / memoization: run the expensive evaluation once per new x
# and serve the value, gradient and Hessian from the cached results.
function fgh_Target(I, x0)
    last_x = nothing   # argument of the most recent evaluation
    L = 0.0            # cached objective value
    dL = 0.0           # cached gradient
    ddL = 0.0          # cached Hessian
    function update(x)
        if x != last_x || true # I need to force this to true for IPOPT
            #@info "Updating $x"
            x0[I] = x
            L, dLF, ddLF = eval_lagrange_design(x0)
            last_x = x
            dL = dLF[I]
            ddL = ddLF[I, I]
        end
        return
    end
    function f(x)
        update(x)
        return L
    end
    function ∇f(x)
        update(x)
        return dL
    end
    function Hf(x)
        update(x)
        return ddL
    end
    return f, ∇f, Hf
end

If I do not force the `if` in `update(x)` to be always true and call `f(x)` in plain Julia with different arguments, it correctly updates all values for each new `x`. But if I use it for IPOPT like this

    t, ∇t, ∇²t = fgh_Target(I,x,param,windspeed,altitude,Pnom,init_solution,optimize_plant,opt_weights)
    T = CustomHessianFunction(t, ∇t, ∇²t)
    model = Model(T)

it will not update the cached values, and I have to make the `if` always true.

Before I go about defining a dummy problem and sharing it, does somebody already know what's going on? Maybe some internal optimization of function handles, or a specific behaviour of `CustomHessianFunction`?

@mohamed82008
Member

I will look into it. But would it be easier to define a dummy variable `O`, make that the objective and then add a new equality constraint `O - obj(x) = 0` where `obj` is the objective function?
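For reference, a rough sketch of that reformulation in plain Julia (`obj` is a placeholder for the expensive objective; all names here are illustrative, not Nonconvex API):

```julia
# Reformulation sketch: append a dummy variable O as the last entry of the
# decision vector z, minimize O itself, and tie it to the real objective
# through an equality constraint O - obj(x) = 0.
obj(x) = sum(abs2, x)             # placeholder for the expensive objective

f(z) = z[end]                     # new objective: just the dummy variable O
h(z) = z[end] - obj(z[1:end-1])   # new equality constraint, enforced == 0
```

The point, presumably, is that the expensive evaluation then lives entirely inside the constraint functions instead of being split between objective and constraints.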

@mranneberg
Author

> I will look into it. But would it be easier to define a dummy variable `O`, make that the objective and then add a new equality constraint `O - obj(x) = 0` where `obj` is the objective function?

Thanks for the reply! Sorry, I was not very clear. The same thing happens with equality constraint functions: the memoization "trick" fails for some reason when used with the "Custom..." functions in IPOPT. I will put together a small example next week.

@mranneberg
Author

Added an example in #168.

@mranneberg
Author

Hint: I tried using IPOPT through NLPModelsIpopt and hit exactly the same issue.

@mranneberg
Author

mranneberg commented Apr 25, 2024

It works with `if x[:] != last_x[:]` and `last_x .= x`.

I saw the hints here: jump-dev/Ipopt.jl#34
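For future readers, a minimal corrected sketch of the helper above, assuming `x` is a plain `Vector{Float64}` of known length `n` (`fgh_Target_fixed` and the `n` argument are illustrative; `eval_lagrange_design` and `I` are as in the original snippet):

```julia
# Corrected sketch: cache a VALUE snapshot of x instead of a reference,
# so in-place mutation of the solver's buffer cannot keep x == last_x true.
function fgh_Target_fixed(I, x0, n)
    last_x = fill(NaN, n)          # NaN != NaN, so the first call always updates
    L = 0.0
    dL = zeros(n)
    ddL = zeros(n, n)
    function update(x)
        if x != last_x             # works now: last_x is an independent copy
            x0[I] = x
            L, dLF, ddLF = eval_lagrange_design(x0)
            last_x .= x            # copy the values, do NOT rebind
            dL = dLF[I]
            ddL = ddLF[I, I]
        end
        return
    end
    f(x) = (update(x); L)
    ∇f(x) = (update(x); dL)
    Hf(x) = (update(x); ddL)
    return f, ∇f, Hf
end
```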

This is not an issue with Nonconvex or NLPModels, but with the semantics of rebinding versus in-place mutation hiding behind an innocent-looking `a = b`: `last_x = x` merely rebinds `last_x` to the same array that gets overwritten in place, so the comparison never sees a difference, whereas `last_x .= x` copies the values. Closing.

@mohamed82008
Member

Great to hear that this is resolved. Sorry I haven't been very responsive because I am in the middle of relocating.

@mranneberg
Author

No worries.
It's still weird, though, that the issue only comes up inside IPOPT and not when calling the functions directly in Julia.
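A plausible explanation, consistent with the jump-dev/Ipopt.jl#34 discussion linked above: the Ipopt wrapper hands the same underlying array to every callback and overwrites it in place, while a manual `f(x)` call in the REPL typically gets a freshly allocated array each time, which hides the aliasing. A solver-free demonstration:

```julia
x = [1.0, 2.0]
last_x = x           # rebinding: last_x and x are the SAME array
x[1] = 99.0          # in-place mutation, as a solver does with its buffer
@show x != last_x    # false -- the cache check can never fire

last_x = copy(x)     # value snapshot instead of an alias
x[2] = -1.0
@show x != last_x    # true -- the change is detected
```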
