Compatible with JuliaGPU #162
Comments
Most optimisation algorithms do not use the GPU. This is not a limitation of Nonconvex but of the underlying algorithm packages. If you know of a package that implements a GPU-accelerated algorithm, please open an issue and I can try to wrap it.
Even if the algorithm itself doesn't support GPUs, the more common use of GPUs in optimization is in evaluating the objective or constraint functions and their gradients. These can be written using GPU operations. Since we use Zygote by default, functions that operate on GPU arrays should differentiate fine with autodiff in some cases; in other cases, you may need to define your own gradient function and pass it to Nonconvex. Either way, nothing stops you from writing your own functions and gradients using the JuliaGPU ecosystem and handing them to Nonconvex for optimization. In that case, the optimization algorithm itself runs on the CPU, deciding which point to evaluate next.
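To illustrate the pattern described above, here is a minimal sketch of a GPU-evaluated objective passed to Nonconvex. It assumes CUDA.jl, Zygote.jl, and Nonconvex.jl are installed and a CUDA-capable GPU is available; the Ipopt backend and the problem data (`A`, `b`) are illustrative choices, not part of the original discussion, and exact API details should be checked against the Nonconvex docs.

```julia
# Sketch only: assumes CUDA.jl, Zygote.jl, and Nonconvex.jl are installed
# and a CUDA-capable GPU is available.
using CUDA, Zygote, Nonconvex
Nonconvex.@load Ipopt  # illustrative gradient-based backend

# Problem data lives on the GPU; the heavy linear algebra runs there.
const A = CUDA.rand(1000, 10)
const b = CUDA.rand(1000)

# Least-squares objective evaluated on the GPU. Zygote can often
# differentiate through CuArray operations like these automatically.
function obj(x)
    xg = CuArray(x)        # move the (small) decision vector to the GPU
    r = A * xg .- b        # GPU matrix-vector product
    return sum(abs2, r)    # scalar result returned to the CPU
end

# The optimizer itself runs on the CPU, proposing the next point;
# only the objective (and gradient) evaluations use the GPU.
model = Model(obj)
addvar!(model, fill(-10.0, 10), fill(10.0, 10))
result = optimize(model, IpoptAlg(), zeros(10))
```

If Zygote cannot differentiate part of the GPU code, the same structure works with a hand-written gradient: define a `ChainRulesCore.rrule` for the offending function so Zygote picks it up.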
I am closing this issue since it is not really actionable. If you have a specific case that you want to use the GPU for, I will be happy to help with that. In that case, please open another issue.
Hello!
Is Nonconvex.jl compatible with JuliaGPU to speed up the optimization process?