A Julia package for non-smooth optimization algorithms.
This package provides algorithms for the minimization of objective functions that include non-smooth terms, such as constraints or non-differentiable penalties. Implemented algorithms include:
- (Fast) Proximal gradient methods
- Douglas-Rachford splitting
- Three-term splitting
- Primal-dual splitting algorithms
- Newton-type methods
Check out this section for an overview of the available algorithms.
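As a quick illustration of the typical usage pattern, here is a minimal sketch of solving a lasso-type problem with the fast proximal gradient method. The keyword-based solver interface and parameter names shown here are assumptions based on common usage; check the documentation for the exact API.

```julia
# Sketch: minimize 1/2 ||A x - b||^2 + ||x||_1 with the fast proximal gradient
# (forward-backward) method. The solver interface below is an assumption; see the docs.
using ProximalAlgorithms, ProximalOperators
using Random

Random.seed!(0)
A, b = randn(30, 50), randn(30)
f = LeastSquares(A, b)   # smooth (differentiable) term
g = NormL1(1.0)          # non-smooth term, handled through its proximal mapping

algorithm = ProximalAlgorithms.FastForwardBackward(maxit=1000, tol=1e-6)
solution, iterations = algorithm(x0=zeros(50), f=f, g=g)
```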
Algorithms rely on:
- DifferentiationInterface.jl for automatic differentiation (but you can easily bring your own gradients)
- the ProximalCore API for proximal mappings, projections, etc., to handle non-differentiable terms (see, for example, ProximalOperators for an extensive collection of functions).
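For the non-differentiable part, custom terms can be plugged in by implementing the ProximalCore interface. The sketch below assumes the `ProximalCore.prox!(y, f, x, gamma)` signature (write the proximal point into `y` and return the value of the term there) and reimplements a scaled L1 norm as an example.

```julia
# Hypothetical custom non-smooth term: lambda * ||x||_1, implemented through the
# assumed ProximalCore.prox!(y, f, x, gamma) interface (soft-thresholding).
using ProximalCore

struct ScaledL1{R}
    lambda::R
end

function ProximalCore.prox!(y, f::ScaledL1, x, gamma)
    t = gamma * f.lambda
    y .= sign.(x) .* max.(abs.(x) .- t, 0)  # soft-thresholding step
    return f.lambda * sum(abs, y)           # value of the term at the prox point
end
```

An object like `ScaledL1(1.0)` could then be passed wherever the algorithms expect a non-smooth term, in the same way as the functions from ProximalOperators.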
Documentation:
- Stable version (latest release)
- Development version (master branch)
If you use any of the algorithms from ProximalAlgorithms in your research, you are kindly asked to cite the relevant bibliography. Please check this section of the manual for algorithm-specific references.
Contributions are welcome in the form of issue reports or pull requests. We recommend looking at already implemented algorithms for inspiration on how to structure new ones.