Proximal algorithms (also known as "splitting" algorithms or methods) for nonsmooth optimization in Julia.
This package can be used in combination with ProximalOperators.jl (providing first-order primitives, i.e. gradient and proximal mapping, for numerous cost functions) and AbstractOperators.jl (providing several linear and nonlinear operators) to formulate and solve a wide spectrum of nonsmooth optimization problems.
StructuredOptimization.jl provides a higher-level interface to formulate and solve problems using (some of) the algorithms here included.
To install the package, simply issue the following command in the Julia REPL:

```julia
] add ProximalAlgorithms
```
Check out these test scripts for examples of how to apply the provided algorithms to problems.
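As a quick sketch of typical usage, the following solves a small lasso problem with the forward-backward algorithm, combining this package with ProximalOperators.jl. The problem data here is made up for illustration, and the solver-construction and keyword names (`maxit`, `tol`, `x0`, `f`, `g`) reflect one version of the package's API and may differ in yours, so check the documentation:

```julia
using ProximalOperators, ProximalAlgorithms

# Toy lasso problem: minimize 0.5*||A*x - b||^2 + lambda*||x||_1
A = [1.0 2.0; 3.0 4.0; 5.0 6.0]
b = [1.0, 2.0, 3.0]
lambda = 0.1

f = LeastSquares(A, b)   # smooth term: gradient available
g = NormL1(lambda)       # nonsmooth term: proximal mapping available

# Construct the solver object, then call it on the problem data
solver = ProximalAlgorithms.ForwardBackward(maxit=1000, tol=1e-6)
solution, iterations = solver(x0=zeros(2), f=f, g=g)
```

All solvers in the table below follow this same pattern: construct an algorithm object with its options, then call it with the problem's first-order primitives.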
Algorithm | Function | Reference |
---|---|---|
Douglas-Rachford splitting algorithm | `DouglasRachford` | [1] |
Forward-backward splitting (i.e. proximal gradient) algorithm | `ForwardBackward` | [2], [3] |
Vũ-Condat primal-dual algorithm | `VuCondat` | [4], [6], [7] |
Davis-Yin splitting algorithm | `DavisYin` | [9] |
Asymmetric forward-backward-adjoint algorithm | `AFBA` | [10] |
PANOC (L-BFGS) | `PANOC` | [11] |
ZeroFPR (L-BFGS) | `ZeroFPR` | [12] |
Douglas-Rachford line-search (L-BFGS) | `DRLS` | [13] |
Contributions are welcome in the form of issues or pull requests. We recommend looking at already implemented algorithms for inspiration on how to structure new ones.
[1] Eckstein, Bertsekas, On the Douglas-Rachford Splitting Method and the Proximal Point Algorithm for Maximal Monotone Operators, Mathematical Programming, vol. 55, no. 1, pp. 293-318 (1992).
[2] Tseng, On Accelerated Proximal Gradient Methods for Convex-Concave Optimization (2008).
[3] Beck, Teboulle, A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems, SIAM Journal on Imaging Sciences, vol. 2, no. 1, pp. 183-202 (2009).
[4] Chambolle, Pock, A First-Order Primal-Dual Algorithm for Convex Problems with Applications to Imaging, Journal of Mathematical Imaging and Vision, vol. 40, no. 1, pp. 120-145 (2011).
[5] Boyd, Parikh, Chu, Peleato, Eckstein, Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, Foundations and Trends in Machine Learning, vol. 3, no. 1, pp. 1-122 (2011).
[6] Vũ, A splitting algorithm for dual monotone inclusions involving cocoercive operators, Advances in Computational Mathematics, vol. 38, no. 3, pp. 667-681 (2013).
[7] Condat, A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms, Journal of Optimization Theory and Applications, vol. 158, no. 2, pp. 460-479 (2013).
[8] Parikh, Boyd, Proximal Algorithms, Foundations and Trends in Optimization, vol. 1, no. 3, pp. 127-239 (2014).
[9] Davis, Yin, A Three-Operator Splitting Scheme and its Optimization Applications, Set-Valued and Variational Analysis, vol. 25, no. 4, pp. 829-858 (2017).
[10] Latafat, Patrinos, Asymmetric forward-backward-adjoint splitting for solving monotone inclusions involving three operators, Computational Optimization and Applications, vol. 68, no. 1, pp. 57-93 (2017).
[11] Stella, Themelis, Sopasakis, Patrinos, A simple and efficient algorithm for nonlinear model predictive control, 56th IEEE Conference on Decision and Control (2017).
[12] Themelis, Stella, Patrinos, Forward-backward envelope for the sum of two nonconvex functions: Further properties and nonmonotone line-search algorithms, SIAM Journal on Optimization, vol. 28, no. 3, pp. 2274-2303 (2018).
[13] Themelis, Stella, Patrinos, Douglas-Rachford splitting and ADMM for nonconvex optimization: Accelerated and Newton-type algorithms, arXiv preprint (2020).