A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives
Open Journal of Mathematical Optimization, Volume 3 (2022), article no. 1, 15 p.

In this short note, we provide a simple version of an accelerated forward-backward method (a.k.a. Nesterov’s accelerated proximal gradient method) that may rely on approximate proximal operators and that can exploit strong convexity of the objective function. The method supports both relative and absolute errors, and its behavior is illustrated on a set of standard numerical experiments.

Using the same developments, we further provide a version of the accelerated hybrid proximal extragradient method of [21] that can exploit strong convexity of the objective function.
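For orientation, the exact (error-free) accelerated forward-backward iteration underlying the note can be sketched as below. This is a minimal illustrative sketch of a standard FISTA-type scheme, not the paper's inexact method: the function names, the constant-momentum rule for the strongly convex case, and the lasso example are all assumptions added here; the note itself additionally handles proximal operators evaluated with absolute and relative errors.

```python
import numpy as np

def accelerated_forward_backward(grad_f, prox_g, L, x0, n_iters=500, mu=0.0):
    """FISTA-type accelerated proximal gradient (exact prox, for illustration).

    grad_f : gradient of the L-smooth part f
    prox_g : proximal operator of the nonsmooth part, called as prox_g(y, step)
    mu     : strong convexity parameter; if mu > 0, a constant momentum
             coefficient based on the condition number L/mu is used.
    """
    x, y = x0.copy(), x0.copy()
    t = 1.0
    if mu > 0:
        q = mu / L
        beta = (1 - np.sqrt(q)) / (1 + np.sqrt(q))
    for _ in range(n_iters):
        # forward (gradient) step followed by backward (proximal) step
        x_next = prox_g(y - grad_f(y) / L, 1.0 / L)
        # extrapolation (momentum) step
        if mu > 0:
            y = x_next + beta * (x_next - x)
        else:
            t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
            y = x_next + ((t - 1) / t_next) * (x_next - x)
            t = t_next
        x = x_next
    return x

# Illustrative instance: lasso, f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)
# soft-thresholding: the (exact) proximal operator of lam*||.||_1
prox_g = lambda y, step: np.sign(y) * np.maximum(np.abs(y) - lam * step, 0.0)
x_star = accelerated_forward_backward(grad_f, prox_g, L, np.zeros(10))
```

The inexact variants studied in the note replace the exact `prox_g` call by an approximate evaluation whose accuracy is controlled through absolute or relative error criteria.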

DOI: 10.5802/ojmo.12
Mathieu Barré 1; Adrien Taylor 1; Francis Bach 1

1 INRIA (Sierra project-team) – Dépt. d’informatique, Ecole normale supérieure, CNRS, PSL Research University, Paris, France
@article{OJMO_2022__3__A1_0,
     author = {Mathieu Barr\'e and Adrien Taylor and Francis Bach},
     title = {A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives},
     journal = {Open Journal of Mathematical Optimization},
     eid = {1},
     publisher = {Universit\'e de Montpellier},
     volume = {3},
     year = {2022},
     doi = {10.5802/ojmo.12},
     language = {en},
     url = {https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.12/}
}

[1] Maicon M. Alves Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods (2021) (https://arxiv.org/abs/2102.02045v1)

[2] Nikhil Bansal; Anupam Gupta Potential-function proofs for gradient methods, Theory Comput., Volume 15 (2019) no. 1, 4, 32 pages | MR: 4003838 | Zbl: 07140481

[3] Mathieu Barré; Adrien Taylor; Francis Bach Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators (2020) (https://arxiv.org/abs/2006.06041v2)

[4] Mathieu Barré; Adrien Taylor; Francis Bach Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators (2021) (https://arxiv.org/abs/2006.06041)

[5] Amir Beck; Marc Teboulle A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM J. Imaging Sci., Volume 2 (2009) no. 1, pp. 183-202 | Article | MR: 2486527 | Zbl: 1175.94009

[6] Yunier Bello-Cruz; Max L. N. Gonçalves; Nathan Krislock On Inexact Accelerated Proximal Gradient Methods with Relative Error Rules (2020) (https://arxiv.org/abs/2005.03766)

[7] Arne Brøndsted; Ralph T. Rockafellar On the subdifferentiability of convex functions, Proc. Am. Math. Soc., Volume 16 (1965) no. 4, pp. 605-611 | Article | MR: 178103 | Zbl: 0141.11801

[8] Regina S. Burachik; Alfredo N. Iusem; Benar F. Svaiter Enlargement of monotone operators with applications to variational inequalities, Set-Valued Anal., Volume 5 (1997) no. 2, pp. 159-180 | Article | MR: 1463929 | Zbl: 0882.90105

[9] Regina S. Burachik; Claudia A. Sagastizábal; Benar F. Svaiter ε-Enlargements of maximal monotone operators: Theory and applications, Reformulation: nonsmooth, piecewise smooth, semismooth and smoothing methods, Springer, 1998, pp. 25-43 | Zbl: 0928.65068

[10] Regina S. Burachik; Susana Scheimberg; Benar F. Svaiter Robustness of the hybrid extragradient proximal-point algorithm, J. Optim. Theory Appl., Volume 111 (2001) no. 1, pp. 117-136 | Article | MR: 1850681 | Zbl: 1054.90088

[11] Antonin Chambolle; Thomas Pock An introduction to continuous optimization for imaging, Acta Numer., Volume 25 (2016), pp. 161-319 | Article | MR: 3509209 | Zbl: 1343.65064

[12] Chih-Chung Chang; Chih-Jen Lin LIBSVM: A library for support vector machines, ACM Trans. Intell. Syst. Technol., Volume 2 (2011), 27, 37 pages (Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm)

[13] Giovanni Chierchia; Emilie Chouzenoux; Patrick L. Combettes; Jean-Christophe Pesquet The Proximity Operator Repository. User’s guide (2020) http://proximity-operator.net/download/guide.pdf

[14] Alexandre d’Aspremont; Damien Scieur; Adrien Taylor Acceleration methods, Foundations and Trends® in Optimization, Volume 5 (2021) no. 1-2, pp. 1-245 | Article

[15] Osman Güler New proximal point algorithms for convex minimization, SIAM J. Optim., Volume 2 (1992) no. 4, pp. 649-664 | Article | MR: 1186167 | Zbl: 0778.90052

[16] Jean-Baptiste Hiriart-Urruty; Claude Lemaréchal Convex analysis and minimization algorithms I: Fundamentals, Grundlehren der Mathematischen Wissenschaften, 305, Springer, 2013

[17] Rodolphe Jenatton; Julien Mairal; Guillaume Obozinski; Francis Bach Proximal methods for sparse hierarchical dictionary learning, Proceedings of the 27th International Conference on International Conference on Machine Learning (ICML) (2010), pp. 487-494 | Zbl: 1280.94029

[18] Kaifeng Jiang; Defeng Sun; Kim-Chuan Toh An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP, SIAM J. Optim., Volume 22 (2012) no. 3, pp. 1042-1064 | Article | MR: 3023763 | Zbl: 1401.90120

[19] Julien Mairal; Rodolphe Jenatton; Guillaume Obozinski; Francis Bach Convex and network flow optimization for structured sparsity, J. Mach. Learn. Res., Volume 12 (2011), pp. 2681-2720 | MR: 2845676 | Zbl: 1280.68179

[20] Reinier D. Millán; Majela P. Machado Inexact proximal epsilon-subgradient methods for composite convex optimization problems, J. Glob. Optim., Volume 75 (2019) no. 4, pp. 1029-1060 | Article | Zbl: 1461.65176

[21] Renato D. C. Monteiro; Benar F. Svaiter An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods, SIAM J. Optim., Volume 23 (2013) no. 2, pp. 1092-1125 | Article | MR: 3063151 | Zbl: 1298.90071

[22] Yurii Nesterov A method of solving a convex programming problem with convergence rate O(1/k^2), Sov. Math., Dokl., Volume 27 (1983), pp. 372-376

[23] Yurii Nesterov Introductory Lectures on Convex Optimization: a Basic Course, Applied Optimization, Kluwer Academic Publishing, 2004 | Article

[24] Yurii Nesterov Gradient methods for minimizing composite functions, Math. Program., Volume 140 (2013) no. 1, pp. 125-161 | Article | MR: 3071865 | Zbl: 1287.90067

[25] Neal Parikh; Stephen Boyd Proximal algorithms, Foundations and Trends® in Optimization, Volume 1 (2014) no. 3, pp. 127-239 | Article

[26] Ralph T. Rockafellar Convex Analysis, Princeton Landmarks in Mathematics, Princeton University Press, 1997

[27] Leonid Rudin; Stanley Osher Total variation based image restoration with free local constraints, Proceedings of 1st International Conference on Image Processing (1994), pp. 31-35 | Article

[28] Leonid Rudin; Stanley Osher; Emad Fatemi Nonlinear total variation based noise removal algorithms, Physica D, Volume 60 (1992) no. 1-4, pp. 259-268 | Article | MR: 3363401 | Zbl: 0780.49028

[29] Ernest K. Ryu; Stephen Boyd Primer on monotone operator methods, Appl. Comput. Math., Volume 15 (2016) no. 1, pp. 3-43 | MR: 3467167 | Zbl: 1342.47066

[30] Saverio Salzo; Silvia Villa Inexact and Accelerated Proximal Point Algorithms, J. Convex Anal., Volume 19 (2012) no. 4, pp. 1167-1192 | MR: 3059059 | Zbl: 1283.90030

[31] Mark Schmidt; Nicolas Le Roux; Francis Bach Convergence rates of inexact proximal-gradient methods for convex optimization, Advances in neural information processing systems (NIPS) (2011), pp. 1458-1466

[32] Mikhail V. Solodov; Benar F. Svaiter A hybrid approximate extragradient–proximal point algorithm using the enlargement of a maximal monotone operator, Set-Valued Anal., Volume 7 (1999) no. 4, pp. 323-345 | Article | MR: 1756912 | Zbl: 0959.90038

[33] Mikhail V. Solodov; Benar F. Svaiter A comparison of rates of convergence of two inexact proximal point algorithms, Nonlinear optimization and related topics, Springer, 2000, pp. 415-427 | Article | Zbl: 0986.90081

[34] Adrien Taylor; Julien M. Hendrickx; François Glineur Smooth strongly convex interpolation and exact worst-case performance of first-order methods, Math. Program., Volume 161 (2017) no. 1-2, pp. 307-345 | Article | MR: 3592780 | Zbl: 1359.90098

[35] Paul Tseng On accelerated proximal gradient methods for convex-concave optimization (2008) (Technical report)

[36] Silvia Villa; Saverio Salzo; Luca Baldassarre; Alessandro Verri Accelerated and inexact forward-backward algorithms, SIAM J. Optim., Volume 23 (2013) no. 3, pp. 1607-1633 | Article | MR: 3085126 | Zbl: 1295.90049

[37] Yilun Wang; Junfeng Yang; Wotao Yin; Yin Zhang A new alternating minimization algorithm for total variation image reconstruction, SIAM J. Imaging Sci., Volume 1 (2008) no. 3, pp. 248-272 | Article | MR: 2486032 | Zbl: 1187.68665

[38] Ashia C. Wilson; Ben Recht; Michael I. Jordan A Lyapunov Analysis of Accelerated Methods in Optimization, J. Mach. Learn. Res., Volume 22 (2021) no. 113, pp. 1-34 | MR: 4279764 | Zbl: 07370630
