A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives
Open Journal of Mathematical Optimization, Volume 3 (2022), article no. 1, 15 p.

In this short note, we provide a simple version of an accelerated forward-backward method (a.k.a. Nesterov’s accelerated proximal gradient method) that may rely on approximate proximal operators and that can exploit strong convexity of the objective function. The method supports both relative and absolute errors, and its behavior is illustrated on a set of standard numerical experiments.

Using the same developments, we further provide a version of the accelerated hybrid proximal extragradient method of [21] that can exploit strong convexity of the objective function.
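For orientation, the following is a minimal sketch of a plain accelerated forward-backward (FISTA-type) iteration with an exact proximal step, in the spirit of [5, 22]. The callbacks grad_f and prox_g, the smoothness constant L, and the starting point x0 are illustrative assumptions; the inexactness (relative and absolute error) and strong-convexity mechanisms studied in the note are not reproduced here.

import numpy as np

def accelerated_forward_backward(grad_f, prox_g, L, x0, n_iter=100):
    """Plain FISTA-type accelerated forward-backward iteration (exact prox).

    grad_f : gradient of the L-smooth convex part f
    prox_g : prox_g(z, step) returns prox_{step * g}(z); this is the step that
             the note allows to be computed only approximately
    L      : smoothness constant of f
    x0     : starting point (NumPy array)
    """
    x = np.asarray(x0, dtype=float)
    x_prev = x.copy()
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        # forward (gradient) step on f followed by backward (proximal) step on g
        x_prev, x = x, prox_g(y - grad_f(y) / L, 1.0 / L)
        # standard momentum/extrapolation update
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        t = t_next
    return x

# Toy usage on a lasso problem f(x) = 0.5 * ||A x - b||^2, g(x) = lam * ||x||_1,
# whose proximal operator (soft-thresholding) is available in closed form.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((50, 20)), rng.standard_normal(50), 0.1
    grad_f = lambda x: A.T @ (A @ x - b)
    prox_g = lambda z, step: np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)
    L = np.linalg.norm(A, 2) ** 2  # squared spectral norm = smoothness constant
    x_hat = accelerated_forward_backward(grad_f, prox_g, L, np.zeros(20), n_iter=300)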

DOI: 10.5802/ojmo.12
Mathieu Barré 1; Adrien Taylor 1; Francis Bach 1

1 INRIA (Sierra project-team) – Dépt. d’informatique, Ecole normale supérieure, CNRS, PSL Research University, Paris, France
License: CC-BY 4.0
Copyrights: The authors retain unrestricted copyrights and publishing rights
@article{OJMO_2022__3__A1_0,
     author = {Mathieu Barr\'e and Adrien Taylor and Francis Bach},
     title = {A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives},
     journal = {Open Journal of Mathematical Optimization},
     eid = {1},
     pages = {1--15},
     publisher = {Universit\'e de Montpellier},
     volume = {3},
     year = {2022},
     doi = {10.5802/ojmo.12},
     language = {en},
     url = {https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.12/}
}
Mathieu Barré; Adrien Taylor; Francis Bach. A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives. Open Journal of Mathematical Optimization, Volume 3 (2022), article no. 1, 15 p. doi: 10.5802/ojmo.12. https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.12/

[1] Maicon M. Alves Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods (2021) (https://arxiv.org/abs/2102.02045v1)

[2] Nikhil Bansal; Anupam Gupta Potential-function proofs for gradient methods, Theory Comput., Volume 15 (2019) no. 1, 4, 32 pages | MR | Zbl

[3] Mathieu Barré; Adrien Taylor; Francis Bach Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators (2020) (https://arxiv.org/abs/2006.06041v2)

[4] Mathieu Barré; Adrien Taylor; Francis Bach Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators (2021) (https://arxiv.org/abs/2006.06041)

[5] Amir Beck; Marc Teboulle A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM J. Imaging Sci., Volume 2 (2009) no. 1, pp. 183-202 | DOI | MR | Zbl

[6] Yunier Bello-Cruz; Max L. N. Gonçalves; Nathan Krislock On Inexact Accelerated Proximal Gradient Methods with Relative Error Rules (2020) (https://arxiv.org/abs/2005.03766)

[7] Arne Brøndsted; Ralph T. Rockafellar On the subdifferentiability of convex functions, Proc. Am. Math. Soc., Volume 16 (1965) no. 4, pp. 605-611 | DOI | MR | Zbl

[8] Regina S. Burachik; Alfredo N. Iusem; Benar F. Svaiter Enlargement of monotone operators with applications to variational inequalities, Set-Valued Anal., Volume 5 (1997) no. 2, pp. 159-180 | DOI | MR | Zbl

[9] Regina S. Burachik; Claudia A. Sagastizábal; Benar F. Svaiter ε-Enlargements of maximal monotone operators: Theory and applications, Reformulation: nonsmooth, piecewise smooth, semismooth and smoothing methods, Springer, 1998, pp. 25-43 | Zbl

[10] Regina S. Burachik; Susana Scheimberg; Benar F. Svaiter Robustness of the hybrid extragradient proximal-point algorithm, J. Optim. Theory Appl., Volume 111 (2001) no. 1, pp. 117-136 | DOI | MR | Zbl

[11] Antonin Chambolle; Thomas Pock An introduction to continuous optimization for imaging, Acta Numer., Volume 25 (2016), pp. 161-319 | DOI | MR | Zbl

[12] Chih-Chung Chang; Chih-Jen Lin LIBSVM: A library for support vector machines, ACM Trans. Intell. Syst. Technol., Volume 2 (2011), 27, 37 pages (Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm)

[13] Giovanni Chierchia; Emilie Chouzenoux; Patrick L. Combettes; Jean-Christophe Pesquet The Proximity Operator Repository. User’s guide (2020) http://proximity-operator.net/download/guide.pdf

[14] Alexandre d’Aspremont; Damien Scieur; Adrien Taylor Acceleration methods, Foundations and Trends® in Optimization, Volume 5 (2021) no. 1-2, pp. 1-245 | DOI

[15] Osman Güler New proximal point algorithms for convex minimization, SIAM J. Optim., Volume 2 (1992) no. 4, pp. 649-664 | DOI | MR | Zbl

[16] Jean-Baptiste Hiriart-Urruty; Claude Lemaréchal Convex analysis and minimization algorithms I: Fundamentals, Grundlehren der Mathematischen Wissenschaften, 305, Springer, 2013

[17] Rodolphe Jenatton; Julien Mairal; Guillaume Obozinski; Francis Bach Proximal methods for sparse hierarchical dictionary learning, Proceedings of the 27th International Conference on Machine Learning (ICML) (2010), pp. 487-494 | Zbl

[18] Kaifeng Jiang; Defeng Sun; Kim-Chuan Toh An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP, SIAM J. Optim., Volume 22 (2012) no. 3, pp. 1042-1064 | DOI | MR | Zbl

[19] Julien Mairal; Rodolphe Jenatton; Guillaume Obozinski; Francis Bach Convex and network flow optimization for structured sparsity, J. Mach. Learn. Res., Volume 12 (2011) no. Sep, pp. 2681-2720 | MR | Zbl

[20] Reinier D. Millán; Majela P. Machado Inexact proximal epsilon-subgradient methods for composite convex optimization problems, J. Glob. Optim., Volume 75 (2019) no. 4, pp. 1029-1060 | DOI | Zbl

[21] Renato D. C. Monteiro; Benar F. Svaiter An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods, SIAM J. Optim., Volume 23 (2013) no. 2, pp. 1092-1125 | DOI | MR | Zbl

[22] Yurii Nesterov A method of solving a convex programming problem with convergence rate O(1/k²), Sov. Math., Dokl., Volume 27 (1983), pp. 372-376

[23] Yurii Nesterov Introductory Lectures on Convex Optimization: a Basic Course, Applied Optimization, Kluwer Academic Publishers, 2004 | DOI

[24] Yurii Nesterov Gradient methods for minimizing composite functions, Math. Program., Volume 140 (2013) no. 1, pp. 125-161 | DOI | MR | Zbl

[25] Neal Parikh; Stephen Boyd Proximal algorithms, Foundations and Trends® in Optimization, Volume 1 (2014) no. 3, pp. 127-239 | DOI

[26] Ralph T. Rockafellar Convex Analysis, Princeton Landmarks in Mathematics, Princeton University Press, 1997

[27] Leonid Rudin; Stanley Osher Total variation based image restoration with free local constraints, Proceedings of the 1st International Conference on Image Processing (1994), pp. 31-35 | DOI

[28] Leonid Rudin; Stanley Osher; Emad Fatemi Nonlinear total variation based noise removal algorithms, Physica D, Volume 60 (1992) no. 1-4, pp. 259-268 | DOI | MR | Zbl

[29] Ernest K. Ryu; Stephen Boyd Primer on monotone operator methods, Appl. Comput. Math., Volume 15 (2016) no. 1, pp. 3-43 | MR | Zbl

[30] Saverio Salzo; Silvia Villa Inexact and Accelerated Proximal Point Algorithms, J. Convex Anal., Volume 19 (2012) no. 4, pp. 1167-1192 | MR | Zbl

[31] Mark Schmidt; Nicolas Le Roux; Francis Bach Convergence rates of inexact proximal-gradient methods for convex optimization, Advances in Neural Information Processing Systems (NIPS) (2011), pp. 1458-1466

[32] Mikhail V. Solodov; Benar F. Svaiter A hybrid approximate extragradient–proximal point algorithm using the enlargement of a maximal monotone operator, Set-Valued Anal., Volume 7 (1999) no. 4, pp. 323-345 | DOI | MR | Zbl

[33] Mikhail V. Solodov; Benar F. Svaiter A comparison of rates of convergence of two inexact proximal point algorithms, Nonlinear optimization and related topics, Springer, 2000, pp. 415-427 | DOI | Zbl

[34] Adrien Taylor; Julien M. Hendrickx; François Glineur Smooth strongly convex interpolation and exact worst-case performance of first-order methods, Math. Program., Volume 161 (2017) no. 1-2, pp. 307-345 | DOI | MR | Zbl

[35] Paul Tseng On accelerated proximal gradient methods for convex-concave optimization (2008) (Technical report)

[36] Silvia Villa; Saverio Salzo; Luca Baldassarre; Alessandro Verri Accelerated and inexact forward-backward algorithms, SIAM J. Optim., Volume 23 (2013) no. 3, pp. 1607-1633 | DOI | MR | Zbl

[37] Yilun Wang; Junfeng Yang; Wotao Yin; Yin Zhang A new alternating minimization algorithm for total variation image reconstruction, SIAM J. Imaging Sci., Volume 1 (2008) no. 3, pp. 248-272 | DOI | MR | Zbl

[38] Ashia C. Wilson; Ben Recht; Michael I. Jordan A Lyapunov Analysis of Accelerated Methods in Optimization, J. Mach. Learn. Res., Volume 22 (2021) no. 113, pp. 1-34 | MR | Zbl
