In this paper, we present a new framework, called “RYU”, for constructing “safe” regions, that is, sets guaranteed to contain the dual solution of a target optimization problem. Our framework applies to the standard setting in which the objective is the sum of two terms: a closed, proper, convex function with a Lipschitz-continuous gradient and another closed, proper, convex function. We show that the RYU framework not only encompasses but also improves upon the state-of-the-art methods proposed over the past decade for this class of optimization problems.
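For concreteness, a minimal sketch of the setting described above is given next; the symbols $f$, $g$, $x$, $u^\star$, $c$, and $r$ are illustrative placeholders and need not match the notation used in the paper. The primal problem is assumed to take the composite form
\[
\min_{x \in \mathbb{R}^n} \; f(x) + g(x),
\]
where $f$ is closed, proper, and convex with a Lipschitz-continuous gradient, and $g$ is closed, proper, and convex. A region $\mathcal{R}$ is called safe if it is guaranteed to contain the dual solution $u^\star$ of this problem; a safe ball is the particular case
\[
\mathcal{R} = \mathcal{B}(c, r) = \{\, u : \|u - c\| \le r \,\}, \qquad u^\star \in \mathcal{B}(c, r),
\]
for some center $c$ and radius $r$.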
Thu-Le Tran; Clément Elvira; Hong-Phuong Dang; Cédric Herzet

Thu-Le Tran; Clément Elvira; Hong-Phuong Dang; Cédric Herzet. One to beat them all: “RYU” – a unifying framework for the construction of safe balls. Open Journal of Mathematical Optimization, Volume 6 (2025), article no. 7, 16 p. doi: 10.5802/ojmo.45. https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.45/