A partially derivative-free cyclic block coordinate descent method for nonseparable composite optimization
DOI: https://doi.org/10.3846/mma.2025.23064

Abstract
In this paper, we address a composite optimization problem whose objective function is the sum of two terms: the first is a function whose gradient satisfies a Lipschitz–Hölder condition, while the second is a convex function. In this general setting, we propose and analyze a new coordinate descent method that can operate without the use of derivatives. The algorithm is an adaptation of the coordinate proximal gradient method, designed specifically to exploit the composite form of the objective function. We carry out a complete worst-case complexity analysis under the assumption that the coordinates (or blocks of coordinates) are selected in a cyclic manner. In addition, we present academic numerical examples that illustrate the efficiency of our algorithm on practical problems.
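The abstract gives no pseudocode, so the following is a minimal sketch, in Python with NumPy, of the general pattern it describes: a cyclic sweep over coordinate blocks, where each block is updated by a proximal gradient step in which the gradient of the smooth term is replaced by a forward-difference estimate (the "partially derivative-free" ingredient). Every name here (`fd_block_gradient`, `cyclic_bcd`, the fixed step size, and the toy least-squares-plus-l1 instance) is an illustrative assumption, not the authors' notation or their actual method, which comes with adaptive safeguards and a worst-case complexity analysis under a Lipschitz–Hölder condition on the gradient.

```python
import numpy as np

def fd_block_gradient(f, x, block, delta=1e-6):
    """Forward-difference estimate of the partial gradient of f over
    the coordinates in `block` (illustrative helper, not from the paper)."""
    g = np.zeros(len(block))
    fx = f(x)
    for j, i in enumerate(block):
        e = np.zeros_like(x)
        e[i] = delta
        g[j] = (f(x + e) - fx) / delta
    return g

def cyclic_bcd(f, prox_h, x0, blocks, step=1e-2, n_cycles=200):
    """Cyclic block coordinate proximal gradient loop: sweep the blocks
    in order, updating each one with a proximal gradient step that uses
    the finite-difference gradient above instead of exact derivatives."""
    x = x0.astype(float)
    for _ in range(n_cycles):
        for block in blocks:
            g = fd_block_gradient(f, x, block)
            x[block] = prox_h(x[block] - step * g, step)
    return x

# Toy run: least squares plus an l1 term, whose prox is soft-thresholding.
# (The l1 term is separable; the paper targets nonseparable convex terms,
# but a separable prox keeps this sketch short and runnable.)
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 10)), rng.standard_normal(20), 0.1
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
prox_h = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)
x_hat = cyclic_bcd(f, prox_h, np.zeros(10), [np.arange(0, 5), np.arange(5, 10)])
```

Note that in the paper's setting the convex term need not be separable across blocks, so each block prox would in general depend on the remaining variables; the l1 toy above sidesteps this purely for brevity.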
Keywords:
coordinate descent methods, worst-case evaluation complexity, composite minimization, nonseparable objective function
License
Copyright (c) 2025 The Author(s). Published by Vilnius Gediminas Technical University.
This work is licensed under a Creative Commons Attribution 4.0 International License.