Complementary composite minimization, small gradients in general norms, and applications
Abstract
Composite minimization is a powerful framework in large-scale convex optimization, based on decoupling the objective function into terms with structurally different properties, which allows for more flexible algorithmic design. We introduce a new algorithmic framework for complementary composite minimization, where the objective function decouples into a (weakly) smooth and a uniformly convex term. This particular form of decoupling is pervasive in statistics and machine learning, due to its link to regularization. The main contributions of our work are summarized as follows. First, we introduce the problem of complementary composite minimization in general normed spaces; second, we provide a unified accelerated algorithmic framework to address broad classes of complementary composite minimization problems; and third, we prove that the algorithms resulting from our framework are near-optimal in most of the standard optimization settings. Additionally, we show that our algorithmic framework can be used to address the problem of making the gradients small in general normed spaces. As a concrete example, we obtain a nearly-optimal method for the standard ?
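To make the composite structure concrete, the following is a minimal sketch of minimizing F(x) = f(x) + psi(x), where f is smooth (least squares) and psi is strongly convex (an l2 regularizer, i.e. the degree-2 case of uniform convexity), via plain proximal gradient descent. All problem data and parameter choices here are assumptions for illustration only; this is not the accelerated method proposed in the paper.

```python
import numpy as np

# Illustrative composite objective: F(x) = f(x) + psi(x), with
#   f(x)   = 0.5 * ||Ax - b||^2   (smooth part)
#   psi(x) = (lam/2) * ||x||^2    (strongly convex regularizer)
# A, b, and lam are hypothetical data for this sketch.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.5  # regularization weight (assumed)

def grad_f(x):
    # gradient of the smooth part f
    return A.T @ (A @ x - b)

def prox_psi(x, t):
    # closed-form proximal step for psi(x) = (lam/2) * ||x||^2
    return x / (1.0 + t * lam)

t = 1.0 / np.linalg.norm(A, 2) ** 2  # step size 1/L, L = smoothness of f
x = np.zeros(5)
for _ in range(500):
    # proximal gradient step: gradient on f, prox on psi
    x = prox_psi(x - t * grad_f(x), t)

# sanity check against the closed-form minimizer of the regularized problem
x_star = np.linalg.solve(A.T @ A + lam * np.eye(5), A.T @ b)
print(np.allclose(x, x_star, atol=1e-6))
```

The point of the decoupling is visible in the update: the smooth term is handled by a gradient step, while the uniformly convex term is handled exactly through its proximal operator.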
More information
| Title (WOS): | Complementary composite minimization, small gradients in general norms, and applications |
| Title (SCOPUS): | Complementary composite minimization, small gradients in general norms, and applications |
| Journal: | Mathematical Programming |
| Volume: | 208 |
| Issue: | 1-2 |
| Publisher: | Springer Science and Business Media Deutschland GmbH |
| Publication date: | 2024 |
| Start page: | 319 |
| End page: | 363 |
| Language: | English |
| DOI: | 10.1007/s10107-023-02040-5 |
| Notes: | ISI, SCOPUS |