Abstract
Various problems arising in control and data analysis can be formulated as large-scale convex optimization problems with a composite objective structure. Within the black-box optimization framework, such problems are typically solved using accelerated first-order methods. Celebrated examples of such methods are the Fast Gradient Method and the Accelerated Multistep Gradient Method, both designed using the estimating sequences framework. In this work, we present a new class of estimating sequences, constructed by exploiting a tighter lower bound on the objective function together with the gradient mapping technique. Based on the newly introduced estimating sequences, we construct a new method that is also equipped with an efficient line-search strategy, making it robust to imperfect knowledge of the Lipschitz constant. The proposed method attains an accelerated convergence rate, and our theoretical results are corroborated by numerical experiments on real-world datasets. The experiments also demonstrate that the initialization of the proposed method is robust to imperfect knowledge of the strong convexity parameter of the objective function.
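For context only, the following is a minimal, generic sketch (in Python with NumPy) of an accelerated proximal-gradient scheme with a backtracking line search for a composite objective F(x) = f(x) + g(x), where f is smooth and convex and g admits an easy proximal operator. It illustrates the class of accelerated first-order methods with gradient-mapping steps and line search referred to in the abstract, not the specific algorithm proposed in the paper; all identifiers (accelerated_proximal_gradient, f_smooth, grad_f, prox_g, L0, eta) are illustrative assumptions.

```python
# Generic accelerated proximal-gradient sketch with backtracking line search.
# NOT the method of the paper; a standard FISTA-style illustration only.
import numpy as np

def accelerated_proximal_gradient(f_smooth, grad_f, prox_g, x0,
                                  L0=1.0, eta=2.0, max_iter=100):
    x = y = np.asarray(x0, dtype=float)
    t, L = 1.0, L0
    for _ in range(max_iter):
        g = grad_f(y)
        # Backtracking: grow the local Lipschitz estimate L until the
        # standard quadratic upper bound on f holds at the candidate point,
        # so an exact Lipschitz constant of grad_f is not needed up front.
        while True:
            x_next = prox_g(y - g / L, 1.0 / L)  # gradient-mapping (prox) step
            diff = x_next - y
            if f_smooth(x_next) <= f_smooth(y) + g @ diff + 0.5 * L * diff @ diff:
                break
            L *= eta
        # Nesterov-style momentum (extrapolation) update.
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Example usage on a small lasso problem:
#   f(x) = 0.5 * ||A x - b||^2,  g(x) = lam * ||x||_1 (soft-thresholding prox).
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((50, 20)), rng.standard_normal(50), 0.1
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
gf = lambda x: A.T @ (A @ x - b)
prox = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)
x_hat = accelerated_proximal_gradient(f, gf, prox, np.zeros(20))
```

The backtracking loop adapts a local estimate of the Lipschitz constant on the fly, which is the standard way such methods cope with imperfect knowledge of that constant.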
Original language | English |
---|---|
Title of host publication | 2022 IEEE 61st Conference on Decision and Control (CDC) |
Publisher | IEEE |
Pages | 7516-7521 |
Number of pages | 6 |
ISBN (electronic) | 978-1-6654-6761-2 |
DOI - permanent links | |
Status | Published - 10 Jan 2023 |
OKM publication type | A4 Article in conference proceedings |
Event | IEEE Conference on Decision and Control - Cancun, Mexico. Duration: 6 Dec 2022 → 9 Dec 2022. Conference number: 61 |
Publication series
Name | Proceedings of the IEEE Conference on Decision & Control |
---|---|
ISSN (electronic) | 2576-2370 |
Conference
Conference | IEEE Conference on Decision and Control |
---|---|
Abbreviation | CDC |
Country/Territory | Mexico |
City | Cancun |
Period | 06/12/2022 → 09/12/2022 |