Newtonian Property of Subgradient Method with Optimization of Metric Matrix Parameter Correction: research article

Description

Publication type: journal article

Year of publication: 2024

DOI: 10.3390/math12111618

Keywords: minimization, subgradient method, convergence rate

Abstract: The work proves that, under instability of the second derivatives of the function in the minimization region, the convergence-rate estimate of Newton's method is determined by the parameters of the irreducible part of the problem's degree of conditionality. These parameters express the degree of difference between the eigenvalues of the matrices of second derivatives in the coordinate system where this difference is minimal, and the resulting convergence-rate estimate subsequently serves as a benchmark. The paper studies the convergence rate of the relaxation subgradient method (RSM) with optimization of the parameters of the two-rank correction of metric matrices on smooth strongly convex functions with a Lipschitz gradient, without assuming the existence of second derivatives of the function. The considered RSM is similar in structure to quasi-Newton minimization methods. Unlike the latter, its metric matrix is not an approximation of the inverse matrix of second derivatives; instead, it is adjusted so that one-dimensional minimization along the resulting descent direction takes the method beyond a certain neighborhood of the current minimum. In other words, the metric matrix turns the current gradient into a direction that is gradient-consistent with the set of gradients of some neighborhood of the current minimum. Under broad assumptions on the parameters of the transformations of the metric matrices, an estimate of the convergence rate of the studied RSM and an estimate of its ability to eliminate a removable linear background are obtained. These estimates turn out to be qualitatively similar to the estimates for Newton's method, while the assumption that second derivatives exist is not required. A computational experiment was carried out comparing the quasi-Newton BFGS method and the subgradient method under study on various types of smooth functions. The test results indicate the effectiveness of the subgradient method in minimizing smooth functions with a high degree of conditionality of the problem and its ability to eliminate the linear background that worsens convergence.
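The record reproduces the abstract but none of the paper's formulas. For orientation only, below is a minimal Python sketch of a quasi-Newton-style loop in which a metric matrix H maps the current (sub)gradient to a descent direction and is refreshed by a rank-two correction. Since the paper's actual two-rank transformation and relaxation rules are not given in this record, the update shown is the standard BFGS inverse-Hessian formula used as a placeholder; the function rsm_rank_two and all step parameters are illustrative assumptions, not the authors' method.

    import numpy as np

    def rsm_rank_two(f, subgrad, x0, iters=200, tol=1e-8):
        # Illustrative sketch only: the rank-two update below is the
        # standard BFGS inverse-Hessian formula, standing in for the
        # paper's own two-rank correction, which this record omits.
        x = np.asarray(x0, dtype=float)
        n = x.size
        H = np.eye(n)                      # initial metric matrix
        g = subgrad(x)
        for _ in range(iters):
            if np.linalg.norm(g) < tol:
                break
            d = -H @ g                     # descent direction via the metric matrix
            # Backtracking line search: a crude stand-in for the paper's
            # one-dimensional minimization along d.
            t, fx = 1.0, f(x)
            while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
                t *= 0.5
            x_new = x + t * d
            g_new = subgrad(x_new)
            s, y = x_new - x, g_new - g
            sy = s @ y
            if sy > 1e-12:                 # curvature check before updating H
                rho = 1.0 / sy
                V = np.eye(n) - rho * np.outer(s, y)
                H = V @ H @ V.T + rho * np.outer(s, s)   # rank-two correction
            x, g = x_new, g_new
        return x

    # Usage: an ill-conditioned quadratic, the regime where the abstract
    # reports the method performing well.
    Q = np.diag([1.0, 1e4])
    x_star = rsm_rank_two(lambda x: 0.5 * x @ Q @ x, lambda x: Q @ x,
                          np.array([1.0, 1.0]))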

Full-text links

Publication

Journal: Mathematics

Issue: Vol. 12, No. 11

Article number: 1618

Journal ISSN: 2227-7390

Place of publication: Basel

Publisher: MDPI

Authors

  • Tovbis Elena (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31, Krasnoyarskii Rabochii Prospekt, Krasnoyarsk 660037, Russia)
  • Krutikov Vladimir (Department of Applied Mathematics, Kemerovo State University, 6 Krasnaya Street, Kemerovo 650043, Russia)
  • Kazakovtsev Lev (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31, Krasnoyarskii Rabochii Prospekt, Krasnoyarsk 660037, Russia)

Indexed in databases