Article overview
Title: Stochastic Gradient Methods with Preconditioned Updates
Authors: Abdurakhmon Sadiev; Aleksandr Beznosikov; Abdulla Jasem Almansoori; Dmitry Kamzolov; Rachael Tappenden; Martin Takáč
Date: 1 Jun 2022
Abstract: This work considers non-convex finite-sum minimization. There are a number of algorithms for such problems, but existing methods often work poorly when the problem is badly scaled and/or ill-conditioned, and a primary goal of this work is to introduce methods that alleviate this issue. Thus, here we include a preconditioner based on Hutchinson's approach to approximating the diagonal of the Hessian, and couple it with several gradient-based methods to give new 'scaled' algorithms: Scaled SARAH and Scaled L-SVRG. Theoretical complexity guarantees under smoothness assumptions are presented, and we prove linear convergence when both smoothness and the PL condition are assumed. Because our adaptively scaled methods use approximate partial second-order curvature information, they are better able to mitigate the impact of badly scaled problems, and this improved practical performance is demonstrated in the numerical experiments that are also presented in this work.
Source: arXiv:2206.00285
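The abstract refers to a preconditioner built from Hutchinson's estimator of the Hessian diagonal. The sketch below, not the authors' code, illustrates the general idea under common assumptions: the diagonal is estimated as E[z * (Hz)] for Rademacher probes z, the Hessian-vector product is approximated by a finite difference of gradients, and the gradient step is divided elementwise by a damped absolute diagonal. Function names, the damping constant, and the finite-difference step are illustrative choices.

    import numpy as np

    def hutchinson_diag_estimate(grad_fn, w, num_samples=10, h=1e-4):
        """Estimate diag(Hessian) at w via Hutchinson's method.

        Uses E[z * (H z)] = diag(H) for Rademacher z, with the
        Hessian-vector product approximated by a finite difference
        of gradients: H z ~ (grad(w + h z) - grad(w)) / h.
        """
        g0 = grad_fn(w)
        diag = np.zeros_like(w)
        for _ in range(num_samples):
            z = np.random.choice([-1.0, 1.0], size=w.shape)  # Rademacher probe
            hvp = (grad_fn(w + h * z) - g0) / h              # approximate H z
            diag += z * hvp
        return diag / num_samples

    def scaled_gradient_step(w, grad, diag_hess, lr=0.1, alpha=1e-3):
        """One preconditioned step: scale each coordinate by a damped
        estimate of |diag(H)| so badly scaled coordinates take
        comparably sized steps (alpha keeps the scale positive)."""
        scale = np.abs(diag_hess) + alpha
        return w - lr * grad / scale

In the paper's setting such a scaling would be combined with a variance-reduced gradient estimator (SARAH or L-SVRG) rather than the plain stochastic gradient used here.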