Article overview
Adaptive Sampling Distributed Stochastic Variance Reduced Gradient for Heterogeneous Distributed Datasets
Authors: Ilqar Ramazanli; Han Nguyen; Hai Pham; Sashank Reddi; Barnabas Poczos
Date: 20 Feb 2020
Abstract: We study distributed optimization algorithms for minimizing the average of
heterogeneous functions distributed across several machines, with a focus
on communication efficiency. In such settings, naively using classical
stochastic gradient descent (SGD) or its variants (e.g., SVRG) with uniform
sampling of machines typically yields poor performance: the convergence rate
then depends on the maximum Lipschitz constant of the gradients across the
devices. In this paper, we propose a novel adaptive sampling of machines
specially catered to these settings. Our method relies on an adaptive estimate
of the local Lipschitz constants based on the information of past gradients. We
show that this approach improves the dependence of the convergence rate from the
maximum Lipschitz constant to the average Lipschitz constant across machines,
thereby significantly accelerating convergence. Our experiments demonstrate
that our method indeed speeds up the convergence of the standard SVRG algorithm
in heterogeneous environments.
Source: arXiv, 2002.08528
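The abstract only sketches the method, so the following minimal NumPy sketch illustrates how Lipschitz-weighted machine sampling can be wired into an SVRG loop. It is an illustration under assumptions, not the authors' implementation: the synthetic quadratic objectives, the secant-style Lipschitz estimate, and names such as adaptive_svrg and L_hat are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_machines, dim = 10, 5
# Hypothetical heterogeneous quadratics f_i(x) = 0.5 x^T A_i x - b_i^T x:
# the largest eigenvalue of A_i is machine i's gradient Lipschitz constant,
# and it varies widely across machines.
A = [np.diag(rng.uniform(0.1, 10.0 * (i + 1), dim)) for i in range(n_machines)]
b = [rng.standard_normal(dim) for _ in range(n_machines)]

def grad(i, x):
    """Gradient of the local objective held by machine i."""
    return A[i] @ x - b[i]

def adaptive_svrg(n_epochs=30, inner_steps=50, lr=1e-3):
    x = np.zeros(dim)
    L_hat = np.ones(n_machines)  # running estimates of local Lipschitz constants
    for _ in range(n_epochs):
        x_snap = x.copy()
        full_grad = np.mean([grad(i, x_snap) for i in range(n_machines)], axis=0)
        for _ in range(inner_steps):
            # Adaptive sampling: pick a machine with probability proportional
            # to its estimated Lipschitz constant instead of uniformly.
            p = L_hat / L_hat.sum()
            i = rng.choice(n_machines, p=p)
            g, g_snap = grad(i, x), grad(i, x_snap)
            # Importance weight 1 / (n * p_i) keeps the SVRG update unbiased.
            v = (g - g_snap) / (n_machines * p[i]) + full_grad
            # Refresh L_hat[i] with a secant estimate built from the two
            # gradients just computed ("information of past gradients").
            step = np.linalg.norm(x - x_snap)
            if step > 1e-12:
                L_hat[i] = max(L_hat[i], np.linalg.norm(g - g_snap) / step)
            x = x - lr * v
    return x

x_star = adaptive_svrg()
print("solution estimate:", x_star)
```

The design intuition, as described in the abstract, is that when local Lipschitz constants differ widely, sampling machines proportionally to them makes the variance of the correction term scale with the average rather than the maximum Lipschitz constant, which is the claimed improvement in the convergence rate.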