Article overview
Asymptotically Exact, Embarrassingly Parallel MCMC

Authors: Willie Neiswanger; Chong Wang; Eric Xing
Date: 19 Nov 2013
Source: arXiv, 1311.4780

Abstract: Communication costs, resulting from synchronization requirements during learning, can greatly slow down many parallel machine learning algorithms. In this paper, we present a parallel Markov chain Monte Carlo (MCMC) algorithm in which subsets of data are processed independently, with very little communication. First, we arbitrarily partition data onto multiple machines. Then, on each machine, any classical MCMC method (e.g., Gibbs sampling) may be used to draw samples from a posterior distribution given the data subset. Finally, the samples from each machine are combined to form samples from the full posterior. This embarrassingly parallel algorithm effectively allows each machine to act independently on a subset of the data (without communication) until the final combination stage. We prove that our algorithm generates asymptotically exact samples and empirically demonstrate its effectiveness on several large-scale synthetic and real datasets.
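The abstract's three steps (partition the data, sample each subposterior independently, combine the samples) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a one-dimensional Gaussian model with known variance and a flat prior, and uses a simple Gaussian-approximation combination rule (one of the parametric options discussed in the paper); all function and variable names are my own.

```python
import numpy as np

def metropolis(log_density, init, n_samples=5000, step=0.2, seed=0):
    """Random-walk Metropolis sampler for a one-dimensional target."""
    rng = np.random.default_rng(seed)
    x, lp = init, log_density(init)
    out = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.normal()
        lp_prop = log_density(prop)
        if np.log(rng.random()) < lp_prop - lp:  # accept/reject step
            x, lp = prop, lp_prop
        out[i] = x
    return out[500:]  # discard burn-in

# Toy data: Gaussian observations with unknown mean, known sigma.
rng = np.random.default_rng(42)
true_mu, sigma = 2.0, 1.0
data = rng.normal(true_mu, sigma, size=300)

# Step 1: arbitrarily partition the data across M "machines".
M = 3
shards = np.array_split(data, M)

# Step 2: on each machine, run any classical MCMC method on the
# subposterior (with a flat prior this is just the shard likelihood).
def shard_log_density(shard):
    return lambda mu: -0.5 * np.sum((shard - mu) ** 2) / sigma**2

sub_samples = [
    metropolis(shard_log_density(s), init=float(np.mean(s)), seed=i)
    for i, s in enumerate(shards)
]

# Step 3: combine. Fit a Gaussian to each batch of subposterior
# samples and multiply the densities: precisions add, and the
# combined mean is the precision-weighted average of the means.
prec = np.array([1.0 / np.var(s) for s in sub_samples])
mean = np.array([np.mean(s) for s in sub_samples])
combined_var = 1.0 / prec.sum()
combined_mean = combined_var * (prec * mean).sum()
```

In this conjugate toy setting the combined estimate should match the full-data posterior N(x̄, σ²/n) up to Monte Carlo error; for non-Gaussian posteriors the paper develops nonparametric and semiparametric combination rules that remain asymptotically exact.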