Stats: Members: 3,643 | Articles: 2,487,895 | Articles rated: 2,609
28 March 2024
Article overview
Title: PROPS: Probabilistic personalization of black-box sequence models
Authors: Michael Thomas Wojnowicz; Xuan Zhao
Date: 5 Mar 2019
Abstract: We present PROPS, a lightweight transfer learning mechanism for sequential data. PROPS learns probabilistic perturbations around the predictions of one or more arbitrarily complex, pre-trained black-box models (such as recurrent neural networks). The technique pins the black-box prediction functions to "source nodes" of a hidden Markov model (HMM), and uses the remaining nodes as "perturbation nodes" for learning customized perturbations around those predictions. In this paper, we describe the PROPS model, provide an algorithm for online learning of its parameters, and demonstrate the consistency of this estimation. We also explore the utility of PROPS in the context of personalized language modeling. In particular, we construct a baseline language model by training an LSTM on the entire Wikipedia corpus of 2.5 million articles (around 6.6 billion words), and then use PROPS to provide lightweight customization into a personalized language model of President Donald J. Trump's tweeting. We achieve good customization after only 2,000 additional words, and find that the PROPS model, being fully probabilistic, provides insight into when President Trump's speech departs from generic patterns in the Wikipedia corpus. Python code (for both the PROPS training algorithm and experiment reproducibility) is available at this https URL.
Source: arXiv, 1903.2013
Services: Forum | Review | PDF | Favorites
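The mechanism described in the abstract can be sketched as a small forward-filtering loop: an HMM in which the "source" states emit according to a frozen black-box predictive distribution, while the "perturbation" states carry their own learnable emission distributions. The sketch below is illustrative only and assumes toy stand-ins (a fixed bigram table for the black box, uniform transitions); the names `black_box` and `filter_step` are hypothetical, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
V = 5            # vocabulary size (toy)
K_s, K_p = 1, 2  # number of source and perturbation states
K = K_s + K_p

# Stand-in for a pre-trained black box: a fixed next-token distribution
# conditioned on the previous token (a toy bigram table).
bigram = rng.dirichlet(np.ones(V), size=V)

def black_box(prev_tok):
    """Frozen predictive distribution p(x_t | history) of the black box."""
    return bigram[prev_tok]

# Learnable HMM parameters: transition matrix and perturbation-state
# emissions (initialized randomly; PROPS would update these online).
A = np.full((K, K), 1.0 / K)
B_perturb = rng.dirichlet(np.ones(V), size=K_p)

def filter_step(alpha, prev_tok, tok):
    """One forward-filtering step of the hybrid HMM."""
    emis = np.empty(K)
    emis[:K_s] = black_box(prev_tok)[tok]  # pinned to black-box prediction
    emis[K_s:] = B_perturb[:, tok]         # learned perturbation emissions
    alpha = emis * (A.T @ alpha)
    return alpha / alpha.sum()

# Filter a toy token sequence. Mass on perturbation states indicates
# where the data departs from the black box's generic predictions.
seq = rng.integers(0, V, size=20)
alpha = np.full(K, 1.0 / K)
for prev_tok, tok in zip(seq[:-1], seq[1:]):
    alpha = filter_step(alpha, prev_tok, tok)

print(alpha)  # filtered p(state | tokens so far); sums to 1
```

Because everything stays probabilistic, the filtered state posterior doubles as the "insight" the abstract mentions: when the observed sequence is poorly explained by the black box, posterior mass shifts toward the perturbation states.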
No review found.
Did you like this article?
Note: answers to reviews and questions about the article must be posted in the forum section.
Authors are not allowed to review their own articles; they can use the forum section instead.