Article overview
Factorization of Language Models through Backing-Off Lattices
Author: Wei Wang
Date: 23 May 2003
Subject: Computation and Language (cs.CL)
ACM-class: I.2.7
Source: arXiv, cs.CL/0305041

Abstract: Factorization of a statistical language model is the task of resolving the most discriminative model into factored models and combining them into a new model that provides a better estimate. Most previous work focuses on factorizing models of sequential events, each of which allows only one manner of factorization. To enable parallel factorization, in which a model event can be resolved in more than one way at the same time, we propose a general framework: we adopt a backing-off lattice to reflect parallel factorizations and to define the paths along which a model is resolved into factored models; we use a mixture model to combine parallel paths in the lattice; and we generalize Katz's backing-off method to integrate all the mixture models obtained by traversing the entire lattice. Based on this framework, we formulate two types of model factorization used in natural language modeling.
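The abstract's core idea — resolving one model event along several back-off paths at once and mixing the resulting factored estimates — can be sketched as follows. This is a deliberately simplified illustration, not the paper's actual formulation: the toy counts, the fixed mixture weight `lam`, and the flat discount stand in for the paper's generalized Katz back-off (which would also renormalize the backed-off mass).

```python
from collections import Counter

# Toy corpus counts (illustrative assumptions, not data from the paper).
trigrams = Counter({("a", "b", "c"): 2, ("a", "b", "d"): 1})
bigrams_last = Counter({("b", "c"): 3, ("b", "d"): 2})  # context = nearest word w2
bigrams_skip = Counter({("a", "c"): 2, ("a", "d"): 2})  # context = distant word w1
unigrams = Counter({"c": 5, "d": 4, "a": 3, "b": 3})
total = sum(unigrams.values())

def ml(count, context_total):
    """Maximum-likelihood estimate, guarded against empty contexts."""
    return count / context_total if context_total else 0.0

def p_factored(w, w1, w2, lam=0.5, discount=0.8):
    """Back off through a two-path lattice: the trigram context (w1, w2)
    is resolved in parallel into the factored contexts (w2,) and (w1,),
    and the two paths are combined with a mixture weight lam."""
    tri_ctx = sum(c for (a, b, _), c in trigrams.items() if (a, b) == (w1, w2))
    if trigrams[(w1, w2, w)] > 0:
        # Seen event: discounted ML estimate at the top lattice node.
        return discount * ml(trigrams[(w1, w2, w)], tri_ctx)
    # Unseen event: back off along both parallel paths and mix them.
    p_path1 = ml(bigrams_last[(w2, w)],
                 sum(c for (b, _), c in bigrams_last.items() if b == w2))
    p_path2 = ml(bigrams_skip[(w1, w)],
                 sum(c for (a, _), c in bigrams_skip.items() if a == w1))
    mixed = lam * p_path1 + (1 - lam) * p_path2
    # Bottom of the lattice: fall all the way back to the unigram.
    return mixed if mixed > 0 else unigrams[w] / total
```

With the toy counts above, a seen trigram such as `p_factored("c", "a", "b")` uses the discounted top-node estimate, while an unseen word in the same context is scored by the mixture of the two factored paths.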