Article overview
Relative Divergence Measures and Information Inequalities
Author: Inder Jeet Taneja
Date: 11 May 2005
Subject: Probability (math.PR); MSC-class: 94A17; 26D15
Abstract: Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler's (1951) relative information and Jeffreys's (1946) J-divergence. The information radius, or Jensen difference divergence measure, due to Sibson (1969), is also known in the literature; Burbea and Rao (1982) found further applications for it. Taneja (1995) studied another kind of divergence measure, based on the arithmetic and geometric means. These three divergence measures bear good relationships with one another. There are, however, further measures arising from the J-divergence, JS-divergence and AG-divergence; we call these relative divergence measures, or non-symmetric divergence measures. Our aim here is to obtain bounds on the symmetric and non-symmetric divergence measures in terms of the relative information of type s, using properties of Csiszár's f-divergence.
Source: arXiv, math.PR/0505204
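The divergence measures named in the abstract have standard closed forms for discrete probability distributions. The sketch below is illustrative only: the function names are our own, and the particular form used for the AG (arithmetic-geometric mean) divergence is one common convention, not necessarily the exact definition used in the paper.

```python
import math

def kl(p, q):
    """Kullback-Leibler relative information D(P||Q) = sum p_i log(p_i/q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def j_divergence(p, q):
    """Jeffreys J-divergence: the symmetrised KL, J = D(P||Q) + D(Q||P)."""
    return kl(p, q) + kl(q, p)

def js_divergence(p, q):
    """Jensen-Shannon divergence (Sibson's information radius):
    the average KL distance to the midpoint M = (P+Q)/2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def ag_divergence(p, q):
    """Arithmetic-geometric mean divergence, in one common form:
    T(P,Q) = sum ((p+q)/2) * log( (p+q) / (2*sqrt(p*q)) ).
    This form is an assumption; check the paper for the exact definition."""
    return sum(((pi + qi) / 2) * math.log((pi + qi) / (2 * math.sqrt(pi * qi)))
               for pi, qi in zip(p, q)
               if pi > 0 and qi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl(p, q), j_divergence(p, q), js_divergence(p, q), ag_divergence(p, q))
```

Note the asymmetry the paper turns on: `kl(p, q) != kl(q, p)` in general, whereas the J-, JS- and AG-divergences are symmetric in their arguments, and the JS-divergence is bounded above by log 2.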