Article overview
Bounds On Triangular Discrimination, Harmonic Mean and Symmetric Chi-square Divergences
Author: Inder Jeet Taneja
Date: 12 May 2005
Subject: Probability (math.PR); MSC-class: 94A17; 26D15
Abstract: Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler relative information and Jeffreys' J-divergence. Measures such as the Bhattacharyya distance, Hellinger discrimination, chi-square divergence, triangular discrimination, and harmonic mean divergence are also well known in the statistics literature. In this paper we obtain bounds on triangular discrimination and symmetric chi-square divergence in terms of relative information of type s, using Csiszár's f-divergence. A relationship between triangular discrimination and harmonic mean divergence is also given.
Source: arXiv, math.PR/0505238
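The abstract names several symmetric divergence measures. The sketch below computes three of them from their standard textbook definitions (this is illustrative code, not code from the paper), and checks the identity Δ(P, Q) = 2(1 − M(P, Q)), which links triangular discrimination to the harmonic mean divergence:

```python
# Standard definitions of the symmetric divergences named in the abstract.
# Illustrative sketch only; p and q are assumed to be finite, strictly
# positive probability distributions of equal length.

def triangular_discrimination(p, q):
    """Delta(P, Q) = sum_i (p_i - q_i)^2 / (p_i + q_i)."""
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q))

def harmonic_mean_divergence(p, q):
    """M(P, Q) = sum_i 2 * p_i * q_i / (p_i + q_i)."""
    return sum(2 * pi * qi / (pi + qi) for pi, qi in zip(p, q))

def symmetric_chi_square(p, q):
    """Psi(P, Q) = chi^2(P, Q) + chi^2(Q, P)
                 = sum_i (p_i - q_i)^2 * (p_i + q_i) / (p_i * q_i)."""
    return sum((pi - qi) ** 2 * (pi + qi) / (pi * qi) for pi, qi in zip(p, q))

p, q = [0.5, 0.5], [0.25, 0.75]
# Expanding (p - q)^2 = (p + q)^2 - 4 p q term by term gives
# Delta(P, Q) = 2 * (1 - M(P, Q)) for probability distributions.
print(triangular_discrimination(p, q))           # ~0.1333
print(2 * (1 - harmonic_mean_divergence(p, q)))  # ~0.1333
print(symmetric_chi_square(p, q))                # ~0.5833
```

Note that all three quantities vanish when P = Q and are strictly positive otherwise, which is the baseline property the paper's bounds build on.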