Science-advisor
 

 Article overview


Bounds On Triangular Discrimination, Harmonic Mean and Symmetric Chi-square Divergences
Inder Jeet Taneja
Date: 12 May 2005
Subject: Probability (math.PR); MSC-class: 94A17; 26D15
Abstract: Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler relative information and Jeffreys' J-divergence. Measures such as the Bhattacharyya distance, Hellinger discrimination, chi-square divergence, triangular discrimination, and harmonic mean divergence are also well known in the statistics literature. In this paper we obtain bounds on triangular discrimination and symmetric chi-square divergence in terms of relative information of type s, using Csiszár's f-divergence. A relationship between triangular discrimination and harmonic mean divergence is also given.
Source: arXiv, math.PR/0505238
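For discrete probability distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n), the measures named in the abstract have standard definitions: triangular discrimination Δ(P,Q) = Σ (p_i − q_i)² / (p_i + q_i), harmonic mean divergence W(P,Q) = Σ 2 p_i q_i / (p_i + q_i), and symmetric chi-square divergence Ψ(P,Q) = Σ (p_i − q_i)² (p_i + q_i) / (p_i q_i). A minimal Python sketch of these textbook formulas (illustrative only, not code from the paper); the identity Δ = 2(1 − W), obtained by expanding (p − q)² = (p + q)² − 4pq, is one candidate for the relationship the abstract mentions:

```python
def triangular_discrimination(p, q):
    # Delta(P, Q) = sum_i (p_i - q_i)^2 / (p_i + q_i)
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q))

def harmonic_mean_divergence(p, q):
    # W(P, Q) = sum_i 2 * p_i * q_i / (p_i + q_i)
    return sum(2 * pi * qi / (pi + qi) for pi, qi in zip(p, q))

def symmetric_chi_square(p, q):
    # Psi(P, Q) = sum_i (p_i - q_i)^2 * (p_i + q_i) / (p_i * q_i)
    return sum((pi - qi) ** 2 * (pi + qi) / (pi * qi) for pi, qi in zip(p, q))

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]

# Expanding (p - q)^2 = (p + q)^2 - 4*p*q and using sum(p) = sum(q) = 1
# yields Delta(P, Q) = 2 * (1 - W(P, Q)).
assert abs(triangular_discrimination(p, q)
           - 2 * (1 - harmonic_mean_divergence(p, q))) < 1e-12
```

All three quantities vanish exactly when P = Q and grow as the distributions separate, which is why they serve as symmetric distance-like measures.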
Copyright © 2005-2024 - Scimetrica