Science-advisor

 Article overview



Information theory and learning: a physical approach
Ilya Nemenman
Date: 9 Sep 2000
Subject: Data Analysis, Statistics and Probability; Disordered Systems and Neural Networks; Learning; Adaptation and Self-Organizing Systems | physics.data-an cond-mat.dis-nn cs.LG nlin.AO
Abstract: We try to establish a unified information theoretic approach to learning and to explore some of its applications. First, we define "predictive information" as the mutual information between the past and the future of a time series, discuss its behavior as a function of the length of the series, and explain how other quantities of interest studied previously in learning theory - as well as in dynamical systems and statistical mechanics - emerge from this universally definable concept. We then prove that predictive information provides the unique measure for the complexity of dynamics underlying the time series and show that there are classes of models characterized by power-law growth of the predictive information that are qualitatively more complex than any of the systems that have been investigated before. Further, we investigate numerically the learning of a nonparametric probability density, which is an example of a problem with power-law complexity, and show that the proper Bayesian formulation of this problem provides for the 'Occam' factors that punish overly complex models and thus allow one to learn not only a solution within a specific model class, but also the class itself using the data only and with very few a priori assumptions. We study a possible information theoretic method that regularizes the learning of an undersampled discrete variable, and show that learning in such a setup goes through stages of very different complexities. Finally, we discuss how all of these ideas may be useful in various problems in physics, statistics, and, most importantly, biology.
Source: arXiv, physics/0009032
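For concreteness, the central quantity the abstract defines can be written out explicitly. The following is a minimal sketch of the standard mutual-information form; the notation is illustrative rather than taken from the article: x_past denotes an observed segment of the series of length T, and x_future the segment that follows it.

\[
  I_{\mathrm{pred}}(T) \;=\; I\big(x_{\mathrm{past}};\, x_{\mathrm{future}}\big)
  \;=\; \left\langle \log_2 \frac{P(x_{\mathrm{past}},\, x_{\mathrm{future}})}{P(x_{\mathrm{past}})\, P(x_{\mathrm{future}})} \right\rangle ,
\]

where the average is taken over the joint distribution of past and future, and the logarithm base 2 gives the result in bits. Read this way, the "power-law growth of the predictive information" mentioned in the abstract refers to I_pred(T) growing as a power of the observed history length T as the series gets longer, as opposed to, for example, saturating at a finite value, which is what happens for processes with only short-range temporal structure.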