Listing 1 - 5 of 5
Information and Complexity in Statistical Modeling
Author: Jorma Rissanen
ISBN: 1281140643 9786611140649 0387688129 0387366105 1441922679 Year: 2007 Publisher: New York, NY: Springer


Abstract

No statistical model is "true" or "false," "right" or "wrong"; models simply have varying performance, which can be assessed. The main theme of this book is to teach modeling based on the principle that the objective is to extract from the data the information that can be learned with the suggested classes of probability models. The intuitive and fundamental concepts of complexity, learnable information, and noise are formalized, which provides a firm information-theoretic foundation for statistical modeling. Inspired by Kolmogorov's structure function in the algorithmic theory of complexity, this is accomplished by finding the shortest code length, called the stochastic complexity, with which the data can be encoded when advantage is taken of the models in a suggested class; this amounts to the MDL (Minimum Description Length) principle. The complexity, in turn, breaks up into the shortest code length for the optimal model in a set of models that can be optimally distinguished from the given data, and the rest, which defines "noise" as the incompressible part of the data, carrying no useful information. Such a view of the modeling problem permits a unified treatment of any type of parameters, their number, and even their structure. Since only optimally distinguished models are worthy of testing, we get a logically sound and straightforward treatment of hypothesis testing, in which for the first time the confidence in the test result can be assessed. Although the prerequisites include only basic probability calculus and statistics, a moderate level of mathematical proficiency would be beneficial. This different and logically unassailable view of statistical modeling should provide excellent grounds for further research and suggest topics for graduate students in all fields of modern engineering, including, but not restricted to, signal and image processing, bioinformatics, pattern recognition, and machine learning.

The author is an Honorary Doctor and Professor Emeritus of the Technical University of Tampere, Finland, a Fellow of the Helsinki Institute for Information Technology, and a visiting Professor at the Computer Learning Research Center of Royal Holloway, University of London. He is a Foreign Member of Finland's Academy of Science and Letters, and an Associate Editor of the IMA Journal of Mathematical Control and Information and of the EURASIP Journal on Bioinformatics and Systems Biology, as well as a former Associate Editor for Source Coding of the IEEE Transactions on Information Theory. He is the recipient of the IEEE Information Theory Society's 1993 Richard W. Hamming Medal for fundamental contributions to information theory, statistical inference, control theory, and the theory of complexity; the Information Theory Society's 1998 Golden Jubilee Award for Technological Innovation, for inventing Arithmetic Coding; and the 2006 Kolmogorov Medal from the University of London. He has also received an IBM Corporate Award for the MDL and PMDL principles (1991) and two best-paper awards.
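To make the code-length idea concrete, here is a minimal sketch in Python (illustrative names of my own, not the book's construction), selecting a polynomial degree by a two-part code length and using the standard (k/2) log2 n asymptotic parameter cost in place of the exact stochastic complexity:

import numpy as np

def two_part_code_length(x, y, degree):
    """Approximate code length, in bits, for fitting a polynomial of
    the given degree to (x, y) and encoding the residuals.
    Uses the common two-part approximation L(D, M) = L(D | M) + L(M),
    not the book's exact stochastic complexity."""
    n = len(x)
    coeffs = np.polyfit(x, y, degree)            # maximum-likelihood fit
    residuals = y - np.polyval(coeffs, x)
    sigma2 = max(np.mean(residuals**2), 1e-12)   # ML noise-variance estimate
    # L(D | M): Gaussian negative log-likelihood of the residuals, in bits.
    data_bits = 0.5 * n * np.log2(2 * np.pi * np.e * sigma2)
    # L(M): (k/2) log2 n bits for k = degree + 2 real parameters
    # (the coefficients plus the noise variance).
    model_bits = 0.5 * (degree + 2) * np.log2(n)
    return data_bits + model_bits

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 200)
y = 1.0 - 2.0 * x + 0.5 * x**2 + rng.normal(0, 0.1, x.size)

# The degree with the shortest total code length is the MDL choice.
best = min(range(8), key=lambda d: two_part_code_length(x, y, d))
print("MDL-selected degree:", best)

On data generated from a quadratic with modest noise, the shortest total code length is attained at degree 2: richer models gain too little likelihood to pay for their extra parameter bits.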


Book
Stochastic complexity in statistical inquiry
Author: Jorma Rissanen
ISBN: 1283739194 9812385495 9789812385499 9789971508593 9971508591 Year: 1989 Volume: 15 Publisher: Singapore; Teaneck, N.J.: World Scientific


Abstract

This book describes how model selection and statistical inference can be founded on the shortest code length for the observed data, called the stochastic complexity. This generalization of the algorithmic complexity not only offers an objective view of statistics, where no prejudiced assumptions of "true" data-generating distributions are needed, but it also in one stroke leads to calculable expressions in a range of situations of practical interest and links very closely with mainstream statistical theory. The search for the smallest stochastic complexity extends the classical maximum likelihood …
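In symbols (a rough sketch, not the book's exact notation): relative to a class of models f(x^n | θ) with a prior π on the parameters, the stochastic complexity of a data sequence x^n is the code length

\[
I(x^n) \;=\; -\log \int f(x^n \mid \theta)\,\pi(\theta)\,d\theta ,
\]

which for a smooth k-parameter family behaves asymptotically like the two-part form

\[
I(x^n) \;\approx\; -\log f\bigl(x^n \mid \hat\theta(x^n)\bigr) \;+\; \frac{k}{2}\log n ,
\]

so that minimizing it extends plain maximum likelihood by charging an explicit price for each parameter.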


Book
Optimal estimation of parameters
Author: Jorma Rissanen
ISBN: 9781139518505 113951850X 9780511791635 0511791631 1280773960 9781280773969 9781139516648 1139516647 9781107004740 1107004748 1107227194 9786613684738 1139517570 1139514997 1139514075 Year: 2012 Publisher: Cambridge: Cambridge University Press


Abstract

This book presents a comprehensive and consistent theory of estimation. The framework described leads naturally to a generalized maximum capacity estimator. This approach allows the optimal estimation of real-valued parameters, their number and intervals, as well as providing common ground for explaining the power of these estimators. Beginning with a review of coding and the key properties of information, the author goes on to discuss the techniques of estimation and develops the generalized maximum capacity estimator, based on a new form of Shannon's mutual information and channel capacity. Applications of this powerful technique in hypothesis testing and denoising are described in detail. Offering an original and thought-provoking perspective on estimation theory, Jorma Rissanen's book is of interest to graduate students and researchers in the fields of information theory, probability and statistics, econometrics and finance.
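For orientation, the standard Shannon definitions that the construction builds on (the book develops generalized forms of both) are, for a channel p(y | x), the mutual information and the channel capacity

\[
I(X;Y) \;=\; \sum_{x,\,y} p(x)\,p(y \mid x)\,\log \frac{p(y \mid x)}{p(y)},
\qquad
C \;=\; \max_{p(x)} I(X;Y),
\]

and, roughly speaking, the generalized maximum capacity estimator of the abstract is obtained by maximizing an analogous capacity.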


Digital
Information and Complexity in Statistical Modeling
Author: Jorma Rissanen
ISBN: 9780387688121 Year: 2007 Publisher: New York, NY: Springer Science+Business Media, LLC


Abstract


Book
Information and Complexity in Statistical Modeling
Author: Jorma Rissanen
ISBN: 9780387688121 Year: 2007 Publisher: New York, NY: Springer


Abstract

