New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

New Developments in Statistical Information Theory Based on Entropy and Divergence Measures PDF Author: Leandro Pardo
Publisher: MDPI
ISBN: 3038979368
Category : Social Science
Languages : en
Pages : 344


Book Description
This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald’s statistics, likelihood ratio statistics and Rao’s score statistics, share several optimal asymptotic properties, but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald test statistic, for testing simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.
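The robustness idea behind minimum divergence estimation can be illustrated with a small sketch. This is not the book's own construction: it uses the density power divergence of Basu et al. to estimate a normal location parameter when gross outliers are present, and the grid search and the choice alpha = 0.5 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# 95 clean N(0, 1) observations plus 5 gross outliers at 10
data = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 10.0)])

def dpd_objective(mu, x, alpha=0.5):
    # Density power divergence objective for N(mu, 1); the term
    # \int f^{1+alpha} does not depend on mu, so it is dropped.
    f_alpha = np.exp(-alpha * (x - mu) ** 2 / 2) / (2 * np.pi) ** (alpha / 2)
    return -(1 + 1 / alpha) * f_alpha.mean()

grid = np.linspace(-2, 12, 2801)                     # step 0.005
mu_mle = data.mean()                                 # MLE = sample mean, pulled toward the outliers
mu_dpd = grid[np.argmin([dpd_objective(m, data) for m in grid])]
print(f"MLE: {mu_mle:.2f}, minimum-DPD (alpha=0.5): {mu_dpd:.2f}")
```

The exponential downweighting of points far from the fitted center is what makes the minimum divergence estimate stay near the bulk of the data while the maximum likelihood estimate is dragged toward the contamination.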

New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

New Developments in Statistical Information Theory Based on Entropy and Divergence Measures PDF Author: Leandro Pardo
Publisher:
ISBN: 9783038979371
Category : Social sciences (General)
Languages : en
Pages : 344


Book Description
This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics and Rao's score statistics, share several optimal asymptotic properties, but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald test statistic, for testing simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.

Statistical Inference Based on Divergence Measures

Statistical Inference Based on Divergence Measures PDF Author: Leandro Pardo
Publisher: CRC Press
ISBN: 1420034812
Category : Mathematics
Languages : en
Pages : 513


Book Description
The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach.

Concepts and Recent Advances in Generalized Information Measures and Statistics

Concepts and Recent Advances in Generalized Information Measures and Statistics PDF Author: Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado
Publisher: Bentham Science Publishers
ISBN: 1608057607
Category : Science
Languages : en
Pages : 432


Book Description
Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms and statistical complexities have proven to be useful in diverse scientific research fields. This book contains introductory tutorials suitable for the general reader, together with chapters dedicated to the basic concepts of the most frequently employed information measures or quantifiers and their recent applications to different areas, including physics, biology, medicine, economics, communication and social sciences. As these quantifiers are powerful tools for the study of general time and data series independently of their sources, this book will be useful to all those doing research connected with information analysis. The tutorials in this volume are written at a broadly accessible level and readers will have the opportunity to acquire the knowledge necessary to use the information theory tools in their field of interest.
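The basic quantifier the tutorials start from fits in a few lines. A minimal sketch (the function name is mine), assuming a discrete distribution given as a list of probabilities:

```python
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i log p_i of a discrete distribution."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p_i = 0 contribute nothing (0 log 0 is taken as 0).
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin → 1.0 bit
print(shannon_entropy([0.25] * 4))   # uniform over 4 outcomes → 2.0 bits
```

Entropy is maximal for the uniform distribution and zero for a degenerate one, which is why it works as a quantifier of uncertainty for data series regardless of their source.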

Statistical Inference Based on Divergence Measures

Statistical Inference Based on Divergence Measures PDF Author: Leandro Pardo
Publisher: Chapman and Hall/CRC
ISBN: 9781584886006
Category : Mathematics
Languages : en
Pages : 512


Book Description
The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, presenting the interesting possibility of introducing alternative test statistics to classical ones like Wald, Rao, and likelihood ratio. Each chapter concludes with exercises that clarify the theoretical results and present additional results that complement the main discussions. Clear, comprehensive, and logically developed, this book offers a unique opportunity to gain not only a new perspective on some standard statistics problems, but also the tools to put it into practice.
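The phi-divergence test statistics mentioned in the description include, as a special case, the well-known Cressie-Read power-divergence family. A short sketch (function and variable names are mine) showing that lambda = 1 recovers Pearson's X² and lambda → 0 the likelihood-ratio G²:

```python
import math

def power_divergence(observed, expected, lam):
    """Cressie-Read power-divergence goodness-of-fit statistic.

    lam = 1 gives Pearson's X^2; lam -> 0 gives the likelihood-ratio G^2.
    """
    if abs(lam) < 1e-12:  # limiting case: G^2
        return 2 * sum(o * math.log(o / e)
                       for o, e in zip(observed, expected) if o > 0)
    return (2 / (lam * (lam + 1))) * sum(
        o * ((o / e) ** lam - 1) for o, e in zip(observed, expected)
    )

obs = [30, 14, 34, 45, 27]        # observed cell counts
exp = [30.0] * 5                  # expected counts under a uniform null
pearson = sum((o - e) ** 2 / e for o, e in zip(obs, exp))
print(power_divergence(obs, exp, 1.0), pearson)   # identical values
print(power_divergence(obs, exp, 0.0))            # G^2 statistic
print(power_divergence(obs, exp, 2 / 3))          # Cressie-Read's recommended lam
```

All members of the family share the same chi-square limiting distribution under the null, which is what makes them drop-in alternatives to the classical statistics.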

Entropy and Information Theory

Entropy and Information Theory PDF Author: Robert M. Gray
Publisher: Springer Science & Business Media
ISBN: 1441979700
Category : Technology & Engineering
Languages : en
Pages : 430


Book Description
This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New in this edition: expanded treatment of stationary or sliding-block codes and their relations to traditional block codes; expanded discussion of results from ergodic theory relevant to information theory; expanded treatment of B-processes (processes formed by stationary coding of memoryless sources); new material on trading off information and distortion, including the Marton inequality; new material on the properties of optimal and asymptotically optimal source codes; and new material on the relationships of source coding and rate-constrained simulation or modeling of random processes. Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.

Information Theory and Statistics

Information Theory and Statistics PDF Author: Solomon Kullback
Publisher: Courier Corporation
ISBN: 0486142043
Category : Mathematics
Languages : en
Pages : 436


Book Description
Highly useful text studying logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems, as well as references, a glossary, and an appendix. Second, revised edition (1968).

Divergence Measures

Divergence Measures PDF Author: Igal Sason
Publisher: MDPI AG
ISBN: 9783036543321
Category :
Languages : en
Pages : 256


Book Description
Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled "Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems", includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that the readers will find interest in this Special Issue, which will stimulate further research in the study of the mathematical foundations and applications of divergence measures.
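As a small illustration of the first of the two generalizations named in the description, here is a minimal sketch (the function name is mine) of the Rényi divergence for discrete distributions with full support, with alpha → 1 recovering the relative entropy:

```python
import math

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(P||Q) for discrete P, Q with full support."""
    if abs(alpha - 1.0) < 1e-12:
        # alpha -> 1 limit: relative entropy (Kullback-Leibler divergence)
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

p = [0.6, 0.3, 0.1]
q = [1 / 3] * 3
kl = renyi_divergence(p, q, 1.0)
print(kl)                               # relative entropy D(P||Q)
print(renyi_divergence(p, q, 0.999))    # close to the KL value
print(renyi_divergence(p, q, 2.0))      # >= KL: D_alpha is nondecreasing in alpha
```

The monotonicity in alpha checked above is one of the standard properties of the Rényi family; f-divergences generalize the relative entropy along a different axis, by replacing the logarithm with a general convex function f.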

Advances in Imaging and Electron Physics

Advances in Imaging and Electron Physics PDF Author:
Publisher: Academic Press
ISBN: 0080577571
Category : Technology & Engineering
Languages : en
Pages : 313



Handbook of Pattern Recognition and Computer Vision (5th Edition)

Handbook of Pattern Recognition and Computer Vision (5th Edition) PDF Author: Chi-hau Chen
Publisher: World Scientific
ISBN: 9814656534
Category : Computers
Languages : en
Pages : 582


Book Description
The book provides an up-to-date and authoritative treatment of pattern recognition and computer vision, with chapters written by leaders in the field. On basic methods in pattern recognition and computer vision, topics range from statistical pattern recognition to array grammars, projective geometry, skeletonization, and shape and texture measures. Recognition applications include character recognition and document analysis, detection in digital mammograms, remote sensing image fusion, and analysis of functional magnetic resonance imaging data.