Statistical Inference Based on Divergence Measures: Statistics: A Series of Textbooks and Monographs

Author: Leandro Pardo
English Language Paperback – 30 June 2020
The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, although divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach.

Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, presenting the interesting possibility of introducing alternative test statistics to classical ones like Wald, Rao, and likelihood ratio. Each chapter concludes with exercises that clarify the theoretical results and present additional results that complement the main discussions.
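
The power-divergence (Cressie-Read) family is a convenient entry point to the phi-divergence statistics mentioned above. The following minimal Python sketch (not taken from the book; the counts and expected proportions are hypothetical) uses scipy.stats.power_divergence to show how Pearson's chi-square and the likelihood ratio statistic arise as particular members of the family.

```python
# Illustrative sketch only: goodness-of-fit statistics from the
# Cressie-Read power-divergence family, a special case of the
# phi-divergence statistics discussed in the book. Pearson's chi-square
# (lambda = 1) and the likelihood ratio statistic (lambda -> 0) are
# recovered as particular members of the family.
import numpy as np
from scipy.stats import power_divergence

observed = np.array([30, 45, 15, 10])                    # hypothetical multinomial counts
expected = np.array([0.25, 0.40, 0.20, 0.15]) * observed.sum()  # counts under the null

for lam, name in [(1.0, "Pearson chi-square"),
                  (0.0, "likelihood ratio"),
                  (2 / 3, "Cressie-Read")]:
    stat, pval = power_divergence(observed, f_exp=expected, lambda_=lam)
    print(f"{name:20s} statistic = {stat:.3f}, p-value = {pval:.3f}")
```

All three statistics share the same asymptotic chi-square distribution under the null hypothesis, which is why members of the family can be used interchangeably as goodness-of-fit tests.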

Clear, comprehensive, and logically developed, this book offers a unique opportunity to gain not only a new perspective on some standard statistics problems, but also the tools to put it into practice.

From the series Statistics: A Series of Textbooks and Monographs

Price: 310.65 lei

Old price: 356.62 lei
-13% New

Express Points: 466

Estimated price in foreign currency:
59.46€ 61.97$ 49.50£

Print-on-demand book

Economy delivery 04-18 January 25

Order line: 021 569.72.76

Specifications

ISBN-13: 9780367578015
ISBN-10: 0367578018
Pages: 512
Dimensions: 156 x 234 mm
Weight: 0.45 kg
Edition: 1
Publisher: CRC Press
Collection: Chapman and Hall/CRC
Series: Statistics: A Series of Textbooks and Monographs


Target audience

Professional

Contents

Divergence Measures: Definition and Properties. Entropy as a Measure of Diversity: Sampling Distributions. Goodness-of-Fit Based on Phi-Divergence Statistics: Simple Null Hypothesis. Optimality of Phi-Divergence Test Statistics in Goodness-of-Fit. Minimum Phi-Divergence Estimators. Goodness-of-Fit Based on Phi-Divergence Statistics: Composite Null Hypothesis. Testing Loglinear Models Using Phi-Divergence Test Statistics. Phi-Divergence Measures in Contingency Tables. Testing in General Populations. References.

Reviews

"There are a number of measures of divergence between distributions. Describing them properly requires a very mathematically well-written book, which the author here provides … This book is a fine course text, and is beautifully produced. There are about four hundred references. Recommended."
– ISI Short Book Reviews
". . . suitable for a beginning graduate course on information theory based on statistical inference. This book will be a useful and important addition to the resources of practitioners and many others engaged in information theory and statistics. Overall, this is an impressive book on information theory based statistical inference."
– Prasanna Sahoo, in Zentralblatt Math, 2008, Vol. 1120

Description

Organized in a systematic way, Statistical Inference Based on Divergence Measures presents classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence, with applications to multinomial and general populations. On the basis of divergence measures, this book introduces minimum phi-divergence estimators.