Information and Complexity in Statistical Modeling (Information Science and Statistics)
Author: Jorma Rissanen | Language: English | Hardback – 25 Jan 2007
Such a view of the modeling problem permits a unified treatment of any type of parameters, their number, and even their structure. Since only optimally distinguished models are worthy of testing, we get a logically sound and straightforward treatment of hypothesis testing, in which for the first time the confidence in the test result can be assessed. Although the prerequisites include only basic probability calculus and statistics, a moderate level of mathematical proficiency is beneficial. This different and logically unassailable view of statistical modeling should provide excellent grounds for further research and suggest topics for graduate students in all fields of modern engineering, including, but not restricted to, signal and image processing, bioinformatics, pattern recognition, and machine learning.
All formats and editions | Price | Delivery
---|---|---
Paperback (Springer, 23 Nov 2010) | 372.24 lei | 43-57 days
Hardback (Springer, 25 Jan 2007) | 378.83 lei | 43-57 days
Price: 378.83 lei
New
Express points: 568
Estimated price in other currencies: 72.50€ • 75.31$ • 60.22£
Printed on demand
Economy delivery: 03-17 February 25
Orders: 021 569.72.76
Specifications
ISBN-13: 9780387366104
ISBN-10: 0387366105
Pages: 142
Illustrations: VIII, 142 p.
Dimensions: 155 x 235 x 14 mm
Weight: 0.38 kg
Edition: 2007
Publisher: Springer
Collection: Springer
Series: Information Science and Statistics
Place of publication: New York, NY, United States
Target audience
Professional/practitioner

Contents
- Information and Coding
- Shannon-Wiener Information
- Coding of Random Processes
- Statistical Modeling
- Kolmogorov Complexity
- Stochastic Complexity
- Structure Function
- Optimally Distinguishable Models
- The MDL Principle
- Applications
Reviews
From the reviews:
"Readership: Graduate students and researchers in statistics, computer science and engineering, anyone interested in statistical modelling. This book presents a personal introduction to statistical modelling based on the principle that the objective of modelling is to extract learnable information from data with suggested classes of probability models. It grew from lectures to doctoral students … and retains much of the economical style of a lecture series. … Therefore, this fascinating volume offers an excellent source of important statistical research problems calling for solution." (Erkki P. Liski, International Statistical Review, Vol. 75 (2), 2007)
"This book covers the minimum description length (MDL) principle … . For statistics beginners, this book is self-contained. The writing style is concise … . Overall, this is an authoritative source on MDL and a good reference book. Most statisticians would be fortunate to have a copy in their bookshelves." (Thomas C. M. Lee, Journal of the American Statistical Association, Vol. 103 (483), September, 2008)
"This book describes the latest developments of the MDL principle. … The book … is intended to serve as a readable introduction to the mathematical aspects of the MDL principle when applied to statistical modeling for graduate students in statistics and information sciences. … Overall, this interesting book will make an important contribution to the field of statistical modeling through the MDL principle." (Prasanna Sahoo, Zentralblatt Math, Vol. 1156, 2009)
Back cover text
No statistical model is "true" or "false," "right" or "wrong"; the models just have varying performance, which can be assessed. The main theme in this book is to teach modeling based on the principle that the objective is to extract the information from data that can be learned with suggested classes of probability models. The intuitive and fundamental concepts of complexity, learnable information, and noise are formalized, which provides a firm information theoretic foundation for statistical modeling. Inspired by Kolmogorov's structure function in the algorithmic theory of complexity, this is accomplished by finding the shortest code length, called the stochastic complexity, with which the data can be encoded when advantage is taken of the models in a suggested class, which amounts to the MDL (Minimum Description Length) principle. The complexity, in turn, breaks up into the shortest code length for the optimal model in a set of models that can be optimally distinguished from the given data and the rest, which defines "noise" as the incompressible part in the data without useful information.
Such a view of the modeling problem permits a unified treatment of any type of parameters, their number, and even their structure. Since only optimally distinguished models are worthy of testing, we get a logically sound and straightforward treatment of hypothesis testing, in which for the first time the confidence in the test result can be assessed. Although the prerequisites include only basic probability calculus and statistics, a moderate level of mathematical proficiency is beneficial. This different and logically unassailable view of statistical modeling should provide excellent grounds for further research and suggest topics for graduate students in all fields of modern engineering, including, but not restricted to, signal and image processing, bioinformatics, pattern recognition, and machine learning.
The author is an Honorary Doctor and Professor Emeritus of the Technical University of Tampere, Finland, a Fellow of the Helsinki Institute for Information Technology, and a visiting Professor in the Computer Learning Research Center of the University of London, Holloway, England. He is also a Foreign Member of Finland's Academy of Science and Letters and an Associate Editor of the IMA Journal of Mathematical Control and Information and of the EURASIP Journal on Bioinformatics and Systems Biology. He is a former Associate Editor for Source Coding of the IEEE Transactions on Information Theory.
The author is the recipient of the IEEE Information Theory Society's 1993 Richard W. Hamming Medal for fundamental contributions to information theory, statistical inference, control theory, and the theory of complexity; the Information Theory Society's 1998 Golden Jubilee Award for Technological Innovation, for inventing arithmetic coding; and the 2006 Kolmogorov Medal from the University of London. He has also received an IBM Corporate Award for the MDL and PMDL principles in 1991, and two best paper awards.
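The two-part decomposition sketched on the back cover, where the total code length splits into the cost of describing a model and the cost of describing the data given that model, can be illustrated with a simplified example. The sketch below uses the classic asymptotic approximation (a BIC-style parameter cost of (k/2) log2 n bits plus a Gaussian residual cost) to pick a polynomial degree; it is a crude stand-in, not the refined NML-based stochastic complexity developed in the book, and the data and function names are illustrative only.

```python
import numpy as np

def two_part_mdl(x, y, max_degree=6):
    """Crude two-part MDL model selection for polynomial regression.

    Code length (in bits) is approximated as
      L(data | model) ~ (n/2) * log2(RSS / n)   (Gaussian residuals)
      L(model)        ~ (k/2) * log2(n)         (k real parameters)
    Returns the degree minimizing the total description length.
    """
    n = len(x)
    best = None
    for degree in range(max_degree + 1):
        k = degree + 1                       # number of coefficients
        coeffs = np.polyfit(x, y, degree)
        rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
        data_bits = 0.5 * n * np.log2(max(rss / n, 1e-12))
        model_bits = 0.5 * k * np.log2(n)
        total = data_bits + model_bits
        if best is None or total < best[1]:
            best = (degree, total)
    return best

# Noisy quadratic data: the parameter cost penalizes higher degrees,
# so the two-part code length is minimized at degree 2.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.1, size=x.size)
degree, bits = two_part_mdl(x, y)
```

The example shows the trade-off the back cover describes: a richer model always shrinks the residual term, but only pays off when the saving exceeds the extra (k/2) log2 n bits needed to describe it.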
Features
The author is a distinguished scientist in information theory and statistical modeling