Learning with the Minimum Description Length Principle
Author: Kenji Yamanishi · In English · Hardback – 15 Sep 2023
Price: 802.10 lei
Previous price: 1002.62 lei
-20% · New
Express points: 1203
Estimated price in foreign currency:
153.52€ • 160.01$ • 127.80£
Printed to order
Economy delivery: 7–21 January 2025
Specifications
ISBN-13: 9789819917891
ISBN-10: 9819917891
Pages: 339
Illustrations: XX, 339 p., 51 illus., 48 illus. in color.
Dimensions: 155 x 235 mm
Weight: 0.68 kg
Edition: 1st ed. 2023
Publisher: Springer Nature Singapore
Series: Springer
Place of publication: Singapore, Singapore
Contents
Information and Coding
Parameter Estimation
Model Selection
Latent Variable Model Selection
Sequential Prediction
MDL Change Detection
Continuous Model Selection
Extension of Stochastic Complexity
Mathematical Preliminaries
About the author
Kenji Yamanishi is a Professor at the Graduate School of Information Science and Technology, the University of Tokyo, Japan. After completing his master's degree at the Graduate School of the University of Tokyo, he joined NEC Corporation in 1987. He received his doctorate in Engineering from the University of Tokyo in 1992 and joined the university's faculty in 2009. His research interests and contributions are in the theory of the minimum description length principle, information-theoretic learning theory, and data science applications such as anomaly detection and text mining.
From the back cover
This book introduces readers to the minimum description length (MDL) principle and its applications in learning. The MDL is a fundamental principle for inductive inference, used in many applications including statistical modeling, pattern recognition and machine learning. At its core, the MDL is based on the premise that “the shortest code length leads to the best strategy for learning anything from data.” The MDL provides a broad and unifying view of statistical inference tasks such as estimation, prediction and testing, as well as machine learning.
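As a concrete illustration of this premise (a standard two-part formulation of MDL model selection, not a quotation from the book), the principle selects, among candidate models M for data D, the one minimizing the total description length

\hat{M} = \arg\min_{M \in \mathcal{M}} \left\{ L(M) + L(D \mid M) \right\},

where L(M) is the code length needed to describe the model itself and L(D \mid M) is the code length of the data when encoded with the help of that model, so that a shorter total code balances model complexity against fit to the data.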
The content covers the theoretical foundations of the MDL and broad practical areas such as detecting changes and anomalies, problems involving latent variable models, and high-dimensional statistical inference, among others. The book offers an easy-to-follow guide to the MDL principle, comparing it with other information criteria and explaining the differences between their standpoints.
Written in a systematic, concise and comprehensive style, this book is suitable for researchers and graduate students of machine learning, statistics, information theory and computer science.
Features
Introduces readers to a modern theory of the minimum description length (MDL) principle
Includes rich examples of MDL applications to machine learning and data science
Written by a pioneer of information-theoretic learning theory