
Information Theory and Statistical Learning

Edited by Frank Emmert-Streib, Matthias Dehmer
In English | Paperback – 4 Nov 2010
"Information Theory and Statistical Learning" presents theoretical and practical results about information theoretic methods used in the context of statistical learning.
The book presents a comprehensive overview of the wide range of methods that have been developed in a multitude of contexts. Each chapter is written by an expert in the field. The book is intended for an interdisciplinary readership working in machine learning, applied statistics, artificial intelligence, biostatistics, computational biology, bioinformatics, web mining, or related disciplines.
Advance Praise for "Information Theory and Statistical Learning":
"A new epoch has arrived for information sciences to integrate various disciplines such as information theory, machine learning, statistical inference, data mining, model selection etc. I am enthusiastic about recommending the present book to researchers and students, because it summarizes most of these new emerging subjects and methods, which are otherwise scattered in many places." Shun-ichi Amari, RIKEN Brain Science Institute, Professor-Emeritus at the University of Tokyo

All formats and editions

- Paperback (1): Springer US – 4 Nov 2010, 629.36 lei, delivery in 6-8 weeks
- Hardback (1): Springer US – 14 Nov 2008, 635.70 lei, delivery in 6-8 weeks

Price: 629.36 lei

Old price: 740.43 lei
-15% New

Express points: 944

Estimated price in foreign currency:
120.45€  127.07$  100.38£

Print-on-demand title

Economy delivery 03-17 January 25

Phone orders: 021 569.72.76

Specifications

ISBN-13: 9781441946508
ISBN-10: 1441946500
Pages: 452
Illustrations: X, 439 p.
Dimensions: 155 x 235 x 24 mm
Weight: 0.63 kg
Edition: Softcover reprint of hardcover 1st ed. 2009
Publisher: Springer US
Series: Springer
Place of publication: New York, NY, United States

Target audience

Research

Contents

Algorithmic Probability: Theory and Applications
Model Selection and Testing by the MDL Principle
Normalized Information Distance
The Application of Data Compression-Based Distances to Biological Sequences
MIC: Mutual Information Based Hierarchical Clustering
A Hybrid Genetic Algorithm for Feature Selection Based on Mutual Information
Information Approach to Blind Source Separation and Deconvolution
Causality in Time Series: Its Detection and Quantification by Means of Information Theory
Information Theoretic Learning and Kernel Methods
Information-Theoretic Causal Power
Information Flows in Complex Networks
Models of Information Processing in the Sensorimotor Loop
Information Divergence Geometry and the Application to Statistical Machine Learning
Model Selection and Information Criterion
Extreme Physical Information as a Principle of Universal Stability
Entropy and Cloning Methods for Combinatorial Optimization, Sampling and Counting Using the Gibbs Sampler


Features

- Combines information theory and statistical learning components in one volume
- Many chapters are contributed by authors who pioneered the presented methods themselves
- Interdisciplinary approach makes this book accessible to researchers and professionals in many areas of study