Estimation of Mutual Information: Behaviormetrics: Quantitative Approaches to Human Behavior, Book 25
Author: Joe Suzuki. Language: English. Hardback – 11 Feb 2025
Price: 405.88 lei
Old price: 551.30 lei
-26% New
Express Points: 609
Estimated price in other currencies:
77.68€ • 80.75$ • 64.35£
Not yet published
Specifications
ISBN-13: 9789811307331
ISBN-10: 9811307334
Pages: 120
Illustrations: X, 120 p. 60 illus., 20 illus. in color.
Dimensions: 155 x 235 mm
Edition: 1st ed. 2024
Publisher: Springer Nature Singapore
Collection: Springer
Series: Behaviormetrics: Quantitative Approaches to Human Behavior
Place of publication: Singapore, Singapore
Contents
- Chapter 1 Introduction
- Chapter 2 Estimation of Mutual Information for Discrete Variables
- Chapter 3 Estimation of Mutual Information for Continuous Variables
- Chapter 4 Estimation of Mutual Information for High-dimensional Variables
- Chapter 5 Application to Causal Discovery: LiNGAM and ICA
- Chapter 6 Concluding Remarks
Biographical note
Joe Suzuki, Osaka University
Back cover text
This book presents the mutual information (MI) estimation methods recently proposed by the author and published in a number of major journals. It includes two types of applications: learning a forest structure from multivariate data and identifying independent variables (independent component analysis). MI between a pair of random variables is defined mathematically in information theory. It measures how dependent the two variables are, takes nonnegative values, and is zero if and only if the variables are independent. Knowing the value of MI between two variables is often necessary in machine learning, statistical data analysis, and various sciences, including physics, psychology, and economics. However, the true value of MI is not available and can only be estimated from data. The essential difference between this work and other estimation approaches is that both consistency and independence testing are proved for the estimators proposed by the author: an estimator satisfies consistency when its value converges to the true MI as the sample size grows, and it satisfies independence testing when its value is zero with probability one, as the sample size grows, for independent variables. Thus far, no other MI estimators have satisfied both properties at once.
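To make the consistency and independence-testing discussion above concrete, here is a minimal sketch, in Python rather than the R/BNSL code the book itself provides, of the naive plug-in MI estimate for two discrete variables; the function name plugin_mi and the simulated data are illustrative assumptions, not material from the book. It shows why independence testing is the hard part: even when X and Y are generated independently, the plug-in estimate is almost always strictly positive.

```python
# Illustrative sketch only (not the book's estimator): plug-in estimate of
# mutual information I(X;Y) for discrete variables, computed from empirical
# joint and marginal frequencies.
import numpy as np

def plugin_mi(x, y):
    """Plug-in estimate of I(X;Y) in nats from paired discrete samples."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for a in np.unique(x):
        for b in np.unique(y):
            p_xy = np.mean((x == a) & (y == b))  # empirical joint probability
            p_x = np.mean(x == a)                # empirical marginal of X
            p_y = np.mean(y == b)                # empirical marginal of Y
            if p_xy > 0:
                mi += p_xy * np.log(p_xy / (p_x * p_y))
    return mi

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=1000)  # X and Y drawn independently
y = rng.integers(0, 2, size=1000)
print(plugin_mi(x, y))  # nonnegative and almost always > 0, despite independence
```

An estimator with the independence-testing property described above would instead return exactly zero, with probability one, for such independent data as the sample size grows.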
Features
- Provides a theory that is unique and specific rather than standard and generic, and describes several cases, such as discrete and continuous variables, in a unified manner
- Contains full proofs, choosing the simplest and most comprehensive ones
- Includes R code and the R package BNSL for understanding the theory