Theory of Information and its Value
Author: Ruslan L. Stratonovich. Edited by Roman V. Belavkin, Panos M. Pardalos, Jose C. Principe. In English. Hardback – 15 Jan 2020
Each chapter begins with basic, fundamental ideas, supported by clear examples; the material then advances to great detail and depth. The reader is not required to be familiar with the more difficult and specific material. Rather, the treasure trove of examples of stochastic processes and problems makes this book accessible to a wide readership of researchers, postgraduates, and undergraduate students in mathematics, engineering, physics and computer science who are specializing in information theory, data analysis, or machine learning.
| All formats and editions | Price | Delivery |
|---|---|---|
| Paperback (1), Springer International Publishing, 26 Aug 2021 | 737.55 lei | 6-8 weeks |
| Hardback (1), Springer International Publishing, 15 Jan 2020 | 743.74 lei | 6-8 weeks |
Price: 743.74 lei
Old price: 929.67 lei
-20% New
Express points: 1116
Estimated price in foreign currency: 142.34€ • 150.16$ • 118.62£
Printed on demand
Economy delivery: 02-16 January 25
Phone orders: 021 569.72.76
Specifications
ISBN-13: 9783030228323
ISBN-10: 3030228320
Pages: 415
Illustrations: XXII, 419 p. 33 illus., 4 illus. in color.
Dimensions: 155 x 235 mm
Weight: 0.79 kg
Edition: 1st ed. 2020
Publisher: Springer International Publishing
Series: Springer
Place of publication: Cham, Switzerland
Contents
Foreword.- Preface.- 1 Definition of information and entropy in the absence of noise.- 2 Encoding of discrete information in the absence of noise and penalties.- 3 Encoding in the presence of penalties. The first variational problem.- 4 The first asymptotic theorem and related results.- 5 Computation of entropy for special cases. Entropy of stochastic processes.- 6 Information in the presence of noise. Shannon's amount of information.- 7 Message transmission in the presence of noise. The second asymptotic theorem and its various formulations.- 8 Channel capacity. Important particular cases of channels.- 9 Definition of the value of information.- 10 The value of Shannon information for the most important Bayesian systems.- 11 Asymptotical results related to the value of information. The third asymptotic theorem.- 12 Information theory and the second law of thermodynamics.- Appendix: Some matrix (operator) identities.- Index.
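As a small illustration of the book's starting point (Chapter 1, entropy in the absence of noise), here is a minimal sketch of the Shannon entropy H = -Σ p·log₂p of a discrete distribution; the function name and structure are illustrative, not taken from the book:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits, for a discrete distribution."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # Terms with p == 0 contribute nothing (p * log p -> 0 as p -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit; a certain outcome carries 0 bits.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
print(shannon_entropy([1.0]))       # → 0.0
```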
Reviews
“The book could be useful in advanced graduate courses with students, who are not afraid of integrals and probabilities.” (Jaak Henno, zbMATH 1454.94002, 2021)
Biographical note
Back cover text
This English version of Ruslan L. Stratonovich’s Theory of Information (1975) builds on the original theory and provides methods, techniques, and concepts for critical applications. By unifying theories of information, optimization, and statistical physics, the value of information theory has gained recognition in data science, machine learning, and artificial intelligence. With the emergence of a data-driven economy, progress in machine learning and artificial intelligence algorithms, and increased computational resources, understanding information has become essential. This book is even more relevant today than when it was first published in 1975. It extends the classic work of R.L. Stratonovich, one of the original developers of the symmetrized version of stochastic calculus and filtering theory, to name just two topics.
Each chapter begins with basic, fundamental ideas, supported by clear examples; the material then advances to great detail and depth. The reader is not required to be familiar with the more difficult and specific material. Rather, the treasure trove of examples of stochastic processes and problems makes this book accessible to a wide readership of researchers, postgraduates, and undergraduate students in mathematics, engineering, physics and computer science who are specializing in information theory, data analysis, or machine learning.
Features
- Broadens understanding of information theory and the value of information
- English translation of Ruslan L. Stratonovich’s original "Theory of Information"
- Unifies theories of information, optimization, and statistical physics
- Supplies opportunities to practice techniques through unique examples