
Information Theoretic Principles for Agent Learning: Synthesis Lectures on Engineering, Science, and Technology

Author: Jerry D. Gibson
Language: English · Hardback – 25 Sep 2024
This book provides readers with the fundamentals of information theoretic techniques for statistical data science analyses and for characterizing the behavior and performance of a learning agent outside of the standard results on communications and compression fundamental limits. Readers will benefit from the presentation of information theoretic quantities, definitions, and results that provide or could provide insights into data science and learning.

From the series Synthesis Lectures on Engineering, Science, and Technology

Price: 386.68 lei

New

Express Points: 580

Estimated price in other currencies:
€73.10 / $76.82 / £61.70

In stock

Economy delivery: 01–15 March

Order line: 021 569.72.76

Specifications

ISBN-13: 9783031653872
ISBN-10: 3031653874
Pages: 150
Illustrations: approx. 150 p.
Dimensions: 168 x 240 mm
Weight: 0.34 kg
Edition: 2024
Publisher: Springer Nature Switzerland
Collection: Springer
Series: Synthesis Lectures on Engineering, Science, and Technology

Place of publication: Cham, Switzerland

Contents

  • Background and Overview
  • Entropy and Mutual Information
  • Differential Entropy, Entropy Rate, and Maximum Entropy
  • Typical Sequences and the AEP
  • Markov Chains and Cascaded Systems
  • Hypothesis Testing, Estimation, Information, and Sufficient Statistics
  • Information Theoretic Quantities and Learning
  • Estimation and Entropy Power
  • Time Series Analyses
  • Information Bottleneck Principle
  • Channel Capacity
  • Rate Distortion Theory
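As an illustrative aside (not taken from the book), the opening topics in the contents, entropy and mutual information, can be sketched numerically. The snippet below uses hypothetical helper names and computes I(X;Y) = H(X) + H(Y) - H(X,Y) for a small joint distribution:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector (0 log 0 taken as 0)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]              # marginal of X
    py = [sum(col) for col in zip(*joint)]        # marginal of Y
    hxy = entropy([x for row in joint for x in row])
    return entropy(px) + entropy(py) - hxy

# Perfectly correlated fair bits: X determines Y, so I(X;Y) = H(X) = 1 bit.
correlated = [[0.5, 0.0], [0.0, 0.5]]
# Independent fair bits: knowing X says nothing about Y, so I(X;Y) = 0 bits.
independent = [[0.25, 0.25], [0.25, 0.25]]
print(mutual_information(correlated))   # → 1.0
print(mutual_information(independent))  # → 0.0
```

This is only a flavor of the quantities the early chapters define; the book develops them formally.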

About the author

Jerry D. Gibson is Professor of Electrical and Computer Engineering at the University of California, Santa Barbara. He has been an Associate Editor of the IEEE Transactions on Communications and the IEEE Transactions on Information Theory, and he was an IEEE Communications Society Distinguished Lecturer for 2007–2008. He is an IEEE Fellow and has received the Frederick Emmons Terman Award (1990), the 1993 IEEE Signal Processing Society Senior Paper Award, the 2009 IEEE Technical Committee on Wireless Communications Recognition Award, and the 2010 Best Paper Award from the IEEE Transactions on Multimedia. He is the author, coauthor, and editor of several books, the most recent of which are The Mobile Communications Handbook (Editor, 3rd ed., 2012), Rate Distortion Bounds for Voice and Video (Coauthor with Jing Hu, NOW Publishers, 2014), and Information Theory and Rate Distortion Theory for Communications and Compression (Morgan & Claypool, 2014). His research interests are lossy source coding, wireless communications and networks, and digital signal processing.

Back cover text

This book provides readers with the fundamentals of information theoretic techniques for statistical data science analyses and for characterizing the behavior and performance of a learning agent outside of the standard results on communications and compression fundamental limits. Readers will benefit from the presentation of information theoretic quantities, definitions, and results that provide or could provide insights into data science and learning.
In addition, this book:
  • Describes the fundamentals of information theoretic techniques for statistical data science analyses
  • Provides succinct introductions to key topics, with references as needed for further technical depth
  • Enables readers from varying backgrounds to understand the behavior and performance of a learning agent

Features

  • Describes the fundamentals of information theoretic techniques for statistical data science analyses
  • Provides succinct introductions to key topics, with references as needed for further technical depth
  • Enables readers from varying backgrounds to understand the behavior and performance of a learning agent