
Information-Theoretic Methods in Data Science

Edited by Miguel R. D. Rodrigues, Yonina C. Eldar
English Hardback – 7 Apr 2021
Learn about the state-of-the-art at the interface between information theory and data science with this first unified treatment of the subject. Written by leading experts in a clear, tutorial style, and using consistent notation and definitions throughout, it shows how information-theoretic methods are being used in data acquisition, data representation, data analysis, and statistics and machine learning. Coverage is broad, with chapters on signal acquisition, data compression, compressive sensing, data communication, representation learning, emerging topics in statistics, and much more. Each chapter includes a topic overview, definition of the key problems, emerging and open problems, and an extensive reference list, allowing readers to develop in-depth knowledge and understanding. Providing a thorough survey of the current research area and cutting-edge trends, this is essential reading for graduate students and researchers working in information theory, signal processing, machine learning, and statistics.

Price: 65024 lei

Old price: 73061 lei
-11% New

Express Points: 975

Estimated price in other currencies:
12444€ 12926$ 10337£

In stock

Economy delivery: 13-27 January 25


Specifications

ISBN-13: 9781108427135
ISBN-10: 1108427138
Pages: 560
Illustrations: 74 b/w illus.
Dimensions: 176 x 250 x 34 mm
Weight: 1.04 kg
Publisher: Cambridge University Press
Collection: Cambridge University Press
Place of publication: Cambridge, United Kingdom

Contents

1. Introduction – Miguel Rodrigues, Stark Draper, Waheed Bajwa and Yonina Eldar
2. An information-theoretic approach to analog-to-digital compression – Alon Kipnis, Yonina Eldar and Andrea Goldsmith
3. Compressed sensing via compression codes – Shirin Jalali and Vincent Poor
4. Information-theoretic bounds on sketching – Mert Pilanci
5. Sample complexity bounds for dictionary learning from vector- and tensor-valued data – Zahra Shakeri, Anand Sarwate and Waheed Bajwa
6. Uncertainty relations and sparse signal recovery – Erwin Riegler and Helmut Bölcskei
7. Understanding phase transitions via mutual information and MMSE – Galen Reeves and Henry Pfister
8. Computing choice: learning distributions over permutations – Devavrat Shah
9. Universal clustering – Ravi Raman and Lav Varshney
10. Information-theoretic stability and generalization – Maxim Raginsky, Alexander Rakhlin and Aolin Xu
11. Information bottleneck and representation learning – Pablo Piantanida and Leonardo Rey Vega
12. Fundamental limits in model selection for modern data analysis – Jie Ding, Yuhong Yang and Vahid Tarokh
13. Statistical problems with planted structures: information-theoretic and computational limits – Yihong Wu and Jiaming Xu
14. Distributed statistical inference with compressed data – Wenwen Zhao and Lifeng Lai
15. Network functional compression – Soheil Feizi and Muriel Médard
16. An introductory guide to Fano's inequality with applications in statistical estimation – Jonathan Scarlett and Volkan Cevher

Description

The first unified treatment of the interface between information theory and emerging topics in data science.