Information Theory: Three Theorems by Claude Shannon (UNITEXT, Book 144)
Author: Antoine Chambert-Loir · Language: English · Paperback – 16 March 2023
The first chapter studies the entropy of a discrete random variable and related notions. The second chapter, on compression and error correction, introduces the concept of coding, proves the existence of optimal codes and good codes (Shannon's first theorem), and shows how information can be transmitted in the presence of noise (Shannon's second theorem). The third chapter proves the sampling theorem (Shannon's third theorem) and looks at its connections with other results, such as the Poisson summation formula. Finally, there is a discussion of the uncertainty principle in information theory.
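To make the description concrete, here is a minimal Python sketch, not taken from the book, of the entropy of a discrete random variable and of the source-coding bound behind Shannon's first theorem: the expected length L of an optimal binary prefix code satisfies H(X) <= L < H(X) + 1. The example distribution p and the Huffman construction are illustrative choices made for this listing, not the book's exposition.

```python
# A minimal sketch (not from the book) of the entropy of a discrete random
# variable and of Shannon's source-coding bound H(X) <= L < H(X) + 1, where
# L is the expected length of an optimal binary prefix code.
import heapq
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def huffman_lengths(p):
    """Codeword lengths of a Huffman code for the distribution p."""
    # Each heap item: (probability, tie-breaker, list of symbol indices).
    heap = [(pi, i, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    counter = len(p)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # merging two subtrees adds one bit
            lengths[i] += 1        # to every symbol they contain
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

p = [0.5, 0.25, 0.125, 0.125]      # an example source distribution (assumed)
L = sum(pi * li for pi, li in zip(p, huffman_lengths(p)))
print(f"H(X) = {entropy(p):.3f} bits, optimal expected length L = {L:.3f}")
# For this dyadic distribution the Huffman code is exactly optimal: L = H(X) = 1.75.
```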
Featuring a good supply of exercises (with solutions) and an introductory chapter covering the prerequisites, this text stems from lectures given to beginning graduate students in mathematics and computer science.
Price: 389.45 lei
Old price: 486.80 lei
-20% New
Express points: 584
Estimated price in foreign currency:
74.54€ • 78.63$ • 62.11£
Print-on-demand title
Economy delivery: 30 December 2024 - 4 January 2025
Phone orders: 021 569.72.76
Specifications
ISBN-13: 9783031215605
ISBN-10: 3031215605
Pages: 209
Illustrations: XII, 209 p., 1 illus.
Dimensions: 155 x 235 mm
Weight: 0.43 kg
Edition: 1st ed. 2022
Publisher: Springer International Publishing
Collection: Springer
Series: UNITEXT, La Matematica per il 3+2
Place of publication: Cham, Switzerland
Contents
Elements of Theory of Probability.- Entropy and Mutual Information.- Coding.- Sampling.- Solutions to Exercises.- Bibliography.- Notation.- Index.
Reviews
“This book can be especially useful for those who are just getting to know the basics of information theory.” (Eszter Gselmann, zbMATH 1526.94001, 2024)
Biographical note
Antoine Chambert-Loir is a professor of mathematics at Université Paris Cité. His research addresses questions in algebraic geometry which are motivated by number-theoretic problems. He is the author of two books published by Springer-Verlag: A Field Guide to Algebra, an introduction to Galois theory; and (Mostly) Commutative Algebra, an intermediate-level exposition of commutative algebra. With J. Nicaise and J. Sebag, he cowrote the research monograph Motivic Integration (published by Birkhäuser), which was awarded the 2017 Ferran Sunyer i Balaguer prize.
Back cover text
This book provides an introduction to information theory, focussing on Shannon’s three foundational theorems of 1948–1949. Shannon’s first two theorems, based on the notion of entropy in probability theory, specify the extent to which a message can be compressed for fast transmission and how to erase errors associated with poor transmission. The third theorem, using Fourier theory, ensures that a signal can be reconstructed from a sufficiently fine sampling of it. These three theorems constitute the roadmap of the book.
The first chapter studies the entropy of a discrete random variable and related notions. The second chapter, on compression and error correction, introduces the concept of coding, proves the existence of optimal codes and good codes (Shannon's first theorem), and shows how information can be transmitted in the presence of noise (Shannon's second theorem). The third chapter proves the sampling theorem (Shannon's third theorem) and looks at its connections with other results, such as the Poisson summation formula. Finally, there is a discussion of the uncertainty principle in information theory.
Featuring a good supply of exercises (with solutions) and an introductory chapter covering the prerequisites, this text stems from lectures given to beginning graduate students in mathematics and computer science.
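The sampling theorem mentioned in the description can also be illustrated numerically. The short Python sketch below, again not taken from the book, reconstructs a band-limited test signal from its samples by Whittaker-Shannon (sinc) interpolation; the band limit B, the test signal f, and the truncation of the infinite sum are assumptions made here purely for illustration.

```python
# A minimal numerical sketch (not from the book) of Shannon's sampling theorem:
# a signal band-limited to B hertz is determined by its samples taken every
# T = 1/(2B) seconds and can be rebuilt by Whittaker-Shannon (sinc) interpolation.
import math

B = 3.0                       # assumed band limit in Hz
T = 1.0 / (2.0 * B)           # Nyquist sampling period

def f(t):
    """Band-limited test signal: components at 1 Hz and 2.5 Hz, both below B."""
    return math.sin(2 * math.pi * 1.0 * t) + 0.5 * math.cos(2 * math.pi * 2.5 * t)

def sinc(x):
    """Normalized sinc function sin(pi x) / (pi x)."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(t, n_max=5000):
    """Truncated Shannon sampling series: sum over n of f(nT) * sinc(t/T - n)."""
    return sum(f(n * T) * sinc(t / T - n) for n in range(-n_max, n_max + 1))

for t in (0.123, 0.4, 0.77):
    print(f"t = {t}: f(t) = {f(t):+.4f}, reconstruction = {reconstruct(t):+.4f}")
# The truncated sum approximates f(t) closely here; the theorem's exact identity
# uses the full sum over all integers n.
```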
Features
- Provides an introduction to information theory, fundamental to the digitized world
- Accessible to mathematics and computer science undergraduates
- Includes numerous exercises, with solutions