
Entropy and Information Theory

Author: Robert M. Gray
Language: English | Hardback – 3 Feb 2011
This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, along with information and distortion measures and their properties.
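As a quick illustration of the central quantity the book studies, here is a minimal sketch (not from the book; function name is illustrative) computing the empirical Shannon entropy of a discrete source:

```python
import math
from collections import Counter

def entropy_bits(samples):
    """Empirical Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin achieves the maximum of 1 bit per symbol;
# a biased coin carries less information per symbol.
print(entropy_bits("HTHT" * 250))           # ~1.0 bit
print(entropy_bits("H" * 900 + "T" * 100))  # ~0.469 bits
```

The source and channel coding theorems characterize when such entropy (and rate-distortion) quantities equal the operationally achievable compression and transmission rates.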
New in this edition:
  • Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
  • Expanded discussion of results from ergodic theory relevant to information theory
  • Expanded treatment of B-processes, i.e., processes formed by stationary coding of memoryless sources
  • New material on trading off information and distortion, including the Marton inequality
  • New material on the properties of optimal and asymptotically optimal source codes
  • New material on the relationships of source coding and rate-constrained simulation or modeling of random processes
Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotic mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.
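A stationary (sliding-block) code, one of the distinguishing topics above, applies a fixed shift-invariant rule to a window of the input rather than segmenting it into blocks; applying one to a memoryless source yields a B-process. A minimal sketch, with illustrative names not taken from the book:

```python
import random

def sliding_block_code(source, window, f):
    """Stationary (sliding-block) code: output[n] = f(source[n:n+window]).
    The rule is shift-invariant, so (unlike a block code) a stationary
    input process produces a stationary output process."""
    return [f(source[i:i + window]) for i in range(len(source) - window + 1)]

random.seed(0)
# A memoryless (i.i.d. Bernoulli) source ...
bits = [random.randint(0, 1) for _ in range(20)]
# ... passed through a window-3 majority rule: the output is a
# sample path of a B-process.
coded = sliding_block_code(bits, 3, lambda w: int(sum(w) >= 2))
print(coded)
```

Block codes, by contrast, reset at block boundaries and generally destroy stationarity; the book develops the coding theorems in a form that covers both structures.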

All formats and editions

Format | Publisher / Date | Price | Delivery
Paperback | Springer US – 18 Sep 2014 | 1055.67 lei | 6–8 weeks
Hardback | Springer US – 3 Feb 2011 | 1061.50 lei | 6–8 weeks

Price: 1061.50 lei

Old price: 1294.50 lei (-18%)

Express points: 1592

Estimated price in other currencies:
€203.28 | $219.99 | £169.46

Printed on demand

Economy delivery: 9–23 December


Specifications

ISBN-13: 9781441979698
ISBN-10: 1441979697
Pages: 385
Illustrations: XXVII, 409 p.
Dimensions: 155 x 235 x 32 mm
Weight: 0.79 kg
Edition: 2nd ed. 2011
Publisher: Springer US
Series: Springer
Place of publication: New York, NY, United States

Target audience

Graduate

Contents

  • Preface
  • Introduction
  • Information Sources
  • Pair Processes: Channels, Codes, and Couplings
  • Entropy
  • The Entropy Ergodic Theorem
  • Distortion and Approximation
  • Distortion and Entropy
  • Relative Entropy
  • Information Rates
  • Distortion vs. Rate
  • Relative Entropy Rates
  • Ergodic Theorems for Densities
  • Source Coding Theorems
  • Coding for Noisy Channels
  • Bibliography
  • References
  • Index

Reviews

From the book reviews:
“This book is the second edition of the classic 1990 text … and inherits much of the structure and all of the virtues of the original. … this is a deep and important book, which would reward further study as the focus of a reading group or graduate course, and comes enthusiastically recommended.” (Oliver Johnson, Mathematical Reviews, October, 2014)
“In Entropy and Information Theory Robert Gray offers an excellent text to stimulate research in this field. … Entropy and Information Theory is highly recommended as essential reading to academics and researchers in the field, especially to engineers interested in the mathematical aspects and mathematicians interested in the engineering applications. … it will contribute to further synergy between the two fields and the deepening of research efforts.” (Ina Fourie, Online Information Review, Vol. 36 (3), 2012)
“The book offers interesting and very important information about the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The main goal is a general development of Shannon’s mathematical theory of communication for single-user systems. … The author manages to balance the practice with the theory, every chapter is very well structured and has high-value content.” (Nicolae Constantinescu, Zentralblatt MATH, Vol. 1216, 2011)

About the author

Robert M. Gray is the Alcatel-Lucent Technologies Professor of Communications and Networking in the School of Engineering and Professor of Electrical Engineering at Stanford University. For over four decades he has done research, taught, and published in the areas of information theory and statistical signal processing. He is a Fellow of the IEEE and the Institute for Mathematical Statistics. He has won several professional awards, including a Guggenheim Fellowship, the Society Award and Education Award of the IEEE Signal Processing Society, the Claude E. Shannon Award from the IEEE Information Theory Society, the Jack S. Kilby Signal Processing Medal, Centennial Medal, and Third Millennium Medal from the IEEE, and a Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring (PAESMEM). He is a member of the National Academy of Engineering.

Features

  • New edition of a classic text
  • Important engineering applications of performance bounds and code design for communication systems
  • Distinguished author
  • Includes supplementary material: sn.pub/extras