Handbook of Differential Entropy

Authors: Joseph Victor Michalowicz, Jonathan M. Nichols, Frank Bucholtz
Language: English | Hardback – 14 Nov 2013
A central problem in communications theory is determining the ultimate data compression achievable, which is measured using the concept of entropy. While differential entropy may seem to be a simple extension of the discrete case, it is a more complex measure that often requires more careful treatment.
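As a brief illustration (not drawn from the handbook itself), the well-known closed-form differential entropy of a Gaussian, h(X) = ½ ln(2πeσ²) nats, shows one way the continuous measure differs from discrete entropy: it can be negative when the distribution is sufficiently concentrated.

```python
import math

def gaussian_differential_entropy(sigma):
    """Closed-form differential entropy (in nats) of a N(mu, sigma^2)
    random variable: 0.5 * ln(2 * pi * e * sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

# Unlike discrete Shannon entropy, differential entropy can be negative:
print(gaussian_differential_entropy(1.0))  # ~1.4189 nats
print(gaussian_differential_entropy(0.1))  # ~-0.8837 nats (negative)
```

Note that the sign and value depend on the variance, which is exactly the dependence the handbook tabulates across probability models.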
Handbook of Differential Entropy provides a comprehensive introduction to the subject for researchers and students in information theory. Unlike related books, this one brings together background material, derivations, and applications of differential entropy.
The handbook first reviews the probability theory needed to understand entropy, the core building block of the subject. The authors then carefully explain the concept of entropy, introducing both discrete and differential entropy. They present detailed derivations of differential entropy for numerous probability models and discuss challenges with interpreting and deriving differential entropy. They also show how differential entropy varies as a function of the model variance.
Focusing on the application of differential entropy in several areas, the book describes common estimators of parametric and nonparametric differential entropy as well as properties of the estimators. It then uses the estimated differential entropy to estimate radar pulse delays when the corrupting noise source is non-Gaussian and to develop measures of coupling between dynamical system components.
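To make the estimation idea concrete, here is a minimal sketch (an illustrative plug-in histogram estimator, not one of the book's specific estimators): the density in each bin is approximated as p_i / width, giving h ≈ −Σ p_i ln(p_i / width).

```python
import math
import random

def histogram_entropy(samples, bins=50):
    """Naive plug-in estimate of differential entropy (in nats) from
    samples, using a uniform-width histogram: h ~ -sum p_i*ln(p_i/width)."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in samples:
        # Clamp the top edge so x == hi falls in the last bin.
        i = min(int((x - lo) / width), bins - 1)
        counts[i] += 1
    n = len(samples)
    h = 0.0
    for c in counts:
        if c:
            p = c / n
            h -= p * math.log(p / width)
    return h

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(100000)]
# For a standard Gaussian the estimate should approach
# 0.5 * ln(2 * pi * e) ~ 1.419 nats:
print(histogram_entropy(data))
```

More refined nonparametric estimators (e.g., nearest-neighbor methods) reduce the bias of this simple binning approach; the trade-offs among such estimators are the kind of material the book surveys.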

Price: 589.49 lei

Old price: 858.45 lei
-31% New

Express points: 884

Estimated price in foreign currency:
112.82€ 117.19$ 93.71£

Printed on demand

Economy delivery: 03–17 February 25

Order line: 021 569.72.76

Specifications

ISBN-13: 9781466583160
ISBN-10: 1466583169
Pages: 244
Illustrations: 89 b/w images and 10 tables
Dimensions: 156 x 234 x 18 mm
Weight: 0.48 kg
Edition: New
Publisher: CRC Press
Series: Chapman and Hall/CRC

Contents

Probability in Brief. The Concept of Entropy. Entropy for Discrete Probability Distributions. Differential Entropies for Probability Distributions. Differential Entropy as a Function of Variance. Applications of Differential Entropy. Appendices. Bibliography. Index.

About the Authors

Joseph V. Michalowicz is a consultant with Sotera Defense Solutions. He retired from the U.S. Naval Research Laboratory as head of the Sensor and Data Processing Section in the Optical Sciences Division. He has published extensively in the areas of mathematical modeling, probability and statistics, signal detection, multispectral infrared sensors, and category theory. He received a Ph.D. in mathematics with a minor in electrical engineering from the Catholic University of America.
Jonathan M. Nichols is a member of the Maritime Sensing Section in the Optical Sciences Division at the U.S. Naval Research Laboratory. His research interests include signal and image processing, parameter estimation, and the modeling and analysis of infrared imaging devices. He received a Ph.D. in mechanical engineering from Duke University.
Frank Bucholtz is head of the Advanced Photonics Section at the U.S. Naval Research Laboratory. He has published in the areas of microwave signal processing and microwave photonics, fiber optic sensors, micro-optical devices, nonlinear dynamics and chaos, hyperspectral imaging systems, and information theory. His current research focuses on optical components for digital communications. He received a Ph.D. in physics from Brown University.

Description

Unlike related books, this handbook brings together background material, derivations, and applications of differential entropy. The book first reviews the probability theory needed to understand entropy, the core building block of the subject. The authors then carefully explain both discrete and differential entropy. They present detailed derivations of differential entropy for numerous probability models, discuss challenges with interpreting and deriving differential entropy, and show how differential entropy varies as a function of the model variance. They also explore the application of differential entropy in several areas.