Information Measures: Information and its Description in Science and Engineering (Signals and Communication Technology)
Author: Christoph Arndt · Language: English · Paperback, 5 Nov 2003
| All formats and editions | Price | Delivery |
|---|---|---|
| Paperback (1) - Springer Berlin, Heidelberg, 5 Nov 2003 | 700.36 lei | 6-8 weeks |
| Hardback (1) - Springer Berlin, Heidelberg, 27 Mar 2001 | 947.51 lei | 6-8 weeks |
Price: 700.36 lei
Old price: 823.96 lei
-15% New
Express Points: 1051
Estimated price in other currencies:
134.03€ • 139.64$ • 111.44£
Print-on-demand title
Economy delivery: 10-24 February 25
Phone orders: 021 569.72.76
Specifications
ISBN-13: 9783540408550
ISBN-10: 354040855X
Pages: 572
Illustrations: XIX, 548 p.
Dimensions: 155 x 235 x 30 mm
Weight: 0.85 kg
Edition: Softcover reprint of the original 1st ed. 2001
Publisher: Springer Berlin, Heidelberg
Collection: Springer
Series: Signals and Communication Technology
Place of publication: Berlin, Heidelberg, Germany
Target audience: Research

Contents
- Abstract. Structure and Structuring
- 1 Introduction: Science and information; Man as control loop; Information, complexity and typical sequences; Concepts of information; Information, its technical dimension and the meaning of a message; Information as a central concept
- 2 Basic considerations: 2.1 Formal derivation of information; 2.2 Application of the information measure (Shannon's information); 2.3 The law of Weber and Fechner; 2.4 Information of discrete random variables
- 3 Historic development of information theory: 3.1 Development of information transmission; 3.2 Development of information functions
- 4 The concept of entropy in physics: The laws of thermodynamics; 4.1 Macroscopic entropy; 4.2 Statistical entropy; 4.3 Dynamic entropy
- 5 Extension of Shannon's information: 5.1 Rényi's information 1960; 5.2 Another generalized entropy (logical expansion); 5.3 Gain of information via conditional probabilities; 5.4 Other entropy or information measures
- 6 Generalized entropy measures: 6.1 The corresponding measures of divergence; 6.2 Weighted entropies and expectation values of entropies
- 7 Information functions and gaussian distributions: 7.1 Rényi's information of a gaussian distributed random variable; 7.2 Shannon's information
- 8 Shannon's information of discrete probability distributions: 8.1 Continuous and discrete random variables; 8.2 Shannon's information of a gaussian distribution; 8.3 Shannon's information as the possible gain of information in an observation; 8.4 Limits of the information, limitations of the resolution; 8.5 Maximization of the entropy of a continuous random variable
- 9 Information functions for gaussian distributions, part II: 9.1 Kullback's information; 9.2 Kullback's divergence; 9.3 Kolmogorov's information; 9.4 Transformation of the coordinate system and the effects on the information; 9.5 Transformation, discrete and continuous measures of entropy; 9.6 Summary of the information functions
- 10 Bounds of the variance: 10.1 Cramér-Rao bound; 10.2 Chapman-Robbins bound; 10.3 Bhattacharyya bound; 10.4 Barankin bound; 10.5 Other bounds; 10.6 Summary; 10.7 Biased estimator
- 11 Ambiguity function: 11.1 The ambiguity function and Kullback's information; 11.2 Connection between ambiguity function and Fisher's information; 11.3 Maximum likelihood estimation and the ambiguity function; 11.4 The ML estimation is asymptotically efficient; 11.5 Transition to the Akaike information criterion
- 12 Akaike's information criterion: 12.1 Akaike's information criterion and regression; 12.2 BIC, SC or HQ
- 13 Channel information: 13.1 Redundancy; 13.2 Rate of transmission and equivocation; 13.3 Hadamard's inequality and Gibbs's second theorem; 13.4 Kolmogorov's information; 13.5 Kullback's divergence; 13.6 An example of a transmission; 13.7 Communication channel and information processing; 13.8 Shannon's bound; 13.9 Example of the channel capacity
- 14 'Deterministic' and stochastic information: 14.1 Information in state space models; 14.2 The observation equation; 14.3 Transmission faster than light; 14.4 Information about state space variables
- 15 Maximum entropy estimation: 15.1 The difference between maximum entropy and minimum variance; 15.2 The difference from bootstrap or resampling methods; 15.3 A maximum entropy example; 15.4 Maximum entropy: the method; 15.5 Maximum entropy and minimum discrimination information; 15.6 Generation of generalized entropy measures
- 16 Concluding remarks: 16.1 Information, entropy and self-organization; 16.2 Complexity theory; 16.3 Data reduction; 16.4 Cryptology; 16.5 Concluding considerations; 16.6 Information
- Appendices: A.1 Inequality for Kullback's information; A.2 The log-sum inequality; A.3 Generalized entropy, divergence and distance measures (A.3.1 Entropy measures; A.3.2 Generalized measures of distance; A.3.3 Generalized measures of the directed divergence; A.3.4 Generalized measures of divergence: A.3.4.1 Information radius and the J-divergence; A.3.4.2 Generalization of the R-divergence; A.3.4.3 Generalization of the J-divergence); A.4 A short introduction to probability theory (A.4.1 Axiomatic definition of probability: A.4.1.1 Events, elementary events, sample space; A.4.1.2 Classes of subsets, fields; A.4.1.3 Axiomatic definition of probability according to Kolmogorov; Probability space; A.4.1.4 Random variables; A.4.1.5 Probability distribution; A.4.1.6 Probability space, sample space, realization space; A.4.1.7 Probability distribution and distribution density function; A.4.1.8 Probability distribution density function (PDF)); A.5 The regularity conditions; A.6 State space description
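For orientation, these are the standard textbook definitions of the central measures named in the contents above; they are common knowledge in information theory, not excerpts from the book:

```latex
% Shannon entropy of a discrete distribution p
H(X) = -\sum_i p_i \log p_i
% Rényi entropy of order \alpha (\alpha > 0, \alpha \neq 1); recovers H(X) as \alpha \to 1
H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_i p_i^{\alpha}
% Kullback's directed divergence between distributions p and q
D(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i}
% Cramér-Rao bound: Fisher information I(\theta) limits the variance of an unbiased estimator
\operatorname{Var}(\hat{\theta}) \ge \frac{1}{I(\theta)}, \qquad
I(\theta) = \mathbb{E}\!\left[ \left( \frac{\partial}{\partial \theta} \log f(X;\theta) \right)^{2} \right]
```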
Reviews
"Bioinformaticians are facing the challenge of how to handle immense amounts of raw data, such as are generated from genome mapping, make sense of them, and render them accessible to scientists working on a wide variety of problems. "Information Measures: Information and its Description in Science and Engineering" can be such a tool."
IEEE Engineering in Medicine and Biology
Back cover text
This book is an introduction to the mathematical description of information in science and engineering. The necessary mathematical theory is treated in a more vivid way than the usual theorem-proof structure, enabling the reader to develop an idea of the connections between different information measures and to follow the lines of thought behind their derivations. Because there are many possible ways to describe information, these measures are presented in a coherent manner. Some examples of the information measures examined are: Shannon information, applied in coding theory; the Akaike information criterion, used in system identification to determine auto-regressive models and in neural networks to identify the number of neurons; and the Cramér-Rao bound or Fisher information, describing the minimal variances achieved by unbiased estimators. This softcover edition addresses researchers and students in electrical engineering, particularly in control and communications, physics, and applied mathematics.
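As a minimal illustration of two of the measures mentioned above (Shannon information and the Akaike information criterion), here is a short Python sketch. It uses only the standard library and is an independent example, not code from the book:

```python
import math

def shannon_entropy(p):
    """Shannon information H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def aic(log_likelihood, k):
    """Akaike information criterion: AIC = -2 log L + 2k,
    where k is the number of estimated model parameters."""
    return -2.0 * log_likelihood + 2 * k

# A fair coin carries exactly 1 bit per toss; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.47

# Model selection: between two fitted models, the lower AIC is preferred.
print(aic(log_likelihood=-120.3, k=2))  # 244.6
print(aic(log_likelihood=-118.9, k=5))  # 247.8
```

In the hypothetical comparison at the end, the two-parameter model wins despite its slightly worse likelihood; that penalty on extra parameters is exactly the overfitting guard the criterion encodes.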
Features
- As a survey of information measures, this book can serve as an introduction to information theory
- Written for non-mathematicians, i.e. engineers and scientists
- Relevant for many established and novel applications in Science, Communications, Computers and Control, in particular in Bioinformatics
- Connects the concepts of information used in the different disciplines