New Foundations for Information Theory: Logical Entropy and Shannon Entropy (SpringerBriefs in Philosophy)
Author: David Ellerman · Language: English · Paperback – 31 Oct 2021
Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image partition of a random variable, so they represent the pre-probability notion of information. Logical entropy is then a probability measure on the information sets: the probability that, on two independent trials, a distinction or “dit” of the partition will be obtained.
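To make the two-draw reading concrete (this sketch is illustrative and not taken from the book), for a partition with block probabilities p_1, …, p_n the probability that two independent draws land in different blocks works out to 1 − Σ p_i². A minimal Python sketch, using a hypothetical example distribution, computes this directly and checks it by sampling:

```python
import random

def logical_entropy(p):
    """Logical entropy 1 - sum(p_i^2): the probability that two independent
    draws from the block probabilities p land in different blocks,
    i.e., that a distinction ("dit") of the partition is obtained."""
    return 1.0 - sum(pi * pi for pi in p)

def dit_probability_sampled(p, trials=200_000, seed=0):
    """Monte Carlo check of the same two-draw distinction probability."""
    rng = random.Random(seed)
    blocks = list(range(len(p)))
    hits = sum(
        rng.choices(blocks, weights=p)[0] != rng.choices(blocks, weights=p)[0]
        for _ in range(trials)
    )
    return hits / trials

p = [0.5, 0.25, 0.25]              # hypothetical block probabilities
print(logical_entropy(p))          # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
print(dit_probability_sampled(p))  # approximately 0.625
```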
The formula for logical entropy is a new derivation of an old formula that goes back to the early twentieth century and has been re-derived many times in different contexts. Since logical entropy is a probability measure, all the compound notions of joint, conditional, and mutual logical entropy are immediate. The Shannon entropy (which is not defined as a measure in the sense of measure theory) and its compound notions are then derived from a non-linear dit-to-bit transform that re-quantifies the distinctions of a random variable in terms of bits, so the Shannon entropy is the average number of binary distinctions, or bits, necessary to make all the distinctions of the random variable. Using a linearization method, all the set concepts in this logical information theory extend naturally to vector spaces in general, and to Hilbert spaces in particular, yielding a quantum logical information theory that provides the natural measure of the distinctions made in quantum measurement.
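As a companion sketch (again illustrative, not the book's own code; the function names and the example distribution are hypothetical), the dit-to-bit transform can be seen by writing logical entropy in the equivalent form h(p) = Σ p_i(1 − p_i) (equal to the 1 − Σ p_i² form above) and replacing each two-draw distinction probability 1 − p_i with the bit count log2(1/p_i), which yields the Shannon entropy H(p) = Σ p_i log2(1/p_i):

```python
from math import log2

def logical_entropy(p):
    """h(p) = sum_i p_i * (1 - p_i): the average probability that a second
    independent draw is distinguished from the first."""
    return sum(pi * (1.0 - pi) for pi in p)

def shannon_entropy(p):
    """H(p) = sum_i p_i * log2(1/p_i): obtained from h(p) by re-quantifying
    each term's distinction probability (1 - p_i) as log2(1/p_i) bits."""
    return sum(pi * log2(1.0 / pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]        # hypothetical distribution of a random variable
print(logical_entropy(p))    # 0.625
print(shannon_entropy(p))    # 1.5 bits
```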
Relatively short but dense in content, this work can serve as a reference for researchers and graduate students working in information theory, maximum-entropy methods in physics, engineering, and statistics, and for anyone with a special interest in a new approach to quantum information theory.
Price: 464.51 lei
New
Puncte Express: 697
Estimated price in other currencies:
88.91€ • 92.66$ • 74.01£
Printed on demand
Economy delivery: 07-21 January 25
Phone orders: 021 569.72.76
Specifications
ISBN-13: 9783030865511
ISBN-10: 3030865517
Pages: 113
Illustrations: XIII, 113 p., 24 illus.
Dimensions: 155 x 235 mm
Weight: 0.19 kg
Edition: 1st ed. 2021
Publisher: Springer International Publishing
Collection: Springer
Series: SpringerBriefs in Philosophy
Place of publication: Cham, Switzerland
Contents
- Logical entropy
- The relationship between logical entropy and Shannon entropy
- The compound notions for logical and Shannon entropies
- Further developments of logical entropy
- Logical Quantum Information Theory
- Conclusion
- Appendix: Introduction to the logic of partitions
Biographical note
David Ellerman is an Associate Researcher at the Faculty of Social Sciences, University of Ljubljana, Slovenia. In 2003 he retired to academia after ten years at the World Bank, where he was economic advisor and speechwriter for the Chief Economist, Joseph Stiglitz. In his prior university teaching, Ellerman taught over a twenty-year period in the Boston area in five disciplines: economics, mathematics, computer science, operations research, and accounting. He was educated at the Massachusetts Institute of Technology (USA) and at Boston University, where he earned two Master’s degrees, one in Philosophy and one in Economics, and a doctorate in Mathematics. Ellerman has published eight books and many articles in scholarly journals in economics, logic, mathematics, physics, philosophy, and law.
Features
- Introduces a new foundation for information theory, based on logical entropy and its transform into Shannon entropy
- Offers a new maximizing-logical-entropy approach to the MaxEntropy method
- Presents a new logical entropy approach to quantum information theory