
Information-Theoretic Foundations of Mismatched Decoding

Authors: Jonathan Scarlett, Albert Guillén i Fàbregas, Anelia Somekh-Baruch
English, Paperback – 31 Aug 2020
Mismatched decoding has long been studied in settings where practical considerations, such as channel uncertainty and implementation constraints, rule out the use of an optimal decoder for reliable transmission over a communication channel. The problem is not only of direct interest in its own right, but also has close connections with other long-standing theoretical problems in information theory.

In this monograph, the authors survey both the classical literature and recent developments on the mismatched decoding problem, with an emphasis on achievable random-coding rates for memoryless channels. They present two widely considered achievable rates, the generalized mutual information (GMI) and the LM rate, and overview their derivations and properties. The authors bring the reader up to date with a discussion of improved rates obtained via multi-user coding techniques, recent developments and challenges in establishing upper bounds on the mismatch capacity, and an analogous mismatched encoding problem in rate-distortion theory.

The monograph is aimed at students, researchers and practitioners in information theory and communications. It provides a thorough and clear survey of the topic and highlights a variety of applications and connections with other prominent information theory problems.
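For orientation, the dual (Gallager-style) forms in which these two rates are commonly stated in the mismatched-decoding literature are sketched below; the notation is illustrative rather than the monograph's own, with Q the input distribution, W the memoryless channel, and q(x,y) the decoding metric.

% Dual expressions for the GMI and the LM rate; (X, Y) ~ Q x W,
% and \bar{X} ~ Q is drawn independently of (X, Y).
\begin{align}
  I_{\mathrm{GMI}}(Q)
    &= \sup_{s \ge 0}
       \mathbb{E}\!\left[ \log \frac{q(X,Y)^{s}}{\mathbb{E}\!\left[ q(\bar{X},Y)^{s} \,\middle|\, Y \right]} \right], \\
  I_{\mathrm{LM}}(Q)
    &= \sup_{s \ge 0,\; a(\cdot)}
       \mathbb{E}\!\left[ \log \frac{q(X,Y)^{s}\, e^{a(X)}}{\mathbb{E}\!\left[ q(\bar{X},Y)^{s}\, e^{a(\bar{X})} \,\middle|\, Y \right]} \right].
\end{align}

Taking a(.) = 0 in the second expression recovers the first, so the LM rate is never smaller than the GMI; the two rates are achievable with the mismatched decoder under i.i.d. and constant-composition random coding, respectively.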

Price: 557.55 lei

Old price: 606.03 lei
-8% New

Express Points: 836

Estimated price in foreign currency:
106.69€ 111.67$ 88.80£

Print-on-demand title

Economy delivery 31 March – 14 April

Order line: 021 569.72.76

Specifications

ISBN-13: 9781680837124
ISBN-10: 1680837125
Pages: 274
Dimensions: 156 x 234 x 15 mm
Weight: 0.39 kg
Publisher: Now Publishers Inc

Description

Surveys both the classical literature and recent developments on the mismatched decoding problem, with an emphasis on achievable random-coding rates for memoryless channels. In doing so, the authors present two widely considered achievable rates, the generalized mutual information and the LM rate, and overview their derivations and properties.