
Advancements in Knowledge Distillation: Towards New Horizons of Intelligent Systems: Studies in Computational Intelligence, cartea 1100

Edited by Witold Pedrycz, Shyi-Ming Chen
In English, Paperback – 15 Jun 2024
The book provides timely coverage of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned in the general setting of transfer learning, in which a lightweight student model is learned from a large teacher model. The book covers a variety of training schemes, teacher–student architectures, and distillation algorithms, along with a wealth of topics including recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. It is of relevance to a broad audience of researchers and practitioners active in machine learning and pursuing fundamental and applied research in advanced learning paradigms.
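To make the paradigm concrete, here is a minimal sketch of the classic temperature-scaled distillation loss (in the style of Hinton et al.) that the teacher–student setting described above is built on; the function names, logit values, and temperature are illustrative assumptions, not taken from the book.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = logits / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Soften both output distributions with temperature T, then measure
    # the KL divergence from teacher to student. The T*T factor keeps
    # gradient magnitudes comparable across temperatures.
    p = softmax(teacher_logits, T)  # teacher's "soft targets"
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * np.log(p / q)))

# Hypothetical logits for a 3-class problem.
teacher = np.array([4.0, 1.0, 0.5])
student = np.array([3.0, 1.5, 0.2])
loss = distillation_loss(student, teacher)
```

In full training, this soft-target term is usually mixed with the ordinary cross-entropy on hard labels; the blurb's "response-based" distillation is this scheme, while feature- and relation-based variants (covered in the contents below) match intermediate activations or inter-sample relations instead.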

All formats and editions

Format | Price | Express delivery
Paperback (1) – Springer International Publishing, 15 Jun 2024 | 115362 lei | 38-45 days
Hardback (1) – Springer International Publishing, 14 Jun 2023 | 125370 lei | 6-8 weeks

From the Studies in Computational Intelligence series

Price: 115362 lei

Old price: 144203 lei
-20% New

Express points: 1730

Estimated price in foreign currency:
22088€ 23000$ 18326£

Book printed on demand

Economy delivery: 10-17 February

Order line: 021 569.72.76

Specifications

ISBN-13: 9783031320972
ISBN-10: 3031320972
Pages: 232
Illustrations: VIII, 232 p. 70 illus., 51 illus. in color.
Dimensions: 155 x 235 mm
Edition: 2023
Publisher: Springer International Publishing
Collection: Springer
Series: Studies in Computational Intelligence

Place of publication: Cham, Switzerland

Contents

Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
A Geometric Perspective on Feature-Based Distillation
Knowledge Distillation Across Vision and Language
Knowledge Distillation in Granular Fuzzy Models by Solving Fuzzy Relation Equations
Ensemble Knowledge Distillation for Edge Intelligence in Medical Applications
Self-Distillation with the New Paradigm in Multi-Task Learning
Knowledge Distillation for Autonomous Intelligent Unmanned System

Back cover text

The book provides timely coverage of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned in the general setting of transfer learning, in which a lightweight student model is learned from a large teacher model. The book covers a variety of training schemes, teacher–student architectures, and distillation algorithms, along with a wealth of topics including recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. It is of relevance to a broad audience of researchers and practitioners active in machine learning and pursuing fundamental and applied research in advanced learning paradigms.

Features

Comprehensive and up-to-date treatise of knowledge distillation cast in a general framework of transfer learning
Focuses on a spectrum of methodological and algorithmic issues
Includes recent developments in vision and language learning and relational architectures