
Normalization Techniques in Deep Learning: Synthesis Lectures on Computer Vision

Author: Lei Huang
Language: English | Paperback – 10 Oct 2023
This book presents and surveys normalization techniques, with a deep analysis of their role in training deep neural networks. In addition, the author provides technical details for designing new normalization methods and network architectures tailored to specific tasks. Normalization methods can improve the training stability, optimization efficiency, and generalization ability of deep neural networks (DNNs) and have become basic components in most state-of-the-art DNN architectures. The author provides guidelines for elaborating, understanding, and applying normalization methods. This book is ideal for readers working on the development of novel deep learning algorithms and/or their applications to solve practical problems in computer vision and machine learning tasks. The book also serves as a resource for researchers, engineers, and students who are new to the field and need to understand and train DNNs.
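As a concrete illustration of the kind of technique the book surveys, here is a minimal, framework-free sketch of batch normalization applied to a mini-batch of activations; the function name, shapes, and NumPy implementation are illustrative assumptions, not code taken from the book.

import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Illustrative batch normalization for a 2-D batch (N samples x D features):
    # standardize each feature over the batch, then apply a learnable scale and shift.
    mean = x.mean(axis=0)                      # per-feature mean over the batch
    var = x.var(axis=0)                        # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)    # zero mean, unit variance (eps avoids division by zero)
    return gamma * x_hat + beta                # gamma and beta restore representational flexibility

# Usage: a skewed batch of activations comes out standardized.
x = 3.0 * np.random.randn(32, 4) + 5.0
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0).round(3), y.std(axis=0).round(3))  # approximately 0 and 1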

All formats and editions

Format | Price | Delivery
Paperback (1), Springer International Publishing – 10 Oct 2023 | 346.17 lei | 6-8 weeks
Hardback (1), Springer International Publishing – 9 Oct 2022 | 350.84 lei | 3-5 weeks

From the series Synthesis Lectures on Computer Vision

Price: 346.17 lei

Old price: 432.72 lei
-20% New

Express points: 519

Estimated price in other currencies:
66.24€ | 69.69$ | 54.85£

Print-on-demand title

Economy delivery: 14-28 January 2025

Phone orders: 021 569.72.76

Specifications

ISBN-13: 9783031145971
ISBN-10: 3031145976
Illustrations: XI, 110 p., 26 illus., 21 illus. in color
Dimensions: 168 x 240 mm
Weight: 0.21 kg
Edition: 1st ed. 2022
Publisher: Springer International Publishing
Collection: Springer
Series: Synthesis Lectures on Computer Vision
Place of publication: Cham, Switzerland

Contents

Introduction
Motivation and Overview of Normalization in DNNs
A General View of Normalizing Activations
A Framework for Normalizing Activations as Functions
Multi-Mode and Combinational Normalization
BN for More Robust Estimation
Normalizing Weights
Normalizing Gradients
Analysis of Normalization
Normalization in Task-specific Applications
Summary and Discussion

Biographical note

Lei Huang, Ph.D., is an Associate Professor at Beihang University. His current research interests include normalization techniques, spanning methods, theories, and applications in training deep neural networks (DNNs). He also has broad interests in representation and optimization for deep learning theory and computer vision tasks. Dr. Huang serves as a reviewer for top-tier conferences and journals in machine learning and computer vision.

Features

Presents valuable guidelines for selecting normalization techniques for use in training deep neural networks
Discusses the research landscape of normalization techniques and covers the needed methods, analysis, and applications
Features normalization methods that improve the training stability, optimization efficiency, and generalization of DNNs