Statistical Mechanics of Neural Networks

Author: Haiping Huang
English language, Paperback – 6 January 2023
This book provides a comprehensive introduction to the fundamental statistical mechanics underlying the inner workings of neural networks. It discusses in detail important concepts and techniques, including the cavity method, mean-field theory, replica techniques, the Nishimori condition, variational methods, dynamical mean-field theory, unsupervised learning, associative memory models, perceptron models, the chaos theory of recurrent neural networks, and the eigenspectra of neural networks, walking new learners through the theories and essential skills needed to understand and use neural networks. The book focuses on quantitative frameworks of neural network models in which the underlying mechanisms can be precisely isolated by mathematically elegant physics and theoretical predictions. It is a good reference for students, researchers, and practitioners in the area of neural networks.

All formats and editions

Format                                          Price         Delivery
Paperback (1)                                   946.90 lei    6-8 weeks
  Springer Nature Singapore – 6 January 2023    946.90 lei    6-8 weeks
Hardback (1)                                    951.47 lei    6-8 weeks
  Springer Nature Singapore – 5 January 2022    951.47 lei    6-8 weeks

Price: 946.90 lei

Old price: 1,154.76 lei
-18% New

Express points: 1420

Estimated price in foreign currency:
181.20€  187.21$  150.73£

Printed on demand

Economy delivery: 19 March – 2 April

Order line: 021 569.72.76

Specifications

ISBN-13: 9789811675720
ISBN-10: 9811675724
Pages: 296
Illustrations: XVIII, 296 p. 62 illus., 40 illus. in color.
Dimensions: 155 x 235 x 19 mm
Weight: 0.49 kg
Edition: 1st ed. 2021
Publisher: Springer Nature Singapore
Series: Springer
Place of publication: Singapore, Singapore

Table of contents

Chapter 1:  Introduction
Chapter 2:  Spin Glass Models and Cavity Method
Chapter 3:  Variational Mean-Field Theory and Belief Propagation
Chapter 4:  Monte-Carlo Simulation Methods
Chapter 5:  High-Temperature Expansion Techniques
Chapter 6: Nishimori Model
Chapter 7: Random Energy Model
Chapter 8:  Statistical Mechanics of Hopfield Model
Chapter 9:  Replica Symmetry and Symmetry Breaking
Chapter 10: Statistical Mechanics of Restricted Boltzmann Machine
Chapter 11: Simplest Model of Unsupervised Learning with Binary Synapses
Chapter 12: Inherent-Symmetry Breaking in Unsupervised Learning
Chapter 13: Mean-Field Theory of Ising Perceptron
Chapter 14: Mean-Field Model of Multi-Layered Perceptron
Chapter 15: Mean-Field Theory of Dimension Reduction in Neural Networks
Chapter 16: Chaos Theory of Random Recurrent Networks
Chapter 17: Statistical Mechanics of Random Matrices
Chapter 18: Perspectives

Biographical note

Haiping Huang
Dr. Haiping Huang received his Ph.D. in theoretical physics from the Institute of Theoretical Physics, Chinese Academy of Sciences. He is an associate professor at the School of Physics, Sun Yat-sen University, China. His research interests include the origin of the computational hardness of the binary perceptron model, the theory of dimension reduction in deep neural networks, and inherent symmetry breaking in unsupervised learning. In 2021, he was awarded the Excellent Young Scientists Fund by the National Natural Science Foundation of China.

Features

Presents major theoretical tools for the analysis of neural networks
Provides concrete examples for the use of the theories in neural networks
Bridges old tools and frontiers in the theoretical development of neural networks