
Mathematical Theories of Machine Learning - Theory and Applications

Authors: Bin Shi, S. S. Iyengar
Language: English | Paperback – 14 Aug 2020
This book studies mathematical theories of machine learning. The first part of the book explores the optimality and adaptivity of choosing step sizes of gradient descent for escaping strict saddle points in non-convex optimization problems. In the second part, the authors propose algorithms to find local minima in non-convex optimization and to obtain global minima, to some degree, based on Newton's second law without friction. In the third part, the authors study the problem of subspace clustering with noisy and missing data, a problem well motivated by practical applications in which data are subject to stochastic Gaussian noise and/or are incomplete, with uniformly missing entries. In the last part, the authors introduce a novel VAR model with Elastic-Net regularization and its equivalent Bayesian model, allowing for both stable sparsity and group selection.
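
The first part's theme can be illustrated with a toy example. The following minimal Python sketch (not taken from the book; the objective f(x, y) = x^2 - y^2, the step size, and the starting point are assumptions chosen purely for illustration) shows how plain gradient descent with a fixed step size moves away from the strict saddle point of a simple non-convex function once the iterate is slightly perturbed off the unstable direction.

```python
# Minimal sketch (illustrative only): gradient descent with a fixed step size on
# f(x, y) = x**2 - y**2, which has a strict saddle point at the origin.
import numpy as np

def grad_f(p):
    x, y = p
    return np.array([2.0 * x, -2.0 * y])   # gradient of f(x, y) = x^2 - y^2

eta = 0.1                        # fixed step size (chosen for illustration)
p = np.array([1.0, 1e-3])        # small perturbation along the unstable y-direction
for _ in range(50):
    p = p - eta * grad_f(p)      # standard gradient descent update

# x shrinks geometrically toward 0 while y grows geometrically:
# the iterate escapes the strict saddle at (0, 0) instead of converging to it.
print(p)
```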

All formats and editions                                Price         Express delivery
Paperback (1)                                           565.41 lei    43-57 days
  Springer International Publishing – 14 Aug 2020       565.41 lei    43-57 days
Hardback (1)                                            571.48 lei    43-57 days
  Springer International Publishing – 26 Jun 2019       571.48 lei    43-57 days

Price: 565.41 lei

Old price: 665.20 lei
-15% New

Express points: 848

Estimated price in other currencies:
108.21€  112.40$  89.88£

Printed on demand

Economy delivery: 03-17 February 25

Order line: 021 569.72.76

Specifications

ISBN-13: 9783030170783
ISBN-10: 3030170780
Pages: 133
Illustrations: XXI, 133 p. 25 illus., 24 illus. in color.
Dimensions: 155 x 235 mm
Weight: 0.23 kg
Edition: 1st ed. 2020
Publisher: Springer International Publishing
Series: Springer
Place of publication: Cham, Switzerland

Contents

Chapter 1. Introduction
Chapter 2. General Framework of Mathematics
Chapter 3. Problem Formulation
Chapter 4. Development of Novel Techniques of CoCoSSC Method
Chapter 5. Further Discussions of the Proposed Method
Chapter 6. Related Work on Geometry of Non-Convex Programs
Chapter 7. Gradient Descent Converges to Minimizers
Chapter 8. A Conservation Law Method Based on Optimization
Chapter 9. Improved Sample Complexity in Sparse Subspace Clustering with Noisy and Missing Observations
Chapter 10. Online Discovery for Stable and Grouping Causalities in Multi-Variate Time Series
Chapter 11. Conclusion

Reviews

“The book discusses mathematical theories of machine learning. … The book is very technically written and it is addressed to professionals in the field.” (Smaranda Belciug, zbMATH 1422.68003, 2019)

Biographical note

Bin Shi is a Ph.D. candidate in the School of Computing and Information Sciences at FIU under the supervision of Professor Sitharama S. Iyengar. His research focuses on the theory of machine learning, especially optimization. Bin Shi received his B.S. in Applied Mathematics from Ocean University of China in 2006, a Master's in Pure Mathematics from Fudan University, China, in 2011, and a Master's in Theoretical Physics from the University of Massachusetts Dartmouth in 2015. His research interests center on statistical machine learning and optimization, along with some areas of theoretical computer science.

Dr. S. S. Iyengar is a Distinguished University Professor, Ryder Professor of Computer Science, and Director of the School of Computing and Information Sciences at Florida International University (FIU), Miami. He is also the founding director of the Discovery Lab. Prior to joining FIU, Dr. Iyengar was the Roy Paul Daniels Distinguished Professor and Chairman of the Computer Science Department for over 20 years at Louisiana State University. He has also worked as a visiting scientist at Oak Ridge National Lab and the Jet Propulsion Lab, served as Satish Dhawan Professor at IISc, as Homi Bhabha Professor at IGCAR, Kalpakkam, and at the University of Paris, and has visited Tsinghua University, the Korea Advanced Institute of Science and Technology (KAIST), and other institutions. Professor Iyengar is an IEEE Distinguished Visitor, SIAM Distinguished Lecturer, and ACM National Lecturer, and has won many other awards, including the Distinguished Research Master's Award, the Hub Cotton Award of Faculty Excellence (LSU), the Rain Maker Awards (LSU), the Florida Information Technology Award (IT2), and the Distinguished Research Award from the Tunisian Mathematical Society. During the last four decades, he has supervised over 55 Ph.D. students, 100 Master's students, and many undergraduate students who are now faculty members at major universities worldwide or scientists and engineers at national labs and in industry around the world. He has published more than 500 research papers and has authored, co-authored, or edited 22 books. His books are published by MIT Press, John Wiley and Sons, CRC Press, Prentice Hall, Springer Verlag, IEEE Computer Society Press, and others. One of his books, titled "Introduction to Parallel Algorithms", has been translated into Chinese.

Back cover text

This book studies mathematical theories of machine learning. The first part of the book explores the optimality and adaptivity of choosing step sizes of gradient descent for escaping strict saddle points in non-convex optimization problems. In the second part, the authors propose algorithms to find local minima in non-convex optimization and to obtain global minima, to some degree, based on Newton's second law without friction. In the third part, the authors study the problem of subspace clustering with noisy and missing data, a problem well motivated by practical applications in which data are subject to stochastic Gaussian noise and/or are incomplete, with uniformly missing entries. In the last part, the authors introduce a novel VAR model with Elastic-Net regularization and its equivalent Bayesian model, allowing for both stable sparsity and group selection.
  • Provides a thorough look into the variety of mathematical theories of machine learning
  • Presented in four parts, allowing readers to easily navigate the complex theories
  • Includes extensive empirical studies on both synthetic and real application time series data
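
As a rough illustration of the last part's setting (this is not the authors' algorithm), the sketch below estimates the transition matrix of a first-order VAR model with an Elastic-Net penalty by fitting each equation with scikit-learn's ElasticNet; the synthetic data, lag order, and penalty weights are assumptions made only for this example.

```python
# Minimal sketch (illustrative only): first-order VAR estimation with an
# Elastic-Net penalty, fitting each equation x_t[i] ~ A[i] @ x_{t-1} separately.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
T, d = 200, 5
A_true = np.diag([0.8, 0.5, 0.0, 0.0, 0.3])       # sparse transition matrix
X = np.zeros((T, d))
for t in range(1, T):
    X[t] = A_true @ X[t - 1] + 0.1 * rng.standard_normal(d)

lagged, target = X[:-1], X[1:]                    # regress x_t on x_{t-1}
A_hat = np.zeros((d, d))
for i in range(d):                                # one Elastic-Net fit per equation
    model = ElasticNet(alpha=0.05, l1_ratio=0.5, fit_intercept=False)
    model.fit(lagged, target[:, i])
    A_hat[i] = model.coef_

print(np.round(A_hat, 2))                         # sparse estimate of A_true
```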
