Accelerated Optimization for Machine Learning: First-Order Algorithms
Authors: Zhouchen Lin, Huan Li, Cong Fang · Language: English · Paperback – 30 May 2021
Written by leading experts in the field, this book provides a comprehensive introduction to, and state-of-the-art review of accelerated first-order optimization algorithms for machine learning. It discusses a variety of methods, including deterministic and stochastic algorithms, where the algorithms can be synchronous or asynchronous, for unconstrained and constrained problems, which can be convex or non-convex. Offering a rich blend of ideas, theories and proofs, the book is up-to-date and self-contained. It is an excellent reference resource for users who are seeking faster optimization algorithms, as well as for graduate students and researchers wanting to grasp the frontiers of optimization in machine learning in a short time.
All formats and editions | Price | Delivery |
---|---|---|
Paperback (1) | 949.76 lei | 6-8 weeks |
Springer Nature Singapore – 30 May 2021 | 949.76 lei | 6-8 weeks |
Hardback (1) | 957.37 lei | 6-8 weeks |
Springer Nature Singapore – 30 May 2020 | 957.37 lei | 6-8 weeks |
Price: 949.76 lei
Old price: 1187.19 lei
-20% New
Express points: 1425
Estimated price in other currencies:
181.77€ • 191.76$ • 151.48£
Printed on demand
Economy delivery: 02-16 January 25
Specifications
ISBN-13: 9789811529122
ISBN-10: 9811529124
Illustrations: XXIV, 275 p. 36 illus.
Dimensions: 155 x 235 mm
Weight: 0.42 kg
Edition: 1st ed. 2020
Publisher: Springer Nature Singapore
Series: Springer
Place of publication: Singapore, Singapore
Contents
Chapter 1. Introduction
Chapter 2. Accelerated Algorithms for Unconstrained Convex Optimization
Chapter 3. Accelerated Algorithms for Constrained Convex Optimization
Chapter 4. Accelerated Algorithms for Nonconvex Optimization
Chapter 5. Accelerated Stochastic Algorithms
Chapter 6. Accelerated Parallel Algorithms
Chapter 7. Conclusions
Biographical note
Zhouchen Lin is a leading expert in the fields of machine learning and computer vision. He is currently a Professor at the Key Laboratory of Machine Perception (Ministry of Education), School of EECS, Peking University. He served as an area chair for several prestigious conferences, including CVPR, ICCV, ICML, NIPS, AAAI and IJCAI. He is an associate editor of the IEEE Transactions on Pattern Analysis and Machine Intelligence and the International Journal of Computer Vision. He is a Fellow of IAPR and IEEE.
Huan Li received his Ph.D. degree in machine learning from Peking University in 2019. He is currently an Assistant Professor at the College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics. His current research interests include optimization and machine learning.
Cong Fang received his Ph.D. degree from Peking University in 2019. He is currently a Postdoctoral Researcher at Princeton University. His research interests include machine learning and optimization.
Back cover text
This book on optimization includes forewords by Michael I. Jordan, Zongben Xu and Zhi-Quan Luo. Machine learning relies heavily on optimization to solve the problems arising from its learning models, and first-order optimization algorithms are the mainstream approaches. The acceleration of first-order optimization algorithms is crucial for the efficiency of machine learning.
Written by leading experts in the field, this book provides a comprehensive introduction to, and state-of-the-art review of accelerated first-order optimization algorithms for machine learning. It discusses a variety of methods, including deterministic and stochastic algorithms, where the algorithms can be synchronous or asynchronous, for unconstrained and constrained problems, which can be convex or non-convex. Offering a rich blend of ideas, theories and proofs, the book is up-to-date and self-contained. It is an excellent reference resource for users who are seeking faster optimization algorithms, as well as for graduate students and researchers wanting to grasp the frontiers of optimization in machine learning in a short time.
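To give a flavor of the algorithms the book covers, the following is a minimal sketch of Nesterov's accelerated gradient method, the prototypical accelerated first-order algorithm, applied to a toy least-squares problem. The problem data, step size, and iteration count are illustrative assumptions and are not taken from the book.

```python
import numpy as np

# A minimal sketch of Nesterov's accelerated gradient method on the smooth
# convex toy problem f(x) = 0.5 * ||A x - b||^2. The data A, b, the step size
# 1/L, and the iteration count are illustrative assumptions, not from the book.

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
b = rng.standard_normal(200)

def grad(x):
    """Gradient of f(x) = 0.5 * ||A x - b||^2."""
    return A.T @ (A @ x - b)

L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of grad f: largest singular value of A, squared
x = np.zeros(A.shape[1])        # x_k, the main iterate
y = x.copy()                    # y_k, the extrapolated (momentum) point
t = 1.0                         # t_k, the momentum parameter

for _ in range(500):
    x_new = y - grad(y) / L                           # gradient step taken at the extrapolated point
    t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # update the momentum parameter
    y = x_new + ((t - 1.0) / t_new) * (x_new - x)     # extrapolation using the last two iterates
    x, t = x_new, t_new

print("final objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```

Compared with plain gradient descent, the extrapolation step improves the convergence rate on smooth convex problems from O(1/k) to O(1/k^2); this is the kind of acceleration the book analyzes in much greater generality.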
Features
- The first monograph on accelerated first-order optimization algorithms used in machine learning
- Includes forewords by Michael I. Jordan, Zongben Xu, and Zhi-Quan Luo, and is written by experts on machine learning and optimization
- Is comprehensive, up-to-date, and self-contained, making it easy for beginners to grasp the frontiers of optimization in machine learning