
Large-Scale Convex Optimization: Algorithms & Analyses via Monotone Operators

Authors: Ernest K. Ryu, Wotao Yin
English Language Hardback – 30 Nov 2022
Starting from where a first course in convex optimization leaves off, this text presents a unified analysis of first-order optimization methods – including parallel-distributed algorithms – through the abstraction of monotone operators. With the increased computational power and availability of big data over the past decade, applied disciplines have demanded that larger and larger optimization problems be solved. This text covers the first-order convex optimization methods that are uniquely effective at solving these large-scale optimization problems. Readers will have the opportunity to construct and analyze many well-known classical and modern algorithms using monotone operators, and walk away with a solid understanding of the diverse optimization algorithms. Graduate students and researchers in mathematical optimization, operations research, electrical engineering, statistics, and computer science will appreciate this concise introduction to the theory of convex optimization algorithms.

Price: 443.13 lei

Old price: 497.90 lei
-11% New

Express points: 665

Estimated price in foreign currency:
84.79€ / 89.55$ / 70.57£

Printed to order

Economy delivery: 11-25 January 2025


Specifications

ISBN-13: 9781009160858
ISBN-10: 1009160850
Pages: 400
Dimensions: 178 x 254 x 19 mm
Weight: 0.77 kg
Edition: New
Publisher: Cambridge University Press
Series: Cambridge University Press
Place of publication: New York, United States

Contents

Preface; 1. Introduction and preliminaries; Part I. Monotone Operator Methods: 2. Monotone operators and base splitting schemes; 3. Primal-dual splitting methods; 4. Parallel computing; 5. Randomized coordinate update methods; 6. Asynchronous coordinate update methods; Part II. Additional Topics: 7. Stochastic optimization; 8. ADMM-type methods; 9. Duality in splitting methods; 10. Maximality and monotone operator theory; 11. Distributed and decentralized optimization; 12. Acceleration; 13. Scaled relative graphs; Appendices; References; Index.

Reviews

'Ryu and Yin's Large-Scale Convex Optimization does a great job of covering a field with a long history and much current interest. The book describes dozens of algorithms, from classic ones developed in the 1970s to some very recent ones, in unified and consistent notation, all organized around the basic concept and unifying theme of a monotone operator. I strongly recommend it to any mathematician, researcher, or engineer who uses, or has an interest in, convex optimization.' Stephen Boyd, Stanford University
'This is an absolute must-read research monograph for signal processing, communications, and networking engineers, as well as researchers who wish to choose, design, and analyze splitting-based convex optimization methods best suited for their perplexed and challenging engineering tasks.' Georgios B. Giannakis, University of Minnesota
'This is a very timely book. Monotone operator theory is fundamental to the development of modern algorithms for large-scale convex optimization. Ryu and Yin provide optimization students and researchers with a self-contained introduction to the elegant mathematical theory of monotone operators, and take their readers on a tour of cutting-edge applications, demonstrating the power and range of these essential tools.' Lieven Vandenberghe, University of California, Los Angeles
'First-order methods are the mainstream optimization algorithms in the era of big data. This monograph provides a unique perspective on various first-order convex optimization algorithms via the monotone operator theory, with which the seemingly different and unrelated algorithms are actually deeply connected, and many proofs can be significantly simplified. The book is a beautiful example of the power of abstraction. Those who are interested in convex optimization theory should not miss this book.' Zhouchen Lin, Peking University
'The book covers topics from the basics of optimization to modern techniques such as operator splitting, parallel and distributed optimization, and stochastic algorithms. It is the natural next step after Boyd and Vandenberghe's Convex Optimization for students studying optimization and machine learning. The authors are experts in this kind of optimization. Some of my graduate students took the course based on this book when Wotao Yin was at UCLA. They liked the course and found the materials very useful in their research.' Stanley Osher, University of California, Los Angeles

Biographical note


Description

A unified analysis of first-order optimization methods, including parallel-distributed algorithms, using monotone operators.