
Beyond the Worst-Case Analysis of Algorithms

Edited by Tim Roughgarden
In English, Hardback – 13 Jan 2021
There are no silver bullets in algorithm design, and no single algorithmic idea is powerful and flexible enough to solve every computational problem. Nor are there silver bullets in algorithm analysis, as the most enlightening method for analyzing an algorithm often depends on the problem and the application. However, typical algorithms courses rely almost entirely on a single analysis framework, that of worst-case analysis, wherein an algorithm is assessed by its worst performance on any input of a given size. The purpose of this book is to popularize several alternatives to worst-case analysis and their most notable algorithmic applications, from clustering to linear programming to neural network training. Forty leading researchers have contributed introductions to different facets of this field, emphasizing the most important models and results, many of which can be taught in lectures to beginning graduate students in theoretical computer science and machine learning.
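For readers unfamiliar with the framework being critiqued, the following is a brief sketch, in standard notation rather than any particular chapter's definitions, of how worst-case analysis compares with two of the alternatives covered below (distributional analysis, Chapter 8, and smoothed analysis, Part IV); here $T_A(x)$ is assumed to denote the cost (e.g., running time) of algorithm $A$ on input $x$.

% Worst-case analysis: judge A by its worst input of each size n.
\[ \mathrm{wc}_A(n) \;=\; \max_{x \,:\, |x| = n} T_A(x) \]
% Distributional (average-case) analysis: fix a distribution D_n over size-n inputs.
\[ \mathrm{avg}_A(n) \;=\; \mathbb{E}_{x \sim D_n}\!\left[ T_A(x) \right] \]
% Smoothed analysis: an adversary picks x, which is then randomly perturbed
% (e.g., by Gaussian noise g scaled by sigma) before A runs.
\[ \mathrm{sm}_A(n, \sigma) \;=\; \max_{x \,:\, |x| = n} \; \mathbb{E}_{g}\!\left[ T_A(x + \sigma g) \right] \]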

Price: 433.43 lei

Old price: 541.79 lei

-20% New

Express Points: 650

Estimated price in foreign currency:
€82.96 / $87.09 / £68.53

Printed to order

Economy delivery: 30 January – 13 February 2025

Order line: 021 569.72.76

Specifications

ISBN-13: 9781108494311
ISBN-10: 1108494315
Pages: 704
Dimensions: 188 x 260 x 40 mm
Weight: 1.41 kg
Publisher: Cambridge University Press
Collection: Cambridge University Press
Place of publication: New York, United States

Contents

Foreword; Preface
1. Introduction (Tim Roughgarden)
Part I. Refinements of Worst-Case Analysis:
2. Parameterized algorithms (Fedor Fomin, Daniel Lokshtanov, Saket Saurabh, and Meirav Zehavi)
3. From adaptive analysis to instance optimality (Jérémy Barbay)
4. Resource augmentation (Tim Roughgarden)
Part II. Deterministic Models of Data:
5. Perturbation resilience (Konstantin Makarychev and Yury Makarychev)
6. Approximation stability and proxy objectives (Avrim Blum)
7. Sparse recovery (Eric Price)
Part III. Semi-Random Models:
8. Distributional analysis (Tim Roughgarden)
9. Introduction to semi-random models (Uriel Feige)
10. Semi-random stochastic block models (Ankur Moitra)
11. Random-order models (Anupam Gupta and Sahil Singla)
12. Self-improving algorithms (C. Seshadhri)
Part IV. Smoothed Analysis:
13. Smoothed analysis of local search (Bodo Manthey)
14. Smoothed analysis of the simplex method (Daniel Dadush and Sophie Huiberts)
15. Smoothed analysis of Pareto curves in multiobjective optimization (Heiko Röglin)
Part V. Applications in Machine Learning and Statistics:
16. Noise in classification (Maria-Florina Balcan and Nika Haghtalab)
17. Robust high-dimensional statistics (Ilias Diakonikolas and Daniel Kane)
18. Nearest-neighbor classification and search (Sanjoy Dasgupta and Samory Kpotufe)
19. Efficient tensor decomposition (Aravindan Vijayaraghavan)
20. Topic models and nonnegative matrix factorization (Rong Ge and Ankur Moitra)
21. Why do local methods solve nonconvex problems? (Tengyu Ma)
22. Generalization in overparameterized models (Moritz Hardt)
23. Instance-optimal distribution testing and learning (Gregory Valiant and Paul Valiant)
Part VI. Further Applications:
24. Beyond competitive analysis (Anna R. Karlin and Elias Koutsoupias)
25. On the unreasonable effectiveness of satisfiability solvers (Vijay Ganesh and Moshe Vardi)
26. When simple hash functions suffice (Kai-Min Chung, Michael Mitzenmacher, and Salil Vadhan)
27. Prior-independent auctions (Inbal Talgam-Cohen)
28. Distribution-free models of social networks (Tim Roughgarden and C. Seshadhri)
29. Data-driven algorithm design (Maria-Florina Balcan)
30. Algorithms with predictions (Michael Mitzenmacher and Sergei Vassilvitskii)

Reviews

'Many important algorithmic problems are considered intractable according to the conventional worst-case metrics of computational complexity theory. This important book demonstrates that, for many such problems, it is possible to craft algorithms that perform well under plausible assumptions about the structure of the inputs that are likely to be presented. It may well mark a turning point in the field of algorithm design and analysis.' Richard M. Karp, University of California at Berkeley
'Worst-case analysis sets a criterion for perfect algorithmic performance. It has led and will continue to lead to the creation of breakthrough algorithms unthinkable by previous generations. But the success of worst-case analysis as the main theoretical computing framework has also placed provably-good algorithm design in a quandary, because nearly all practically significant problems have been shown to be intractable under such perfect criteria. Going beyond worst-case analysis is a much-needed step for the theory of computing. This book - broad in scope and united by a common theme - represents diverse efforts in the field, and will elevate this fundamental subject for connecting computing theory with the rapid advances in Big Data and AI solutions.' Shang-Hua Teng, University of Southern California
'The book is a must have for any aspiring algorithm researcher … Essential.' D. Papamichail, Choice Magazine

Description

Introduces exciting new methods for assessing algorithms for problems ranging from clustering to linear programming to neural networks.