
Numerical Optimization with Computational Errors: Springer Optimization and Its Applications, Book 108

Author: Alexander J. Zaslavski
English | Hardback – 3 May 2016
This book studies approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking computational errors into account. The author shows that the algorithms generate a good approximate solution if the computational errors are bounded from above by a small positive constant. Known computational errors are also examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative.
 
This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods, and Newton's method.
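The claim above, that these methods still return a good approximate solution when every step carries a small bounded error, can be illustrated with a minimal sketch. The quadratic objective, the error model, and the name noisy_gradient_descent below are illustrative assumptions, not code from the book:

    import numpy as np

    def noisy_gradient_descent(grad, x0, step, n_iters, delta, rng):
        # Gradient descent in which every gradient evaluation carries an
        # additive computational error of norm at most delta.
        x = np.asarray(x0, dtype=float)
        for _ in range(n_iters):
            e = rng.normal(size=x.shape)
            e *= delta / max(np.linalg.norm(e), 1e-12)  # scale error to norm delta
            x = x - step * (grad(x) + e)
        return x

    # Minimize f(x) = ||x||^2 / 2, whose exact minimizer is the origin.
    rng = np.random.default_rng(0)
    x_hat = noisy_gradient_descent(grad=lambda x: x, x0=np.ones(5),
                                   step=0.1, n_iters=1000, delta=1e-3, rng=rng)
    print(np.linalg.norm(x_hat))  # stays within a delta-sized neighborhood of 0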
  

All Formats and Editions

Format         Edition                                          Price       Express delivery
Paperback (1)  Springer International Publishing, 27 May 2018   535.76 lei  38-44 days
Hardback (1)   Springer International Publishing, 3 May 2016    629.02 lei  3-5 weeks

From the series Springer Optimization and Its Applications

Price: 629.02 lei

Old price: 740.03 lei
-15% New

Express Points: 944

Estimated price in foreign currency:
120.39€  125.14$  99.73£

Book available

Economy delivery 14-28 January 25

Orders by phone: 021 569.72.76

Specifications

ISBN-13: 9783319309200
ISBN-10: 331930920X
Pages: 304
Illustrations: IX, 304 p.
Dimensions: 155 x 235 x 19 mm
Weight: 0.62 kg
Edition: 1st ed. 2016
Publisher: Springer International Publishing
Collection: Springer
Series: Springer Optimization and Its Applications

Place of publication: Cham, Switzerland

Contents

1. Introduction.- 2. Subgradient Projection Algorithm.- 3. The Mirror Descent Algorithm.- 4. Gradient Algorithm with a Smooth Objective Function.- 5. An Extension of the Gradient Algorithm.- 6. Weiszfeld's Method.- 7. The Extragradient Method for Convex Optimization.- 8. A Projected Subgradient Method for Nonsmooth Problems.- 9. Proximal Point Method in Hilbert Spaces.- 10. Proximal Point Methods in Metric Spaces.- 11. Maximal Monotone Operators and the Proximal Point Algorithm.- 12. The Extragradient Method for Solving Variational Inequalities.- 13. A Common Solution of a Family of Variational Inequalities.- 14. Continuous Subgradient Method.- 15. Penalty Methods.- 16. Newton's Method.- References.- Index.
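Chapter 6 treats Weiszfeld's method, which locates the geometric median of a set of points by repeatedly re-weighting them by inverse distance. A minimal sketch of the classical error-free iteration follows; the function name and test data are illustrative, not taken from the book:

    import numpy as np

    def weiszfeld(points, n_iters=100, eps=1e-9):
        # Weiszfeld's fixed-point iteration for the geometric median.
        x = points.mean(axis=0)  # start from the centroid
        for _ in range(n_iters):
            d = np.maximum(np.linalg.norm(points - x, axis=1), eps)  # guard /0
            w = 1.0 / d
            x = (w[:, None] * points).sum(axis=0) / w.sum()
        return x

    corners = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0], [4.0, 3.0]])
    print(weiszfeld(corners))  # by symmetry the median is the center, [2.0, 1.5]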

Reviews

“The author studies the approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space. Researchers and students will find this book instructive and informative. The book contains 16 chapters … .” (Hans Benker, zbMATH 1347.65112, 2016)

Back Cover Text

This book studies approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking computational errors into account. The author shows that the algorithms generate a good approximate solution if the computational errors are bounded from above by a small positive constant. Known computational errors are also examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative.


This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods, and Newton's method.
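The proximal point method mentioned above advances by solving a regularized subproblem at each step; when the proximal operator is available in closed form, the method reduces to a fixed-point loop. A minimal sketch, where the absolute-value objective and the names below are assumptions made purely for illustration:

    import numpy as np

    def proximal_point(prox, x0, n_iters):
        # Proximal point method: x_{k+1} = argmin_y f(y) + ||y - x_k||^2 / (2*lam),
        # expressed through the proximal operator prox.
        x = np.asarray(x0, dtype=float)
        for _ in range(n_iters):
            x = prox(x)
        return x

    # For f(x) = |x| with parameter lam, the prox is soft-thresholding.
    lam = 0.5
    prox_abs = lambda x: np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
    print(proximal_point(prox_abs, x0=np.array([3.0]), n_iters=20))  # -> [0.]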

Features

Examines approximate solutions of optimization problems in the presence of computational errors
Reinforces basic principles with an introductory chapter
Analyzes the gradient projection algorithm for minimization of convex and smooth functions
Includes supplementary material: sn.pub/extras