Convex Optimization with Computational Errors (Springer Optimization and Its Applications, Book 155)
Author: Alexander J. Zaslavski | Language: English | Paperback – Feb 2021
The main difference between this new book and the author's 2016 book is that the present book takes into consideration the fact that, for every algorithm, an iteration consists of several steps and that the computational errors of different steps are, in general, different. This fact, which was not taken into account in the previous book, is important in practice. For example, the subgradient projection algorithm consists of two steps. The first step is a calculation of a subgradient of the objective function, while in the second we calculate a projection onto the feasible set. Each of these two steps carries a computational error, and these two errors are, in general, different.
It may happen that the feasible set is simple while the objective function is complicated. As a result, the computational error made when one calculates the projection is essentially smaller than the computational error of the calculation of the subgradient. Clearly, the opposite case is possible too. Another feature of this book is the study of a number of important algorithms which appeared recently in the literature and are not discussed in the previous book.
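As a rough illustration of this two-step error model, the sketch below runs the subgradient projection method with the subgradient error and the projection error modeled separately, with the projection error taken much smaller than the subgradient error, as in the situation just described. This is a minimal sketch under assumed choices, not code from the book: the test problem, the step size, and the error model (a perturbation of prescribed norm after each step) are all illustrative.

```python
import numpy as np

def perturb(v, eps, rng):
    """Model a computational error: add a perturbation of norm eps."""
    d = rng.standard_normal(v.shape)
    return v + eps * d / np.linalg.norm(d)

def subgradient_projection(subgrad, project, x0, step, n_iters,
                           eps_subgrad=1e-2, eps_proj=1e-4, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        # Step 1: inexact subgradient of the objective (error eps_subgrad).
        g = perturb(subgrad(x), eps_subgrad, rng)
        # Step 2: inexact projection onto the feasible set (error eps_proj,
        # much smaller here because the set is assumed to be simple).
        x = perturb(project(x - step * g), eps_proj, rng)
    return x

# Illustrative problem: minimize the nonsmooth f(x) = ||x - c||_1
# over the Euclidean unit ball.
c = np.array([2.0, -1.0, 0.5])
subgrad = lambda x: np.sign(x - c)                   # a subgradient of f
project = lambda z: z / max(1.0, np.linalg.norm(z))  # exact ball projection

print(subgradient_projection(subgrad, project,
                             x0=np.zeros(3), step=0.05, n_iters=500))
```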
This monograph contains 12 chapters. Chapter 1 is an introduction. In Chapter 2 we study the subgradient projection algorithm for minimization of convex and nonsmooth functions. We generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 3 we analyze the mirror descent algorithm for minimization of convex and nonsmooth functions in the presence of computational errors. For this algorithm, each iteration consists of two steps. The first step is a calculation of a subgradient of the objective function, while in the second we solve an auxiliary minimization problem on the set of feasible points. Each of these two steps carries a computational error. We generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 4 we analyze the projected gradient algorithm with a smooth objective function in the presence of computational errors. In Chapter 5 we consider an algorithm which is an extension of the projected gradient algorithm and is used for solving linear inverse problems arising in signal/image processing. In Chapter 6 we study the continuous subgradient method and the continuous subgradient projection algorithm for minimization of convex nonsmooth functions and for computing the saddle points of convex-concave functions, in the presence of computational errors. None of the results of this chapter has a prototype in [NOCE]. In Chapters 7-12 we analyze, in the presence of computational errors, several algorithms that were not considered in [NOCE]. Again, each step of an iteration carries a computational error, and we take into account that these errors are, in general, different. An optimization problem with a composite objective function is studied in Chapter 7. A zero-sum game with two players is considered in Chapter 8. A predicted-decrease-approximation-based method for constrained convex optimization is used in Chapter 9. Chapter 10 is devoted to minimization of quasiconvex functions. Minimization of sharp weakly convex functions is discussed in Chapter 11. Chapter 12 is devoted to a generalized projected subgradient method for minimization of a convex function over a set which is not necessarily convex.
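In the same spirit, here is a minimal sketch of the two-step mirror descent iteration described above, on the probability simplex with the entropy mirror map, where the auxiliary minimization problem has a closed-form solution; both steps carry their own error. The problem, step size, and error bounds are illustrative assumptions, not material from the book.

```python
import numpy as np

def mirror_descent_simplex(subgrad, x0, step, n_iters,
                           eps_subgrad=1e-2, eps_aux=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        # Step 1: inexact subgradient (perturbation of norm eps_subgrad).
        g = subgrad(x)
        d = rng.standard_normal(g.shape)
        g = g + eps_subgrad * d / np.linalg.norm(d)
        # Step 2: inexactly solve the auxiliary problem
        #   min_y <g, y> + (1/step) * KL(y, x)  over the simplex,
        # whose exact solution is the normalized exponentiated update.
        y = x * np.exp(-step * g)
        y = y / y.sum()
        y = np.maximum(y + eps_aux * rng.standard_normal(y.shape), 1e-12)
        x = y / y.sum()  # renormalize so the iterate stays on the simplex
    return x

# Illustrative problem: minimize f(x) = max_i (A x)_i over the simplex.
A = np.array([[1.0, 2.0, 0.5],
              [0.0, 1.0, 3.0]])
def subgrad(x):
    return A[int(np.argmax(A @ x))]  # a subgradient of the max function

x0 = np.ones(3) / 3
print(mirror_descent_simplex(subgrad, x0, step=0.1, n_iters=300))
```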
The book is of interest to researchers and engineers working in optimization, and can also be useful for graduate courses. The main feature of the book that appeals specifically to this audience is the study of the influence of computational errors on several important optimization algorithms. The book is also of interest to experts in applications of optimization to engineering and economics.
| All formats and editions | Price | Delivery |
|---|---|---|
| Paperback (1) | 572.54 lei | 43-57 days |
| Springer International Publishing – Feb 2021 | 572.54 lei | 43-57 days |
| Hardback (1) | 526.47 lei | 38-44 days |
| Springer International Publishing – Feb 2020 | 526.47 lei | 38-44 days |
Price: 572.54 lei
Old price: 673.58 lei
-15% New
Express points: 859
Estimated price in foreign currencies:
109.58€ • 114.21$ • 91.22£
Print-on-demand book
Economy delivery 06-20 January 25
Specifications
ISBN-13: 9783030378240
ISBN-10: 3030378241
Pages: 360
Illustrations: XI, 360 p. 150 illus.
Dimensions: 155 x 235 x 23 mm
Weight: 0.57 kg
Edition: 1st ed. 2020
Publisher: Springer International Publishing
Collection: Springer
Series: Springer Optimization and Its Applications
Place of publication: Cham, Switzerland
Contents
Preface.- 1. Introduction.- 2. Subgradient Projection Algorithm.- 3. The Mirror Descent Algorithm.- 4. Gradient Algorithm with a Smooth Objective Function.- 5. An Extension of the Gradient Algorithm.- 6. Continuous Subgradient Method.- 7. An Optimization Problem with a Composite Objective Function.- 8. A Zero-Sum Game with Two Players.- 9. PDA-Based Method for Convex Optimization.- 10. Minimization of Quasiconvex Functions.- 11. Minimization of Sharp Weakly Convex Functions.- 12. A Projected Subgradient Method for Nonsmooth Problems.- References.- Index.
Biographical note
Alexander J. Zaslavski is a professor in the Department of Mathematics, Technion-Israel Institute of Technology, Haifa, Israel.
Back cover text
This book studies approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are well known as important tools for solving optimization problems. The research presented continues that of the author's 2016 book Numerical Optimization with Computational Errors. Both books study algorithms while taking into account the computational errors that are always present in practice. The main goal is, given a known computational error, to determine what approximate solution can be obtained and how many iterations are needed to obtain it.
The discussion takes into consideration that, for every algorithm, an iteration consists of several steps, and that the computational errors of different steps are generally different. This fact, which was not accounted for in the previous book, is indeed important in practice. For example, the subgradient projection algorithm consists of two steps: a calculation of a subgradient of the objective function and a calculation of a projection onto the feasible set. Each of these two steps carries a computational error, and these two errors are generally different.
The book is of interest to researchers and engineers working in optimization. It can also be useful for graduate courses. Its main feature, the study of the influence of computational errors on several important optimization algorithms, will appeal both to this audience and to experts in applications of optimization to engineering and economics.
Features
- Studies the influence of computational errors in numerical optimization, for minimization problems on unbounded sets and zero-sum games with two players
- Explains that, for every algorithm, each iteration consists of several steps and that the computational errors of different steps are generally different
- Provides modern and interesting developments in the field