Improved Classification Rates for Localized Algorithms under Margin Conditions

Author: Ingrid Karin Blaschzyk
Language: English | Paperback – 19 March 2020
Support vector machines (SVMs) are among the most successful algorithms on small and medium-sized data sets, but on large-scale data sets their training and prediction become computationally infeasible. The author considers a spatially defined data-chunking method for large-scale learning problems, leading to so-called localized SVMs, and carries out an in-depth mathematical analysis with theoretical guarantees, which in particular include classification rates. The statistical analysis relies on a new and simple partitioning-based technique and takes into account well-known margin conditions that describe the behavior of the data-generating distribution. It turns out that the resulting rates outperform the known rates of several other learning algorithms under suitable sets of assumptions. From a practical point of view, the author shows that a common training and validation procedure achieves the theoretical rates adaptively, that is, without knowing the margin parameters in advance.
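To make the data-chunking idea concrete, here is a minimal, hypothetical Python sketch using scikit-learn: the input space is partitioned into cells (here Voronoi cells from k-means centers, an illustrative choice), one SVM is trained on the data of each cell, and each prediction is routed to its local model. This is only the general scheme described above, not the author's exact construction or the training/validation procedure analyzed in the book.

```python
# Minimal sketch of the spatial data-chunking idea behind localized SVMs:
# partition the input space into cells, train one SVM per cell, and route
# each test point to the SVM of the cell it falls into.  The Voronoi
# partition via k-means, the Gaussian (RBF) kernel, and all parameters are
# illustrative assumptions, not the exact construction analyzed in the book.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

class LocalizedSVM:
    def __init__(self, n_cells=10, **svm_params):
        self.n_cells = n_cells
        self.svm_params = svm_params

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.classes_ = np.unique(y)
        # Spatial chunking: the k-means centers induce a Voronoi partition.
        self.partition_ = KMeans(n_clusters=self.n_cells, n_init=10,
                                 random_state=0).fit(X)
        cells = self.partition_.labels_
        self.models_ = {}
        for c in np.unique(cells):
            Xc, yc = X[cells == c], y[cells == c]
            if np.unique(yc).size < 2:
                # Degenerate cell with a single class: store the constant label.
                self.models_[c] = yc[0]
            else:
                # Each cell gets its own SVM, trained only on the local data.
                self.models_[c] = SVC(kernel="rbf", **self.svm_params).fit(Xc, yc)
        return self

    def predict(self, X):
        X = np.asarray(X)
        cells = self.partition_.predict(X)
        y_pred = np.empty(len(X), dtype=self.classes_.dtype)
        for c in np.unique(cells):
            idx = cells == c
            model = self.models_.get(c, self.classes_[0])  # fallback for unseen cells
            y_pred[idx] = model.predict(X[idx]) if isinstance(model, SVC) else model
        return y_pred
```

Because SVM training scales superlinearly in the number of samples, training many small local models in this way is typically far cheaper than fitting one global SVM; the book's contribution, as summarized above, is the statistical analysis showing that under margin conditions such localization still admits classification-rate guarantees.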

Price: 378.12 lei

New

Express points: 567

Estimated price in foreign currency:
72.36€ / 75.45$ / 60.14£

Print-on-demand book

Economy delivery: 21 March - 4 April


Specifications

ISBN-13: 9783658295905
ISBN-10: 3658295902
Pages: 126
Illustrations: XV, 126 p., 5 illus. in color
Dimensions: 148 x 210 mm
Weight: 0.18 kg
Edition: 1st ed. 2020
Publisher: Springer Fachmedien Wiesbaden
Series: Springer Spektrum
Place of publication: Wiesbaden, Germany

Contents

Introduction to Statistical Learning Theory.- Histogram Rule: Oracle Inequality and Learning Rates.- Localized SVMs: Oracle Inequalities and Learning Rates.

About the author

Ingrid Karin Blaschzyk is a postdoctoral researcher in the Department of Mathematics at the University of Stuttgart, Germany.

Back cover text

Contents

  • Introduction to Statistical Learning Theory
  • Histogram Rule: Oracle Inequality and Learning Rates
  • Localized SVMs: Oracle Inequalities and Learning Rates
Target Groups

Researchers, students, and practitioners in the fields of mathematics and computer science who focus on machine learning or statistical learning theory
The Author

Ingrid Karin Blaschzyk is a postdoctoral researcher in the Department of Mathematics at the University of Stuttgart, Germany.

Features

  • Study in the field of natural sciences
  • Study in the field of statistical learning theory