
Empirical Inference: Festschrift in Honor of Vladimir N. Vapnik

Edited by Bernhard Schölkopf, Zhiyuan Luo, Vladimir Vovk
English, Hardback – 2 Jan 2014
This book honours the outstanding contributions of Vladimir Vapnik, a rare example of a scientist for whom the following statements hold true simultaneously: his work led to the inception of a new field of research, the theory of statistical learning and empirical inference; he has lived to see the field blossom; and he is still as active as ever. He started analyzing learning algorithms in the 1960s and he invented the first version of the generalized portrait algorithm. He later developed one of the most successful methods in machine learning, the support vector machine (SVM) – more than just an algorithm, this was a new approach to learning problems, pioneering the use of functional analysis and convex optimization in machine learning.
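As a brief aside not taken from the book itself: the blurb's point that the SVM poses learning as convex optimization can be made concrete with the soft-margin linear SVM, which minimizes the regularized hinge loss (lambda/2)*||w||^2 + (1/n) * sum_i max(0, 1 - y_i*<w, x_i>). Below is a minimal plain-NumPy sketch that minimizes this objective by subgradient descent; the toy data, step size, and regularization constant are illustrative assumptions, not anything specified on this page.

import numpy as np

def train_linear_svm(X, y, lam=0.1, lr=0.01, epochs=200):
    """Subgradient descent on the regularized hinge loss.

    X : (n, d) feature matrix; y : (n,) labels in {-1, +1}.
    Returns the weight vector w of a separating hyperplane through the origin.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        margins = y * (X @ w)          # y_i * <w, x_i> for every sample
        active = margins < 1           # samples that violate the margin
        # Subgradient: lam*w from the regularizer, minus the average of
        # y_i * x_i over the margin-violating samples from the hinge term.
        grad = lam * w - (y[active, None] * X[active]).sum(axis=0) / n
        w -= lr * grad
    return w

# Toy usage: two linearly separable point clouds centred at (-2,-2) and (2,2).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)
w = train_linear_svm(X, y)
print("training accuracy:", np.mean(np.sign(X @ w) == y))

With no bias term the hyperplane passes through the origin, which suffices for this centred toy problem; a production implementation would add a bias and use a dedicated solver.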
 
Part I of this book contains three chapters describing and bearing witness to some of Vladimir Vapnik's contributions to science. In the first chapter, Léon Bottou discusses the seminal paper published in 1968 by Vapnik and Chervonenkis that laid the foundations of statistical learning theory, and the second chapter is an English-language translation of that original paper. In the third chapter, Alexey Chervonenkis gives a first-hand account of the early history of SVMs, with valuable insights into the first steps in the development of the SVM within the framework of the generalized portrait method.
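For context, and as an addition not taken from the book itself: the result announced in that 1968 note (and proved in full in 1971) is today stated as a uniform law of large numbers. In one standard textbook form, writing $\nu_n(A)$ for the empirical frequency of an event $A$ over $n$ samples and $S_{\mathcal{A}}(n)$ for the growth function of the event class $\mathcal{A}$ (the exact constants vary between sources),

\[
\Pr\Big(\sup_{A \in \mathcal{A}} \big|\nu_n(A) - P(A)\big| > \varepsilon\Big) \;\le\; 4\, S_{\mathcal{A}}(2n)\, e^{-n\varepsilon^2/8},
\]

so frequencies converge to probabilities uniformly over $\mathcal{A}$ whenever the growth function grows only polynomially in $n$.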
 
The remaining chapters, by leading scientists in domains such as statistics, theoretical computer science, and mathematics, address substantial topics in the theory and practice of statistical learning, including SVMs and other kernel-based methods, boosting, PAC-Bayesian theory, online and transductive learning, loss functions, learnable function classes, notions of complexity for function classes, multitask learning, and hypothesis selection. These contributions include historical and contextual notes, short surveys, and comments on future research directions.
 
This book will be of interest to researchers, engineers, and graduate students engaged with all aspects of statistical learning.

All formats and editions

Format         Edition                                      Price      Delivery
Paperback (1)  Springer Berlin, Heidelberg – 23 Aug 2016    32052 lei  6-8 weeks
Hardback (1)   Springer Berlin, Heidelberg – 2 Jan 2014     32513 lei  6-8 weeks

Price: 32513 lei

Old price: 40641 lei
-20% New

Express points: 488

Estimated price in foreign currency:
6223€  6565$  5186£

Print-on-demand title

Economy delivery 02-16 January 25

Order line: 021 569.72.76

Specifications

ISBN-13: 9783642411359
ISBN-10: 3642411355
Pages: 306
Illustrations: XIX, 287 p. 33 illus., 26 illus. in color.
Dimensions: 155 x 235 x 22 mm
Weight: 0.57 kg
Edition: 2013
Publisher: Springer Berlin, Heidelberg
Series: Springer
Place of publication: Berlin, Heidelberg, Germany

Target audience

Research

Contents

Part I - History of Statistical Learning Theory
Chap. 1 - In Hindsight: Doklady Akademii Nauk SSSR, 181(4), 1968
Chap. 2 - On the Uniform Convergence of the Frequencies of Occurrence of Events to Their Probabilities
Chap. 3 - Early History of Support Vector Machines
Part II - Theory and Practice of Statistical Learning Theory
Chap. 4 - Some Remarks on the Statistical Analysis of SVMs and Related Methods
Chap. 5 - Explaining AdaBoost
Chap. 6 - On the Relations and Differences Between Popper Dimension, Exclusion Dimension and VC-Dimension
Chap. 7 - On Learnability, Complexity and Stability
Chap. 8 - Loss Functions
Chap. 9 - Statistical Learning Theory in Practice
Chap. 10 - PAC-Bayesian Theory
Chap. 11 - Kernel Ridge Regression
Chap. 12 - Multi-task Learning for Computational Biology: Overview and Outlook
Chap. 13 - Semi-supervised Learning in Causal and Anticausal Settings
Chap. 14 - Strong Universal Consistent Estimate of the Minimum Mean-Squared Error
Chap. 15 - The Median Hypothesis
Chap. 16 - Efficient Transductive Online Learning via Randomized Rounding
Chap. 17 - Pivotal Estimation in High-Dimensional Regression via Linear Programming
Chap. 18 - Some Observations on Sparsity Inducing Regularization Methods for Machine Learning
Chap. 19 - Sharp Oracle Inequalities in Low Rank Estimation
Chap. 20 - On the Consistency of the Bootstrap Approach for Support Vector Machines and Related Kernel-Based Methods
Chap. 21 - Kernels, Pre-images and Optimization
Chap. 22 - Efficient Learning of Sparse Ranking Functions
Chap. 23 - Direct Approximation of Divergences Between Probability Distributions
Index


Features

Honours one of the pioneers of machine learning
Contributing authors are among the leading authorities in these domains
Of interest to researchers and engineers in the fields of machine learning, statistics, and optimization
Includes supplementary material: sn.pub/extras