Statistical Learning Theory and Stochastic Optimization: École d'Été de Probabilités de Saint-Flour XXXI - 2001: Lecture Notes in Mathematics, volume 1851
Author: Olivier Catoni · Edited by Jean Picard · Language: English · Paperback – 25 Aug 2004
From the series Lecture Notes in Mathematics
Price: 376.93 lei
New
Express points: 565
Estimated price in foreign currency:
72.13€ • 75.87$ • 60.09£
Printed on demand
Economy delivery: 04–18 January '25
Specifications
ISBN-13: 9783540225720
ISBN-10: 3540225722
Pages: 292
Illustrations: VIII, 284 p.
Dimensions: 155 x 235 x 15 mm
Weight: 0.41 kg
Edition: 2004
Publisher: Springer Berlin, Heidelberg
Collection: Springer
Series: Lecture Notes in Mathematics, École d'Été de Probabilités de Saint-Flour
Place of publication: Berlin, Heidelberg, Germany
Target audience
Research
Contents
- Universal Lossless Data Compression
- Links Between Data Compression and Statistical Estimation
- Non Cumulated Mean Risk
- Gibbs Estimators
- Randomized Estimators and Empirical Complexity
- Deviation Inequalities
- Markov Chains with Exponential Transitions
- References
- Index
Reviews
From the reviews:
"This book is based on a course of lectures given by the author on a circle of ideas lying at the interface of information theory, statistical learning theory and statistical interference. … The book is perhaps the first ever compendium of this circle of ideas and will be a valuable resource for researchers in information theory, statistical learning theory and statistical inference." (Vivek S. Borkar, Mathematical Reviews, Issue 2006 d)
"This book is based on a course of lectures given by the author on a circle of ideas lying at the interface of information theory, statistical learning theory and statistical interference. … The book is perhaps the first ever compendium of this circle of ideas and will be a valuable resource for researchers in information theory, statistical learning theory and statistical inference." (Vivek S. Borkar, Mathematical Reviews, Issue 2006 d)
Back cover text
Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, as is often done in practice, to use a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included; they are meant to provide a better understanding of the stochastic optimization algorithms commonly used to compute estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
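As a rough illustration of the kind of statement the blurb alludes to (a generic sketch in standard PAC-Bayesian notation, not the exact bounds or constants developed in the book): for an i.i.d. sample of size $n$, a loss bounded in $[0,1]$, a prior $\pi$ on the parameter space and any posterior $\rho$, a McAllester-type bound states that with probability at least $1-\delta$,
\[
\mathbb{E}_{\theta \sim \rho}\,[R(\theta)] \;\le\; \mathbb{E}_{\theta \sim \rho}\,[r(\theta)] \;+\; \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\!\big(2\sqrt{n}/\delta\big)}{2n}},
\]
where $R$ is the expected risk and $r$ the empirical risk. The posterior minimizing the trade-off $\mathbb{E}_{\rho}[r] + \mathrm{KL}(\rho\|\pi)/\beta$ at inverse temperature $\beta > 0$ is the Gibbs measure
\[
\rho_{\beta}(d\theta) \;\propto\; \exp\!\big(-\beta\, r(\theta)\big)\, \pi(d\theta),
\]
which is where entropy (through the Kullback–Leibler term) and Gibbs measures enter as the central technical tools the back cover refers to.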
Features
Includes supplementary material: sn.pub/extras