
A Probabilistic Theory of Pattern Recognition: Stochastic Modelling and Applied Probability, Book 31

Authors: Luc Devroye, László Györfi, Gábor Lugosi
English | Hardback – 4 Apr 1996
Pattern recognition presents one of the most significant challenges for scientists and engineers, and many different approaches have been proposed. The aim of this book is to provide a self-contained account of the probabilistic analysis of these approaches. The book includes a discussion of distance measures, nonparametric methods based on kernels or nearest neighbors, Vapnik-Chervonenkis theory, epsilon entropy, parametric classification, error estimation, tree classifiers, and neural networks. Wherever possible, distribution-free properties and inequalities are derived. A substantial portion of the results, or of the analysis, is new. Over 430 problems and exercises complement the material.
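To give a flavor of the nonparametric methods the book analyzes, here is a minimal sketch of the k-nearest neighbor rule in Python. The `knn_classify` helper and the toy data are hypothetical illustrations, not taken from the book:

```python
from collections import Counter
from math import dist

def knn_classify(train, x, k=3):
    """Classify x by majority vote among its k nearest training examples.

    `train` is a list of (point, label) pairs; distances are Euclidean.
    """
    neighbors = sorted(train, key=lambda pair: dist(pair[0], x))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy data: two well-separated clusters labeled 0 and 1
train = [((0.0, 0.0), 0), ((0.1, 0.2), 0), ((0.2, 0.1), 0),
         ((1.0, 1.0), 1), ((0.9, 1.1), 1), ((1.1, 0.9), 1)]
print(knn_classify(train, (0.15, 0.1)))   # -> 0
print(knn_classify(train, (1.05, 0.95)))  # -> 1
```

For a two-class problem an odd k avoids voting ties; the book studies when rules of this form are (universally) consistent as the sample size grows.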

All formats and editions (price / express delivery):

Paperback (1): Springer, 22 Nov 2013 – 589.62 lei, 6-8 weeks
Hardback (1): Springer, 4 Apr 1996 – 788.68 lei, 6-8 weeks

From the series Stochastic Modelling and Applied Probability

Price: 788.68 lei

Old price: 961.80 lei
-18% New

Express points: 1183

Estimated price in foreign currency:
€150.95 / $156.91 / £125.04

Printed on demand

Economy delivery: 5-19 February 2025

Order line: 021 569.72.76

Specifications

ISBN-13: 9780387946184
ISBN-10: 0387946187
Pages: 638
Illustrations: XV, 638 p.
Dimensions: 155 x 235 x 41 mm
Weight: 1.1 kg
Edition: 1996
Publisher: Springer
Collection: Springer
Series: Stochastic Modelling and Applied Probability

Place of publication: New York, NY, United States

Target audience

Research

Contents

* Preface
* Introduction
* The Bayes error
* Inequalities and alternate distance measures
* Linear discrimination
* Nearest neighbor rules
* Consistency
* Slow rates of convergence
* Error estimation
* The regular histogram rule
* Kernel rules
* Consistency of the k-nearest neighbor rule
* Vapnik-Chervonenkis theory
* Combinatorial aspects of Vapnik-Chervonenkis theory
* Lower bounds for empirical classifier selection
* The maximum likelihood principle
* Parametric classification
* Generalized linear discrimination
* Complexity regularization
* Condensed and edited nearest neighbor rules
* Tree classifiers
* Data-dependent partitioning
* Splitting the data
* The resubstitution estimate
* Deleted estimates of the error probability
* Automatic kernel rules
* Automatic nearest neighbor rules
* Hypercubes and discrete spaces
* Epsilon entropy and totally bounded sets
* Uniform laws of large numbers
* Neural networks
* Other error estimates
* Feature extraction
* Appendix
* Notation
* References
* Index