Learning Theory and Kernel Machines: 16th Annual Conference on Computational Learning Theory and 7th Kernel Workshop, COLT/Kernel 2003, Washington, DC, USA, August 24-27, 2003, Proceedings: Lecture Notes in Computer Science, cartea 2777

Edited by Bernhard Schölkopf, Manfred K. Warmuth
English, Paperback – 11 Aug 2003

From the series Lecture Notes in Computer Science

Price: 65747 lei

Old price: 82184 lei
-20% New

Express Points: 986

Estimated price in foreign currency:
12583€ 13070$ 10452£

Print-on-demand book

Economy delivery: 03-17 February 25

Specifications

ISBN-13: 9783540407201
ISBN-10: 3540407200
Pages: 768
Illustrations: XIV, 754 p.
Dimensions: 155 x 235 x 39 mm
Weight: 1.06 kg
Edition: 2003
Publisher: Springer Berlin, Heidelberg
Collection: Springer
Series: Lecture Notes in Computer Science, Lecture Notes in Artificial Intelligence
Place of publication: Berlin, Heidelberg, Germany

Target audience

Research

Contents

Target Area: Computational Game Theory
Tutorial: Learning Topics in Game-Theoretic Decision Making
A General Class of No-Regret Learning Algorithms and Game-Theoretic Equilibria
Preference Elicitation and Query Learning
Efficient Algorithms for Online Decision Problems
Positive Definite Rational Kernels
Bhattacharyya and Expected Likelihood Kernels
Maximal Margin Classification for Metric Spaces
Maximum Margin Algorithms with Boolean Kernels
Knowledge-Based Nonlinear Kernel Classifiers
Fast Kernels for Inexact String Matching
On Graph Kernels: Hardness Results and Efficient Alternatives
Kernels and Regularization on Graphs
Data-Dependent Bounds for Multi-category Classification Based on Convex Losses
Poster Session 1
Comparing Clusterings by the Variation of Information
Multiplicative Updates for Large Margin Classifiers
Simplified PAC-Bayesian Margin Bounds
Sparse Kernel Partial Least Squares Regression
Sparse Probability Regression by Label Partitioning
Learning with Rigorous Support Vector Machines
Robust Regression by Boosting the Median
Boosting with Diverse Base Classifiers
Reducing Kernel Matrix Diagonal Dominance Using Semi-definite Programming
Optimal Rates of Aggregation
Distance-Based Classification with Lipschitz Functions
Random Subclass Bounds
PAC-MDL Bounds
Universal Well-Calibrated Algorithm for On-Line Classification
Learning Probabilistic Linear-Threshold Classifiers via Selective Sampling
Learning Algorithms for Enclosing Points in Bregmanian Spheres
Internal Regret in On-Line Portfolio Selection
Lower Bounds on the Sample Complexity of Exploration in the Multi-armed Bandit Problem
Smooth ε-Insensitive Regression by Loss Symmetrization
On Finding Large Conjunctive Clusters
Learning Arithmetic Circuits via Partial Derivatives
Poster Session 2
Using a Linear Fit to Determine Monotonicity Directions
Generalization Bounds for Voting Classifiers Based on Sparsity and Clustering
Sequence Prediction Based on Monotone Complexity
How Many Strings Are Easy to Predict?
Polynomial Certificates for Propositional Classes
On-Line Learning with Imperfect Monitoring
Exploiting Task Relatedness for Multiple Task Learning
Approximate Equivalence of Markov Decision Processes
An Information Theoretic Tradeoff between Complexity and Accuracy
Learning Random Log-Depth Decision Trees under the Uniform Distribution
Projective DNF Formulae and Their Revision
Learning with Equivalence Constraints and the Relation to Multiclass Learning
Target Area: Natural Language Processing
Tutorial: Machine Learning Methods in Natural Language Processing
Learning from Uncertain Data
Learning and Parsing Stochastic Unification-Based Grammars
Generality’s Price
On Learning to Coordinate
Learning All Subfunctions of a Function
When Is Small Beautiful?
Learning a Function of r Relevant Variables
Subspace Detection: A Robust Statistics Formulation
How Fast Is k-Means?
Universal Coding of Zipf Distributions
An Open Problem Regarding the Convergence of Universal A Priori Probability
Entropy Bounds for Restricted Convex Hulls
Compressing to VC Dimension Many Points

Features

Includes supplementary material: sn.pub/extras