Algorithmic Learning Theory: 7th International Workshop, ALT '96, Sydney, Australia, October 23-25, 1996. Proceedings: Lecture Notes in Computer Science, Volume 1160
Edited by Setsuo Arikawa and Arun K. Sharma. In English. Paperback, 9 October 1996
The 16 revised full papers presented were selected from 41 submissions; also included are eight short papers, four full-length invited contributions by Ross Quinlan, Takeshi Shinohara, Leslie Valiant, and Paul Vitanyi, and an introduction by the volume editors. The book covers all areas related to algorithmic learning theory, ranging from the theoretical foundations of machine learning to applications in several domains.
Price: 326.13 lei
Old price: 407.66 lei
-20% New
Express Points: 489
Estimated price in foreign currency:
62.42€ • 65.06$ • 51.96£
Printed on demand
Economy delivery: 06-20 January 25
Specifications
ISBN-13: 9783540618638
ISBN-10: 3540618635
Pages: 364
Illustrations: XVII, 337 p.
Dimensions: 155 x 235 x 19 mm
Weight: 0.51 kg
Edition: 1996
Publisher: Springer Berlin, Heidelberg
Collection: Springer
Series: Lecture Notes in Computer Science, Lecture Notes in Artificial Intelligence
Place of publication: Berlin, Heidelberg, Germany
Target audience
Research
Contents
- Managing complexity in neuroidal circuits
- Learnability of exclusive-or expansion based on monotone DNF formulas
- Improved bounds about on-line learning of smooth functions of a single variable
- Query learning of bounded-width OBDDs
- Learning a representation for optimizable formulas
- Limits of exact algorithms for inference of minimum size finite state machines
- Genetic fitness optimization using rapidly mixing Markov chains
- The kindest cut: Minimum message length segmentation
- Reducing complexity of decision trees with two variable tests
- The complexity of exactly learning algebraic concepts
- Efficient learning of real time two-counter automata
- Cost-sensitive feature reduction applied to a hybrid genetic algorithm
- Effects of feature selection with ‘Blurring’ on neurofuzzy systems
- Boosting first-order learning
- Incorporating hypothetical knowledge into the process of inductive synthesis
- Induction of constraint logic programs
- Constructive learning of translations based on dictionaries
- Inductive logic programming beyond logical implication
- Noise elimination in inductive concept learning: A case study in medical diagnosis
- MML estimation of the parameters of the spherical Fisher distribution
- Learning by erasing
- On learning and co-learning of minimal programs
- Inductive inference of unbounded unions of pattern languages from positive data
- A class of Prolog programs inferable from positive data
- Vacillatory and BC learning on noisy data
- Transformations that preserve learnability
- Probabilistic limit identification up to “small” sets
- Reflecting inductive inference machines and its improvement by therapy