The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar
Author: Partha Niyogi · Language: English · Hardback – 30 Nov 1997
These two learning problems are seemingly very different. Neural networks are real-valued, infinite-dimensional, continuous mappings. Grammars, on the other hand, are boolean-valued, finite-dimensional, discrete (symbolic) mappings. Furthermore, the research communities that work in the two areas almost never overlap.
The book's objective is to bridge this gap. It uses the formal techniques developed in statistical learning theory and theoretical computer science over the last decade to analyze both kinds of learning problems. By asking the same question of both problems, namely how much information it takes to learn, it highlights their similarities and differences. Specific results include model selection in neural networks, active learning, language learning, and evolutionary models of language change.
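To make that contrast concrete, here is a minimal, hypothetical sketch (not taken from the book; the function names and parameter choices are assumptions for illustration only): a real-valued, continuous hypothesis of the kind analyzed in Chapter 2, alongside a boolean-valued, finitely parameterized toy grammar loosely in the spirit of Chapter 4.

```python
# Illustrative sketch only -- not the book's notation or code.
import numpy as np

def rbf_net(x, centers, weights, width=1.0):
    """A real-valued, continuous hypothesis: a radial basis function
    network mapping R -> R (the hypothesis class studied in Chapter 2)."""
    return sum(w * np.exp(-((x - c) ** 2) / (2 * width ** 2))
               for c, w in zip(centers, weights))

def parametric_grammar(sentence, params):
    """A boolean-valued, discrete hypothesis: a toy grammar indexed by a
    finite bit-vector, loosely in the spirit of the principles-and-parameters
    grammars of Chapter 4. Here one bit decides whether the verb precedes
    the object."""
    verb_first = params[0] == 1
    return (sentence.index("V") < sentence.index("O")) == verb_first

# The same question applies to both hypothesis spaces:
# how many examples does it take to pin down the target?
print(rbf_net(0.5, centers=[0.0, 1.0], weights=[1.0, 0.5]))   # a real number
print(parametric_grammar(["S", "V", "O"], params=[1]))        # True
```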
The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar is an interdisciplinary work. Anyone interested in the interaction of computer science and cognitive science should enjoy the book. Researchers in artificial intelligence, neural networks, linguistics, theoretical computer science, and statistics will find it particularly relevant.
All formats and editions | Price | Delivery
---|---|---
Paperback (1) – Springer US, 16 Oct 2012 | 624.88 lei | 6-8 weeks
Hardback (1) – Springer US, 30 Nov 1997 | 632.91 lei | 6-8 weeks
Price: 632.91 lei
Old price: 791.13 lei
-20% New
Express points: 949
Estimated price in other currencies:
121.14€ • 126.26$ • 100.84£
Printed on demand
Economy delivery: 04-18 January 25
Orders by phone: 021 569.72.76
Specifications
ISBN-13: 9780792380818
ISBN-10: 0792380819
Pages: 224
Illustrations: XXIII, 224 p.
Dimensions: 155 x 235 x 20 mm
Weight: 0.59 kg
Edition: 1998
Publisher: Springer US
Series: Springer
Place of publication: New York, NY, United States
Target audience
Research

Contents
1. Introduction
1.1 The Components of a Learning Paradigm
1.2 Parametric Hypothesis Spaces
1.3 Technical Contents and Major Contributions
2. Generalization Error for Neural Nets
2.1 Introduction
2.2 Definitions and Statement of the Problem
2.3 Stating the Problem for Radial Basis Functions
2.4 Main Result
2.5 Remarks
2.6 Implications of the Theorem in Practice: Putting in the Numbers
2.7 Conclusion
2-A Notations
2-B A Useful Decomposition of the Expected Risk
2-C A Useful Inequality
2-D Proof of the Main Theorem
3. Active Learning
3.1 A General Framework for Active Approximation
3.2 Example 1: A Class of Monotonically Increasing Bounded Functions
3.3 Example 2: A Class of Functions with Bounded First Derivative
3.4 Conclusions, Extensions, and Open Problems
3.5 A Simple Example
3.6 Generalizations
4. Language Learning
4.1 Language Learning and the Poverty of Stimulus
4.2 Constrained Grammars: Principles and Parameters
4.3 Learning in the Principles and Parameters Framework
4.4 Formal Analysis of the Triggering Learning Algorithm
4.5 Characterizing Convergence Times for the Markov Chain Model
4.6 Exploring Other Points
4.7 Batch Learning Upper and Lower Bounds: An Aside
4.8 Conclusions, Open Questions, and Future Directions
4-A Unembedded Sentences for Parametric Grammars
4-B Memoryless Algorithms and Markov Chains
4-C Proof of Learnability Theorem
4-D Formal Proof
5. Language Change
5.1 Introduction
5.2 Language Change in Parametric Systems
5.3 Example 1: A Three Parameter System
5.4 Example 2: The Case of Modern French
5.5 Conclusions
6. Conclusions
6.1 Emergent Themes
6.2 Extensions
6.3 A Concluding Note
References