
Combining Artificial Neural Nets: Ensemble and Modular Multi-Net Systems: Perspectives in Neural Computing

Edited by Amanda J.C. Sharkey
English, Paperback – 22 January 1999
The past decade could be seen as the heyday of neurocomputing, in which the capabilities of monolithic nets have been well explored and exploited. The question, then, is where do we go from here? A logical next step is to examine the potential offered by combinations of artificial neural nets, and it is that step that the chapters in this volume represent. Intuitively, it makes sense to look at combining ANNs. Clearly, complex biological systems and brains rely on modularity. Similarly, the principles of modularity, and of reliability through redundancy, can be found in many disparate areas, from the idea of decision by jury, through to hardware redundancy in aeroplanes, and the advantages of modular design and reuse advocated by object-oriented programmers. And it is not surprising to find that the same principles can be usefully applied in the field of neurocomputing as well, although finding the best way of adapting them is a subject of ongoing research.
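The "reliability through redundancy" idea the description appeals to can be illustrated with a minimal sketch (not taken from the book; the noise level, ensemble size, and stand-in estimators are illustrative assumptions): averaging several independently erring predictors of the same quantity reduces the error variance roughly in proportion to the ensemble size.

```python
# Minimal sketch (illustrative assumptions, not the book's code): averaging
# the outputs of several independently trained estimators reduces variance,
# the statistical idea behind "reliability through redundancy" in ensembles.
import numpy as np

rng = np.random.default_rng(0)
true_value = 1.0
n_members, n_trials = 10, 10_000

# Each "member" stands in for one trained net's noisy prediction of the
# same target on the same input.
member_estimates = true_value + rng.normal(0.0, 0.5, size=(n_trials, n_members))

single_mse = np.mean((member_estimates[:, 0] - true_value) ** 2)
ensemble_mse = np.mean((member_estimates.mean(axis=1) - true_value) ** 2)

print(f"single-net MSE:  {single_mse:.4f}")    # ~0.25
print(f"10-net ensemble: {ensemble_mse:.4f}")  # ~0.025 when errors are independent
```

The gain shown here assumes the members' errors are independent; in practice correlated errors shrink the benefit, which is why several chapters of the book address decorrelation and collinearity.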

From the series Perspectives in Neural Computing

Price: 647.79 lei

Old price: 809.73 lei
-20% New

Express points: 972

Estimated price in foreign currency:
123.97€ 128.64$ 103.62£

Printed on demand

Economy delivery: 15-29 March

Phone orders: 021 569.72.76

Specifications

ISBN-13: 9781852330040
ISBN-10: 185233004X
Pages: 316
Illustrations: XV, 298 p. 6 illus.
Dimensions: 155 x 235 x 17 mm
Weight: 0.48 kg
Edition: 1st Edition
Publisher: SPRINGER LONDON
Collection: Springer
Series: Perspectives in Neural Computing

Place of publication: London, United Kingdom

Target audience

Professional/practitioner

Contents

1. Multi-Net Systems.- 1.0.1 Different Forms of Multi-Net System.- 1.1 Ensembles.- 1.2 Modular Approaches.- 1.3 The Chapters in this Book.- 1.4 References.
2. Combining Predictors.- 2.1 Combine and Conquer.- 2.2 Regression.- 2.3 Classification.- 2.4 Remarks.- 2.5 Adaboost and Arcing.- 2.6 Recent Research.- 2.7 Coda.- 2.8 References.
3. Boosting Using Neural Networks.- 3.1 Introduction.- 3.2 Bagging.- 3.3 Boosting.- 3.4 Other Ensemble Techniques.- 3.5 Neural Networks.- 3.6 Trees.- 3.7 Trees vs. Neural Nets.- 3.8 Experiments.- 3.9 Conclusions.- 3.10 References.
4. A Genetic Algorithm Approach for Creating Neural Network Ensembles.- 4.1 Introduction.- 4.2 Neural Network Ensembles.- 4.3 The ADDEMUP Algorithm.- 4.4 Experimental Study.- 4.5 Discussion and Future Work.- 4.6 Additional Related Work.- 4.7 Conclusions.- 4.8 References.
5. Treating Harmful Collinearity in Neural Network Ensembles.- 5.1 Introduction.- 5.2 Overview of Optimal Linear Combinations (OLC) of Neural Networks.- 5.3 Effects of Collinearity on Combining Neural Networks.- 5.4 Improving the Generalisation of NN Ensembles by Treating Harmful Collinearity.- 5.5 Experimental Results.- 5.6 Concluding Remarks.- 5.7 References.
6. Linear and Order Statistics Combiners for Pattern Classification.- 6.1 Introduction.- 6.2 Class Boundary Analysis and Error Regions.- 6.3 Linear Combining.- 6.4 Order Statistics.- 6.5 Correlated Classifier Combining.- 6.6 Experimental Combining Results.- 6.7 Discussion.- 6.8 References.
7. Variance Reduction via Noise and Bias Constraints.- 7.1 Introduction.- 7.2 Theoretical Considerations.- 7.3 The Bootstrap Ensemble with Noise Algorithm.- 7.4 Results on the Two-Spirals Problem.- 7.5 Discussion.- 7.6 References.
8. A Comparison of Visual Cue Combination Models.- 8.1 Introduction.- 8.2 Stimulus.- 8.3 Tasks.- 8.4 Models of Cue Combination.- 8.5 Simulation Results.- 8.6 Summary.- 8.7 References.
9. Model Selection of Combined Neural Nets for Speech Recognition.- 9.1 Introduction.- 9.2 The Acoustic Mapping.- 9.3 Network Architectures.- 9.4 Experimental Environment.- 9.5 Bootstrap Estimates and Model Selection.- 9.6 Normalisation Results.- 9.7 Continuous Digit Recognition Over the Telephone Network.- 9.8 Conclusions.- 9.9 References.
10. Self-Organised Modular Neural Networks for Encoding Data.- 10.1 Introduction.- 10.2 Basic Theoretical Framework.- 10.3 Circular Manifold.- 10.4 Toroidal Manifold: Factorial Encoding.- 10.5 Asymptotic Results.- 10.6 Approximate the Posterior Probability.- 10.7 Joint Versus Factorial Encoding.- 10.8 Conclusions.- 10.9 References.
11. Mixtures of X.- 11.1 Introduction.- 11.2 Mixtures of X.- 11.3 Summary.- 11.4 References.

Features

There are no other books covering both modular and ensemble approaches. (The ensemble approach uses a variety of methods to create a set of different nets trained on the same task; the modular approach decomposes a task into simpler problems; see the sketch after this list.)
The presentation of techniques is accompanied by analysis and evaluation of their relative effectiveness on a variety of problems.
The book focuses on the combination of neural nets, but many of the methods are applicable to a wider variety of statistical methods.
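To make the ensemble/modular distinction concrete, here is a minimal sketch; the stand-in nets, the gating rule, and the two-class outputs are invented for illustration and are not taken from the book.

```python
# Minimal sketch (illustrative, not the book's code) contrasting the two
# kinds of multi-net system, using plain callables as stand-ins for nets.
import numpy as np

# Ensemble: several nets trained on the SAME task; combine by averaging.
def ensemble_predict(nets, x):
    """Average the class-probability outputs of all member nets."""
    return np.mean([net(x) for net in nets], axis=0)

# Modular: the task is DECOMPOSED; a gating module routes each input
# to the specialist that handles that subproblem.
def modular_predict(gate, experts, x):
    """Return the output of the expert selected by the gate."""
    return experts[gate(x)](x)

# Hypothetical stand-ins for trained nets (a real system would plug in
# trained models sharing the same predict interface).
nets = [lambda x, b=b: np.array([0.5 + b, 0.5 - b]) for b in (0.1, -0.05, 0.02)]
experts = {"small": lambda x: np.array([0.9, 0.1]),
           "large": lambda x: np.array([0.2, 0.8])}
gate = lambda x: "small" if x < 0.5 else "large"

print(ensemble_predict(nets, 0.3))         # averaged class probabilities
print(modular_predict(gate, experts, 0.7)) # the "large" expert's output
```

The averaging combiner corresponds to the linear-combination methods discussed in the ensemble chapters, while the gate-plus-experts structure is the simplest form of the modular decomposition the book contrasts it with.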