
Dealing with Complexity: A Neural Networks Approach: Perspectives in Neural Computing

Edited by Mirek Karny, Kevin Warwick, Vera Kurkova
English-language Paperback – 7 Nov 1997
In almost all areas of science and engineering, the use of computers and microcomputers has, in recent years, transformed entire subject areas. What was not even considered possible a decade or two ago is now not only possible but is also part of everyday practice. As a result, a new approach usually needs to be taken in order to get the best out of a situation. What is required now is a computer's-eye view of the world. However, all is not rosy in this new world. Humans tend to think in two or three dimensions at most, whereas computers can, without complaint, work in n dimensions, where n, in practice, gets bigger each year. As a result, more complex problem solutions are being attempted, whether or not the problems themselves are inherently complex. If information is available, it might as well be used, but what can be done with it? Straightforward, traditional computational solutions to this new problem of complexity can, and usually do, produce very unsatisfactory, unreliable and even unworkable results. Recently, however, artificial neural networks, which have been found to be very versatile and powerful when dealing with difficulties such as nonlinearities, multivariate systems and high data content, have shown their general strength in dealing with complex problems. This volume brings together a collection of top researchers in the field of artificial neural networks from around the world.
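The blurb's central claim, that a neural network can absorb a nonlinear, multivariate mapping without a hand-crafted formula, is easy to illustrate. Below is a minimal sketch, not taken from the book: a single-hidden-layer network trained by plain gradient descent on a synthetic nonlinear target in several input variables. It assumes only NumPy; the data, the tanh/linear architecture and every hyperparameter are illustrative choices, not anything prescribed by the volume's contributors.

```python
# Illustrative sketch only: a one-hidden-layer network fitted by gradient
# descent to a nonlinear, multivariate target. All sizes and hyperparameters
# below are assumptions made for the example.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n-dimensional inputs, nonlinear scalar target.
n, n_samples, n_hidden = 5, 500, 20
X = rng.uniform(-1.0, 1.0, size=(n_samples, n))
y = np.sin(X[:, 0] * X[:, 1]) + 0.5 * X[:, 2] ** 2   # nonlinear in several variables

# Parameters: one hidden layer of tanh units, linear output.
W1 = rng.normal(scale=0.5, size=(n, n_hidden))
b1 = np.zeros(n_hidden)
w2 = rng.normal(scale=0.5, size=n_hidden)
b2 = 0.0

lr = 0.05
for epoch in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)          # hidden activations, shape (n_samples, n_hidden)
    y_hat = h @ w2 + b2               # network output
    err = y_hat - y                   # residuals

    # Backward pass: gradients of the (halved) mean squared error.
    grad_w2 = h.T @ err / n_samples
    grad_b2 = err.mean()
    dh = np.outer(err, w2) * (1.0 - h ** 2)   # back-propagate through tanh
    grad_W1 = X.T @ dh / n_samples
    grad_b1 = dh.mean(axis=0)

    # Gradient-descent update.
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1
    w2 -= lr * grad_w2
    b2 -= lr * grad_b2

print("final MSE:", float(np.mean((np.tanh(X @ W1 + b1) @ w2 + b2 - y) ** 2)))
```

The same loop scales to higher input dimension n simply by changing the shape of X, which is the sense in which such models cope with the n-dimensional settings the description refers to.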

Part of the series Perspectives in Neural Computing

Price: 629.86 lei

Old price: 787.33 lei
-20% New

Express points: 945

Estimated price in foreign currency:
120.55 • 127.17$ • 100.46£

Book printed to order

Economy delivery 03-17 January 25

Phone orders: 021 569.72.76

Specifications

ISBN-13: 9783540761600
ISBN-10: 3540761608
Pages: 328
Illustrations: XV, 308 p. 5 illus.
Dimensions: 155 x 235 x 17 mm
Weight: 0.5 kg
Edition: 1st Edition
Publisher: SPRINGER LONDON
Collection: Springer
Series: Perspectives in Neural Computing

Place of publication: London, United Kingdom

Target audience

Research

Contents

1 Recurrent Neural Networks: Some Systems-Theoretic Aspects – 1 Introduction; 2 System-Theory Results: Statements; 3 System-Theory Results: Discussion; 4 Computational Power; 5 Some Remarks
2 The Use of State Space Control Theory for Analysing Feedforward Neural Networks – 1 Introduction; 2 State Space Theory; 3 State Space Representation of Feedforward Neural Networks; 4 Observability of Feedforward Neural Networks; 5 Controllability; 6 Stability; 7 Discussion; 8 Appendix: Linear Systems of Equations [7]
3 Statistical Decision Making and Neural Networks – 1 Introduction; 2 Statistical Decision Making; 3 Bayesian Learning; 4 On Ingredients of Bayesian Learning; 5 Interlude on Gaussian Linear Regression Model; 6 Approximate On-Line Estimation; 7 Conclusions
4 A Tutorial on the EM Algorithm and its Applications to Neural Network Learning – 1 Introduction; 2 The EM Algorithm; 3 Practical Applications; 4 Convergence Properties; 5 Concluding Remarks
5 On the Effectiveness of Memory-Based Methods in Machine Learning – 1 Introduction; 2 Background; 3 The Curse of Dimensionality; 4 The Barron-Jones Theory; 5 Experimental Results; 6 Analysis of Memory-Based Methods; 7 Discussion
6 A Study of Non Mean Square Error Criteria for the Training of Neural Networks – 1 Introduction; 2 Statement of the Problem; 3 Cost Function Minimisation for ? = E(y/x); 4 Cost Function Minimisation for the Median of p(y/x); 5 Simulation Results; 6 Conclusion
7 A Priori Information in Network Design – 1 Introduction; 2 Preliminaries; 3 Recurrent Networks and Relative Order; 4 Simulations; 5 Conclusions
8 Neurofuzzy Systems Modelling: A Transparent Approach – 1 Empirical Data Modelling; 2 Neurofuzzy Construction Algorithms; 3 Modelling Case Studies; 4 Conclusions
9 Feature Selection and Classification by a Modified Model with Latent Structure – 1 Introduction; 2 Modified Model with Latent Structure; 3 Optimizing Model Parameters; 4 Approach to Feature Selection; 5 Pseudo-Bayes Decision Rule; 6 Experiments; 7 Summary and Conclusion
10 Geometric Algebra Based Neural Networks – 1 Introduction; 2 Complex-Valued Neural Networks; 3 Comments on the Applicability of CVNNs to n-Dimensional Signals; 4 Generalisations of CVNNs Within a GA Framework; 5 Summary
11 Discrete Event Complex Systems: Scheduling with Neural Networks – 1 Introduction; 2 The DNN Architecture; 3 Continuous Time Control Law; 4 Real-Time Scheduling; 5 Simulation Results; 6 Summary
12 Incremental Approximation by Neural Networks – 1 Introduction; 2 Approximation of Functions by One-Hidden-Layer Networks; 3 Rates of Approximation of Incremental Approximants; 4 Variation with Respect to a Set of Functions; 5 Incremental Approximation by Perceptron and RBF Networks; 6 Discussion
13 Approximation of Smooth Functions by Neural Networks – 1 Introduction; 2 Preliminaries; 3 Complexity Theorems; 4 Local Approximation; 5 Some Open Problems
14 Rates of Approximation in a Feedforward Network Depend on the Type of Computational Unit – 1 Introduction; 2 Feedforward Networks with Various Computational Units; 3 Discussion
15 Recent Results and Mathematical Methods for Functional Approximation by Neural Networks – 1 Introduction; 2 Individual vs Variable Context; 3 Nonlinear Approximation; 4 Feedforward Architectures; 5 Lower Bounds on Rate of Approximation; 6 Uniqueness of Approximation by Neural Networks; 7 Other Approaches
16 Differential Neurocontrol of Multidimensional Systems – 1 Introduction; 2 Neurophysiological Basis; 3 Scheme of the Differential Neurocontroller; 4 Multiplicative Units; 5 Feedback Block; 6 Feedforward Block; 7 Convergence of Learning; 8 Computer Simulations; 9 Conclusions
17 The Psychological Limits of Neural Computation – 1 Neural Networks and Turing Machines; 2 Function Approximation; 3 Representation of Logical Functions Using Neural Networks; 4 The Complexity of Learning in Neural Networks; 5 Learning Logical Functions; 6 The Optimization of Circuits; 7 Final Remarks
18 A Brain-Like Design to Learn Optimal Decision Strategies in Complex Environments – 1 Introduction; 2 Time-Chunked Approximate Dynamic Programming; 3 Temporal Chunking with Neural Networks; 4 Spatial Chunking and Critical Subsystems; 5 Adding the Third Brain
Research Acknowledgements.