
ICANN ’94: Proceedings of the International Conference on Artificial Neural Networks, Sorrento, Italy, 26–29 May 1994, Volume 1, Parts 1 and 2

Edited by Maria Marinaro, Pietro G. Morasso
In English · Paperback – 19 May 1994

Price: 344.65 lei

Old price: 430.82 lei
-20% · New

Express points: 517

Estimated price in foreign currencies:
65.96€ · 69.59$ · 54.97£

Printed on demand

Economy delivery: 02–16 January 25

Phone orders: 021 569.72.76

Specifications

ISBN-13: 9783540198871
ISBN-10: 3540198873
Pages: 820
Illustrations: XXIX, 1488 p., 34 illus.
Dimensions: 155 x 235 x 43 mm
Weight: 1.13 kg
Edition: 1st Edition
Publisher: Springer London
Series: Springer
Place of publication: London, United Kingdom

Target audience

Research

Description

From its early beginnings in the fifties and sixties, the field of neural networks has been growing steadily. The first wave was driven by a handful of pioneers who first discovered analogies between machines and biological systems in communication, control and computing. Technological constraints held back research considerably, but computers have gradually become less expensive and more accessible, and software tools increasingly more powerful. Mathematical techniques developed by computer-aware people have steadily accumulated, and the second wave has begun. Researchers from such diverse areas as psychology, mathematics, physics, neuroscience and engineering now work together in the field of neural networks.

Contents

Contents, Volume 1.- 1 • Neurobiology.- Why bright Kanizsa squares look closer: consistency of segmentations and surfaces in 3-D vision.- Spatial pooling and perceptual framing by synchronizing cortical dynamics.- Vertebrate retina: sub-sampling and aliasing effects can explain colour-opponent and colour constancy phenomena.- RETINA: a model of visual information processing in the retinal neural network.- The influence of the inhomogeneous dendritic field size of the retinal ganglion cells on the fixation.- Top-down interference in visual perception.- Dynamic vision system: modeling the prey recognition of common toads Bufo bufo.- Emergence of long range order in maps of orientation preference.- Oriented ocular dominance bands in the self-organizing feature map.- How to use non-visual information for optic flow processing in monkey visual cortical area MSTd.- A learning rule for self-organization of the velocity selectivity of directionally selective cells.- Motion analysis with recurrent neural nets.- Self-organizing a behaviour-oriented interpretation of objects in active-vision.- Hybrid methods for robust irradiance analysis and 3-D shape reconstruction from images.- A parallel algorithm for simulating color perception.- Positional competition in the BCS.- A computational model for texton-based pre attentive texture segmentation.- Hopfield neural network for motion estimation and interpretation.- Phase interactions between place cells during movement.- Self-organization of an equilibrium-point motor controller.- Study of a Purkinje unit as a basic oscillator of the cerebellar cortex.- Compartmental interaction in the granular layer of the cerebellum.- Modeling biologically relevant temporal patterns.- A model of the baroreceptor reflex neural network.- Modelization of vestibulo-ocular reflex (VOR) and motion sickness prediction.- Kernel correlations of movements in neural network.- Analysis of the golf swing from weight-shift using neural networks.- Dry electrophysiology: an approach to the internal representation of brain functions through artificial neural networks.- ANNs and MAMFs: transparency or opacity?.- Collective brain as dynamical system.- Temporal pattern dependent spatial-distribution of LTP in the hippocampal CAl area studied by an optical imaging method.- Synchronization-based complex model neurons.- Synchronization of integrate-and-fire neurons with delayed inhibitory lateral connections.- Complex patterns of oscillations in a neural network model with activity-dependent outgrowth.- Learning and the thalamic-NRT-cortex system.- Resetting the periodic activity of Hydra at a fixed phase.- Integral equations in compartmental model neurodynamics.- Hysteresis in a two neuron-network: basic characteristics and physiological implications.- Cooperation within networks of cortical automata based networks.- Anisotropic correlation properties in the spatial structure of cortical orientation maps.- 2 • Mathematical Model.- Application of neural network and fuzzy logic in modelling and control of fermentation processes.- Neural networks for the processing of fuzzy sets.- Human sign recognition using fuzzy associative inference system.- Bayesian properties and performances of adaptive fuzzy systems in pattern recognition problems.- The representation of human judgement by using fuzzy techniques.- Fuzzy logic versus neural network technique in an identification problem.- Phoneme recognition with hierarchical self organised neural networks and fuzzy systems - A Case Study.- Neuronal 
network models of the mind.- The consciousness of a neural state machine.- Forward reasoning and Caianiello’s nets.- An ANN model of anaphora: implications for nativism.- The spatter code for encoding concepts at many levels.- Learning in hybrid neural models.- A connectionist model for context effects in the picture-word interference task.- Inductive inference with recurrent radial basis function networks.- Neural networks as a paradigm for knowledge elicitation.- Unsupervised detection of driving states with hierarchical self organizing maps.- Using simulated annealing to train relaxation labeling processes.- A neural model for the execution of symbolic motor programs.- Evolution of typed expressions describing artificial nervous systems.- BAR: a connectionist model of bilingual access representations.- An architecture for image understanding by symbol and pattern integration.- Encoding conceptual graphs by labeling RAAM.- Hybrid system for ship detection in radar images.- Using ART2 and BP co-operatively to classify musical sequences.- Forecasting using constrained neural networks.- The evaluations of environmental impact: cooperative systems.- What generalizations of the self-organizing map make sense?.- A novel approach to measure the topology preservation of feature maps.- Self-Organized learning of 3 dimensions.- A model of fast and reversible representational plasticity using Kohonen mapping.- Multiple self-organizing neural networks with the reduced input dimension.- Adaptive modulation of receptive fields in self-organizing networks.- About the convergence of the generalized Kohonen algorithm.- Reordering transitions in self-organized feature maps with short-range neighbourhood.- Speeding-up self-organizing maps: the quick reaction.- Dynamic extensions of self-organizing maps.- Feature selection with self-Organizing feature map.- Unification of complementary feature map models.- Considerations of geometrical and fractal dimension of SOM to get better learning results.- On the ordering conditions for self-organising maps.- Representation and identification of fault conditions of an anaesthesia system by means of the self-organizing map.- Sensor arrays and self-organizing maps for odour analysis in artificial olfactory systems.- A self-organising neural network for the travelling salesman problem that is competitive with simulated annealing.- Learning attractors as a stochastic process.- Nominal color coding of classified images by Hopfield networks.- Does terminal attractor backpropagation guarantee global optimization?.- Learning and retrieval in attractor neural networks with noise above saturation.- A method of teaching a neural network to generate vector fields for a given attractor.- Recurrent neural networks with delays.- Eman: equivalent mass attraction network.- Analysis of an unsupervised indirect feedback network.- Hopfield energy of random nets.- Multiple cueing of an associative net.- Programmable mixed implementation of the Boltzmann machine.- The influence of response functions in analogue attractor neural networks.- Improvement of learning in recurrent networks by substituting the sigmoid activation function.- Attractor properties of recurrent networks with generalising boolean nodes.- On a class of Hopfield type neural networks for associative memory.- Storage capacity of associative random neural networks.- A generalized bidirectional associative memory with a hidden orthogonal layer.- Finding correspondences between smoothly deformable contours by means of an 
elastic neural network.- Minimization of number of connections in feedback networks.- An efficient method of pattern storage in the hopfield net.- Recursive learning in recurrent neural networks with varying architecture.- Pruning in recurrent neural networks.- Making hard problems linearly separable - Incremental radial basis function approaches.- ’Partition of unity’ RBF networks are universal function approximators.- Optimal local estimation of RBF parameters.- Acceleration of Gaussian radial basis function networks for function-approximation.- Uniqueness of functional representations by Gaussian basis function networks.- A dynamic mixture of Gaussians neural network for sequence classification.- Hierarchical mixtures of experts and the EM algorithm.- Outline of a linear neural network and applications.- Numerical experiments on the information criteria for layered feedforward neural nets.- Quantifying a critical training set size for generalization and overfitting using teacher neural networks.- Formal representation of neural networks.- Training-dependent measurement.- Genetic algorithms as optimisers for feedforward neural networks.- Selecting a critical subset of given examples during learning.- On the circuit complexity of feedforward neural networks.- Avoiding local minima by a classical range expansion algorithm.- Learning time series by neural networks.- The error absorption for fitting an under-fitting (skeleton) net.- Fast backpropagation using modified sigmoidal functions.- Input contribution analysis in a double input layered neural network.- A unified approach to derive gradient algorithms for arbitrary neural network structures.- Interpretation of BP-trained net outputs.- Fluctuated-threshold effect in multilayered neural network.- On the properties of error functions that affect the speed of backpropagation learning.- Neural network optimization for good generalization performance.- Block-recursive least squares technique for training multilayer perceptrons.- Neural networks for iterative computation of inverse functions.- Cascade correlation convergence theorem.- Optimal weight Initialization for neural networks.- Neural nets with superlinear VC-dimension.- A randomised distributed primer for the updating control on anonymous ANNs.- Catastrophic interference in learning processes by neural networks.- Systematicity in IH-analysis.- Integrating distance measure and inner product neurons.- Teaching by showing in Kendama based on optimization principle.- From coarse to fine: a novel way to train neural networks.- Learning the activation function for the neurons in neural networks.- Projection learning and graceful degradation.- Learning with zero error in feedforward neural networks.- Robustness of Hebbian and anti-Hebbian learning.- Computational experiences of new direct methods for the on-line training of MLP-networks with binary outputs.- Optimising local Hebbian learning: use the o-rule.- Efficient neural net ?-?-evaluators.- A parallel algorithm for a dynamic eta/ alpha estimation in backpropagation learning.- Dynamic pattern selection: effectively training backpropagation neural networks.- A learning rule which implicitly stores training history in weights.- A comparison study of unbounded and real-valued reinforcement associative reward-penalty algorithms.- To swing up an inverted pendulum using stochastic real-valued reinforcement learning.- Efficient reinforcement learning strategies for the pole balancing problem.- Reinforcement learning in Kohonen feature maps.- 
CMAC manipulator control using a reinforcement learned trajectory planner.- A fast reinforcement learning paradigm with application to CMAC control systems.- Information geometry and the EM algorithm.- SSM: a statistical stepwise method for weight elimination.- Computing the probability density in connectionist regression.- Estimation of conditional densities: a comparison of neural network approaches.- Regularizing stochastic Pott neural networks by penalizing mutual information.- Least mean squares learning algorithm in self referential linear stochastic models.- An approximation network with maximal transinformation.- Extended functionality for probabilistic RAM neurons.- Statistical biases in backpropagation learning.- An approximation of nonlinear canonical correlation analysis by multilayer perceptrons.- Information minimization to improve generalization performance.- Learning and interpretation of weights in neural networks.- Variable selection with optimal cell damage.- Comparison of constructive algorithms for neural networks.- Task decomposition and correlations in growing artificial Neural networks.- XNeuroGene: a system for evolving artificial neural networks.- Incremental training strategies.- Modular object-oriented neural network simulators and topology generalizations.- Gradient-based adaptation of Network Structure.- A connectionist model using multiplexed oscillations and synchrony to enable dynamic connections.- Some results on correlation dimension of time series generated by a network of phase oscillators.- Towards the application of networks with synchronized oscillatory dynamics in vision.- New impulse neuron circuit for oscillatory neural networks.- Adaptive topologically distributed encoding.- On-line learning with momentum for nonlinear learning rules.- Constructive neural network algorithm for approximation of multivariable function with compact support.- A Hebb-like learning rule for cell assemblies formation.- CARVE - a constructive algorithm for real valued examples.- A supervised learning rule for the single spike model.- Comparative bibliography of ontogenic neural networks.- Controlled growth of cascade correlation nets.