Theoretical Advances in Neural Computation and Learning
Edited by Vwani Roychowdhury, Kai-Yeung Siu, Alon Orlitsky. In English. Paperback – 27 Sep 2012
| All formats and editions | Price | Delivery |
|---|---|---|
| Paperback (1) – Springer US – 27 Sep 2012 | 968.25 lei | 6-8 weeks |
| Hardback (1) – Springer US – 30 Nov 1994 | 974.51 lei | 6-8 weeks |
Price: 968.25 lei
Old price: 1210.31 lei
-20% New
Express points: 1452
Estimated price in foreign currency:
185.32€ • 193.15$ • 154.27£
Book printed to order
Economy delivery 07-21 January 25
Order line: 021 569.72.76
Specifications
ISBN-13: 9781461361602
ISBN-10: 1461361605
Pages: 496
Illustrations: XXIV, 468 p.
Dimensions: 155 x 235 x 26 mm
Weight: 0.69 kg
Edition: Softcover reprint of the original 1st ed. 1994
Publisher: Springer US
Series: Springer
Place of publication: New York, NY, United States
Target audience
Research
Contents
I Computational Complexity of Neural Networks
1 Neural Models and Spectral Methods
2 Depth-Efficient Threshold Circuits for Arithmetic Functions
3 Communication Complexity and Lower Bounds for Threshold Circuits
4 A Comparison of the Computational Power of Sigmoid and Boolean Threshold Circuits
5 Computing on Analog Neural Nets with Arbitrary Real Weights
6 Connectivity Versus Capacity in the Hebb Rule
II Learning and Neural Networks
7 Computational Learning Theory and Neural Networks: A Survey of Selected Topics
8 Perspectives of Current Research about the Complexity of Learning on Neural Nets
9 Learning an Intersection of K Halfspaces Over a Uniform Distribution
10 On the Intractability of Loading Neural Networks
11 Learning Boolean Functions via the Fourier Transform
12 LMS and Backpropagation are Minimax Filters
13 Supervised Learning: Can it Escape its Local Minimum?