Recurrent Neural Networks: From Simple to Gated Architectures
Author: Fathi M. Salem | Language: English | Hardback – 4 Jan 2022
| Format | Edition | Price | Delivery |
|---|---|---|---|
| Paperback (1) | Springer International Publishing – 5 Jan 2023 | 371.85 lei | 6-8 weeks |
| Hardback (1) | Springer International Publishing – 4 Jan 2022 | 411.37 lei | 6-8 weeks |
Price: 411.37 lei
New
Express Points: 617
Estimated price in foreign currency:
78.75€ • 81.86$ • 65.29£
Printed on demand
Economy delivery: 06-20 February 25
Specifications
ISBN-13: 9783030899288
ISBN-10: 3030899284
Pages: 200
Illustrations: XX, 121 p. 26 illus., 24 illus. in color.
Dimensions: 155 x 235 mm
Weight: 0.38 kg
Edition: 1st ed. 2022
Publisher: Springer International Publishing
Series: Springer
Place of publication: Cham, Switzerland
Contents

- Introduction
- 1. Network Architectures
- 2. Learning Processes
- 3. Recurrent Neural Networks (RNN)
- 4. Gated RNN: The Long Short-Term Memory (LSTM) RNN
- 5. Gated RNN: The Gated Recurrent Unit (GRU) RNN
- 6. Gated RNN: The Minimal Gated Unit (MGU) RNN
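The chapters on simple and gated RNNs center on each cell's hidden-state update rule. Purely as an illustration (this is not code from the book, and gate conventions differ slightly across references), one common form of each update can be sketched in NumPy:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simple_rnn_step(x_t, h_prev, Wx, Wh, b):
    """Simple (Elman-style) RNN update: h_t = tanh(Wx x_t + Wh h_{t-1} + b)."""
    return np.tanh(Wx @ x_t + Wh @ h_prev + b)

def gru_step(x_t, h_prev, p):
    """GRU cell: update gate z and reset gate r blend the old state
    with a candidate state (one common convention)."""
    z = sigmoid(p["Wz"] @ x_t + p["Uz"] @ h_prev + p["bz"])   # update gate
    r = sigmoid(p["Wr"] @ x_t + p["Ur"] @ h_prev + p["br"])   # reset gate
    h_cand = np.tanh(p["Wh"] @ x_t + p["Uh"] @ (r * h_prev) + p["bh"])
    return (1.0 - z) * h_prev + z * h_cand

def mgu_step(x_t, h_prev, p):
    """Minimal Gated Unit: a single forget gate f plays both GRU gate roles."""
    f = sigmoid(p["Wf"] @ x_t + p["Uf"] @ h_prev + p["bf"])   # forget gate
    h_cand = np.tanh(p["Wh"] @ x_t + p["Uh"] @ (f * h_prev) + p["bh"])
    return (1.0 - f) * h_prev + f * h_cand
```

The progression from GRU to MGU is visible in the code: the MGU merges the update and reset gates into one, cutting the gate parameters roughly in half.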
Biographical note
Dr. Salem’s current research interests include neural networks and learning systems, blind signal deconvolution and extraction, dynamical systems and chaos, and integrated CMOS sensing and processing. He was the Chairman of the IEEE Technical Committee on Real-Time Control Computing and Signal Processing (1994–1996) and the Chairman of the CAS Technical Committee on Neural Systems and Their Applications (1997–1998). He served on the IEEE Neural Network Council (1999–2000) and was the first Vice President of the IEEE Neural Network Council for Technical Activities (1999–2001). He was the Guest Co-Editor of the IEEE-CAS Special Issue on Bifurcations and Chaos in Circuits and Systems, July 1988 (with T. Matsumoto); the Special Issue on Micro-Electronic Hardware Implementation of Soft Computing: Neural and Fuzzy Networks with Learning, Journal of Computers and Electrical Engineering, July 1999 (with T. Yamakawa); and the Special Issue on Digital and Analog Arrays, Journal of Circuits, Systems, and Computers, August 1999 (with M. Ahmadi). He received the IEEE CAS Golden Jubilee Award (1999), the IEEE Third Millennium Award (2000), and the CAS Darlington Best Paper Award (2001). With a team of students, he also received the U.S. Semiconductor Research Corporation (SRC) Phase II Finalist Award (2000). He was a Distinguished Lecturer of the IEEE CAS Society in 2000–2001. He was an Associate Editor and Guest Editor for numerous IEEE and other transactions, including IEEE Circuits and Systems, IEEE Neural Networks, the Journal of Circuits, Systems, and Computers, and the Journal of Computer and Electrical Engineering. He was the Chairman of the Engineering Foundation Conference on Qualitative Methods for Nonlinear Dynamics, and served in several capacities at several conferences, including as General Chair of the IEEE Midwest Symposium on Circuits and Systems in Lansing, MI, in 2000 and again in 2021.
He was a Visiting Professor at UC Berkeley (1983), the California Institute of Technology, Pasadena (1992), and the University of Minnesota, Twin Cities (1993). He joined MSU in 1985 and has been a Professor since 1991. He has worked for and consulted with several companies, including General Motors, Ford, Smith’s Industries, Intersignal, IC Tech Inc., and Clarity LLC. He has authored more than 250 technical papers and co-edited the textbook Dynamical Systems Approaches to Nonlinear Problems in Circuits and Systems (SIAM, 1988). He is a co-inventor on more than 14 patents covering adaptive nonlinear signal processing, neural networks, and sensors.
Back cover text
This textbook provides a compact yet comprehensive treatment, with analytical and design steps for building recurrent neural networks from scratch. It treats general recurrent neural networks with principled training methods that yield (generalized) backpropagation through time (BPTT). The author focuses on the basics and nuances of recurrent neural networks, offering a technical and principled treatment of the subject with a view toward using coding and deep-learning computational frameworks, e.g., Python and TensorFlow-Keras. Recurrent neural networks are treated holistically, from simple to gated architectures, adopting the technical machinery of adaptive non-convex optimization with dynamic constraints to leverage its systematic power in organizing the learning and training processes. This permits a flow of concepts and techniques that provides grounded support for design and training choices. The author’s approach enables strategic co-training of output layers, using supervised learning, and hidden layers, using unsupervised learning, to generate more efficient internal representations and better accuracy. As a result, readers will be able to create designs tailoring proficient procedures for recurrent neural networks to their targeted applications.
- Explains the intricacy and diversity of recurrent networks, from simple to more complex gated recurrent neural networks;
- Discusses the design framing of such networks, and how to redesign the simple RNN to avoid unstable behavior;
- Describes the forms of training of RNNs, framed in adaptive non-convex optimization with dynamic constraints.
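The core of BPTT is differentiating a loss through the unrolled recurrence, with the recurrent weight shared across all time steps. As an illustration only (a toy scalar RNN, not code from the book), the unrolled gradient computation can be sketched as:

```python
import numpy as np

def rnn_loss_and_grad(wh, xs, h0):
    """Toy scalar RNN h_t = tanh(wh * h_{t-1} + x_t), with loss = 0.5 * h_T^2.
    Returns the loss and dloss/dwh computed by backpropagation through time."""
    hs = [h0]
    for x in xs:                       # forward pass: unroll over time
        hs.append(np.tanh(wh * hs[-1] + x))
    loss = 0.5 * hs[-1] ** 2
    dh = hs[-1]                        # dloss/dh_T
    dwh = 0.0
    for t in range(len(xs), 0, -1):    # backward pass through the unrolled chain
        da = dh * (1.0 - hs[t] ** 2)   # through tanh (hs[t] = tanh(a_t))
        dwh += da * hs[t - 1]          # wh is shared across all time steps
        dh = da * wh                   # propagate to the previous hidden state
    return loss, dwh
```

A finite-difference check (perturbing wh by a small epsilon and differencing the two losses) reproduces dwh to within numerical error, which is a standard sanity test for any hand-rolled BPTT implementation.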