Parallel Programming: Techniques and Applications Using Networked Workstations and Parallel Computers: United States Edition
Authors: Barry Wilkinson, Michael Allen • Language: English • Paperback – 3 March 2004
This accessible text covers the techniques of parallel programming in a practical manner that enables readers to write and evaluate their parallel programs. Supported by the National Science Foundation and exhaustively class-tested, it is the first text of its kind that does not require access to a special multiprocessor system, concentrating instead on parallel programs that can be executed on networked computers using freely available parallel software tools. The book covers the timely topic of cluster programming, of interest to many programmers due to the recent availability of low-cost computers. It uses MPI pseudocode to describe algorithms, allowing them to be implemented with different programming tools, and provides readers with thorough coverage of shared memory programming, including Pthreads and OpenMP. It is also useful as a professional reference for programmers and system administrators.
Price: 1112.31 lei
Old price: 1308.59 lei
-15% New
212.94€ • 221.67$ • 175.30£
Book temporarily unavailable
Specifications
ISBN-10: 0131405632
Pages: 496
Dimensions: 178 x 229 x 30 mm
Weight: 0.79 kg
Edition: New
Publisher: Pearson Education
Series: Prentice Hall
Place of publication: Upper Saddle River, United States
Description
This nontheoretical, highly accessible text—which is linked to real parallel programming software—covers the techniques of parallel programming in a practical manner that enables students to write and evaluate their parallel programs. Supported by the National Science Foundation and exhaustively class-tested, it is the first text of its kind that does not require access to a special multiprocessor system, concentrating instead only on parallel programs that can be executed on networked workstations using freely available parallel software tools. The Second Edition has been revised to incorporate a greater focus on cluster programming as this type of programming has become more widespread with the availability of low-cost computers.
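To give a flavour of the message-passing style the book teaches, the following is a minimal sketch (our illustration, not code taken from the book) of an MPI program in C in which a master process farms out values to worker processes and collects the results; only standard MPI calls (MPI_Init, MPI_Comm_rank, MPI_Send, MPI_Recv, MPI_Finalize) are used.

/* Minimal master/worker sketch in the message-passing style covered by the book. */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
    int rank, nprocs;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's id          */
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs); /* total number of processes  */

    if (rank == 0) {
        /* Master: send a value to each worker, then collect the squares. */
        int i, result, sum = 0;
        for (i = 1; i < nprocs; i++)
            MPI_Send(&i, 1, MPI_INT, i, 0, MPI_COMM_WORLD);
        for (i = 1; i < nprocs; i++) {
            MPI_Recv(&result, 1, MPI_INT, i, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            sum += result;
        }
        printf("sum of squares from %d workers = %d\n", nprocs - 1, sum);
    } else {
        /* Worker: receive a value, square it, send it back to the master. */
        int x;
        MPI_Recv(&x, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        x = x * x;
        MPI_Send(&x, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}

With a typical MPI installation this would be compiled with mpicc and launched with, for example, mpirun -np 4, and the processes can be spread across networked workstations, which is exactly the setting the book targets.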
Contents
I. BASIC TECHNIQUES.
1. Parallel Computers.
2. Message-Passing Computing.
3. Embarrassingly Parallel Computations.
4. Partitioning and Divide-and-Conquer Strategies.
5. Pipelined Computations.
6. Synchronous Computations.
7. Load Balancing and Termination Detection.
8. Programming with Shared Memory.
9. Distributed Shared Memory Systems and Programming.
II. ALGORITHMS AND APPLICATIONS.
10. Sorting Algorithms.
11. Numerical Algorithms.
12. Image Processing.
13. Searching and Optimization.
Appendix A: Basic MPI Routines.
Appendix B: Basic Pthread Routines.
Appendix C: OpenMP Directives, Library Functions, and Environment Variables.
Index.
Features
- NEW - Chapter on Distributed Shared Memory (DSM) programming—Describes techniques and tools for shared memory programming on clusters.
- Enables programs to be written in the shared memory paradigm, which has advantages over traditional message-passing programming.
- NEW - Content revisions throughout.
- Provides students with the most current and concise information possible.
- NEW - Required software (MPI, PVM, DSM) available FREE!
- Students are provided with all the learning materials necessary for success in the course.
- Usage of MPI and PVM pseudocodes.
- Describes algorithms and allows different programming tools to be implemented.
- Thorough coverage of shared memory programming and Pthreads (see the sketch after this list).
- Assists students in shared memory programming assignments.
- Exploration of such applications as numerical algorithms, image processing and searching and optimization.
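The shared memory coverage mentioned above centres on threads. As a rough illustration (again ours, not taken from the book), here is a minimal Pthreads sketch in C in which several threads accumulate partial sums and a mutex protects the shared total.

/* Minimal shared-memory sketch: a mutex-protected shared sum with Pthreads. */
#include <stdio.h>
#include <pthread.h>

#define NTHREADS 4
#define N        1000

static int sum = 0;                      /* shared variable                 */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg)
{
    int id = *(int *)arg;
    int i, partial = 0;

    /* Each thread adds its own block of integers, then updates the
       shared total inside the critical section. */
    for (i = id * N; i < (id + 1) * N; i++)
        partial += i;

    pthread_mutex_lock(&lock);
    sum += partial;
    pthread_mutex_unlock(&lock);
    return NULL;
}

int main(void)
{
    pthread_t threads[NTHREADS];
    int ids[NTHREADS];
    int i;

    for (i = 0; i < NTHREADS; i++) {
        ids[i] = i;
        pthread_create(&threads[i], NULL, worker, &ids[i]);
    }
    for (i = 0; i < NTHREADS; i++)
        pthread_join(threads[i], NULL);

    printf("sum of 0..%d = %d\n", NTHREADS * N - 1, sum);
    return 0;
}

The same reduction could equally be expressed with an OpenMP parallel for and a reduction clause, using the directives listed in Appendix C.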
New features
- Chapter on Distributed Shared Memory (DSM) programming—Describes techniques and tools for shared memory programming on clusters.
- Enables programs to be written in the shared memory paradigm, which has advantages over traditional message-passing programming.
- Content revisions throughout.
- Provides students with the most current and concise information possible.
- Updated Companion Website—Includes revised step-by-step instructions for students and extensive support materials for instructors such as PowerPoint slides and assignments.
- Provides a resource that complements the text in a variety of ways that will help both students and professors in and out of the classroom.
- Required software (MPI, PVM, DSM) available FREE!
- Students are provided with all the learning materials necessary for success in the course.