Chapter 9. SPMD and MPMD Using Templates and the MPI

"There must be an essentially non-algorithmic ingredient in the action of consciousness."

Roger Penrose, The Emperor's New Mind

In this Chapter

  • Work Breakdown Structure for the MPI

  • Using Template Functions to Represent MPI Tasks

  • Simplifying MPI Communications

  • Summary

Templates support the notion of parameterized programming. The basic idea of parameterized programming is to maximize software reuse by implementing software designs in as general a form as possible. Function templates support generic procedural abstractions, and class templates support generic data abstractions. Typically, computer programs are already general solutions to specific problems. A program that adds two numbers is usually designed to add any two numbers. However, if the program only performs addition, we can generalize it by allowing it to perform different operations on any two numbers. If we want the most general program, why stop there? What if the numbers are of different types, say, complex numbers and floats? We may wish to design the program so that it can perform different operations not only on any two numbers but on different types or classes of numbers (i.e., ints, floats, doubles, complex numbers, etc.). In short, we would like the program to perform any binary operation on any pair of numbers, so long as that operation is legal for those two numbers. Once we have implemented such a program, the opportunities for reuse are significant. Function and class templates give this capability to the C++ programmer; this kind of generalization is accomplished using parameterized programming.

The parameterized programming paradigm supported by C++, combined with C++'s support for the object-oriented paradigm, provides some unique approaches to MPI programming. As we discussed in Chapter 1, the MPI (Message Passing Interface) is a communication standard used in implementing programs that require parallelism. The MPI is implemented as a collection of more than 300 routines, covering everything from spawning tasks to barrier synchronization to set operations. There is also a C++ representation of the MPI functions that encapsulates the functionality of the MPI in a set of classes. However, many of the advantages found in the object-oriented paradigm are not exploited in the MPI library, and the advantages of parameterized programming are absent altogether. So while the MPI has important value as a standard, it does not go very far toward simplifying parallel programming. It does insulate the programmer from socket programming and from many of the pitfalls of network programming, but that insulation is not enough. Cluster, SMP, and MPP application programming can be made easier. The template facilities in C++, together with its support for true object-oriented programming, can be used to help us accomplish this goal. In this chapter, we use templates and techniques from object-oriented programming to simplify the basic SPMD and MPMD approaches used with MPI programming.
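To make the SPMD model concrete before we begin, here is a minimal sketch (ours, not the book's framework) of the pattern this chapter builds on: every process runs the same program, and the rank returned by MPI_Comm_rank determines which role each process plays. It assumes an installed MPI implementation (e.g., MPICH), is compiled with a wrapper such as mpicxx, and is launched with mpirun; the worker computation is purely illustrative.

```cpp
#include <mpi.h>
#include <iostream>

int main(int argc, char *argv[])
{
    MPI_Init(&argc, &argv);

    int rank = 0, size = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   // this process's id
    MPI_Comm_size(MPI_COMM_WORLD, &size);   // total number of processes

    if (rank == 0) {
        // Rank 0 acts as coordinator: collect one result from each worker.
        for (int n = 1; n < size; n++) {
            int result = 0;
            MPI_Recv(&result, 1, MPI_INT, n, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            std::cout << "worker " << n << " sent " << result << std::endl;
        }
    } else {
        // Every other rank does rank-specific work and reports back.
        int result = rank * rank;   // stand-in for real work
        MPI_Send(&result, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}
```

The branch on rank is what the rest of the chapter replaces with template functions, so that the per-rank work is a type parameter rather than a hard-coded if-else.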



Parallel and Distributed Programming Using C++
ISBN: 0131013769
Year: 2002
Pages: 133
