2.4 Too Much Parallelization or Distribution Can Have Negative Consequences

There is a point where the overhead of managing multiple processors outweighs the speedup and other advantages gained from parallelization. The old adage "you can never have enough processors" is simply not true. Communication between computers and synchronization between processors come at a cost. The complexity of the synchronization or the volume of communication between processors can require so much computation that it degrades the performance of the tasks doing the actual work. How many processes, tasks, or threads should a program be divided into? Is there an optimal number of processors for any given parallel program? At what point does adding more processors or computers to the computation pool slow things down instead of speeding them up? It turns out that the numbers change depending on the program. Some scientific simulations may max out at several thousand processors, while for some business applications several hundred might be sufficient. For some client-server configurations, eight processors are optimal, but nine would cause the server to perform poorly.
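The tradeoff can be illustrated with a toy cost model. The sketch below uses made-up constants for the serial portion of a job, the divisible work, and a per-worker coordination cost; the function and the numbers are hypothetical, not taken from any particular program. With these assumptions the total running time first falls and then rises as workers are added, which is why the optimal count varies from one program to the next.

#include <iostream>

// Toy cost model (hypothetical numbers): total time for n workers is the
// serial portion, plus the divisible work spread across the workers, plus
// a synchronization/communication cost that grows with the worker count.
double totalTime(int n, double serial, double parallelWork, double overheadPerWorker)
{
    return serial + (parallelWork / n) + (overheadPerWorker * n);
}

int main()
{
    const double Serial = 10.0;          // seconds that cannot be parallelized
    const double ParallelWork = 1000.0;  // seconds of divisible work
    const double Overhead = 0.5;         // coordination cost added per worker

    int bestN = 1;
    double bestTime = totalTime(1, Serial, ParallelWork, Overhead);

    // Search for the worker count that minimizes total time.
    for (int n = 2; n <= 256; ++n) {
        double t = totalTime(n, Serial, ParallelWork, Overhead);
        if (t < bestTime) {
            bestTime = t;
            bestN = n;
        }
    }

    std::cout << "Best worker count: " << bestN
              << " (" << bestTime << " seconds)" << std::endl;
    std::cout << "Time with 256 workers: "
              << totalTime(256, Serial, ParallelWork, Overhead)
              << " seconds" << std::endl;
    return 0;
}

With these particular constants the minimum falls at roughly 45 workers (about 55 seconds), while 256 workers take more than twice as long because the coordination term dominates. Changing the constants moves the break-even point, just as the optimal processor count differs between a scientific simulation and a client-server application.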

There is work and there are resources involved in managing parallel hardware, and there is work involved in managing concurrently executing processes and threads in software. The limit on software processes might be reached before we reach the optimum number of processors or computers. Likewise, we might see diminishing returns in the hardware before we reach the optimum number of concurrently executing tasks.


