| Abstract |
| --- |
| To exploit the flexibility of OpenMP in parallelizing large-scale multi-physics applications, where different modes of parallelism are needed for efficient computation, OpenMP codes must first scale as well as MPI on large core counts. In this work we implemented fine-grained OpenMP parallelism in GenIDLEST, a large CFD code, and investigated its performance on 1 to 256 cores using a variety of performance optimization and measurement tools. Weak and strong scaling studies show that OpenMP performance can be made to match that of MPI on SGI Altix systems for up to 256 cores. Data placement and locality were established to be key components in obtaining good scalability with OpenMP. It is also shown that a hybrid implementation on a dual-core system gives the same performance as standalone MPI or OpenMP. Finally, in irregular multi-physics applications that do not adhere solely to the SPMD (Single Program, Multiple Data) mode of computation, such as tightly coupled fluid-particulate systems, the flexibility of OpenMP can offer a significant performance advantage over MPI. |
| Year | DOI | Venue |
| --- | --- | --- |
| 2012 | 10.1016/j.parco.2012.05.005 | Parallel Computing |
| Keywords | Field | DocType |
| --- | --- | --- |
| fine grained openmp parallelism, large scale multi-physics application, large cfd code, dual core system, openmp code, performance optimization, large core count, fluid-particulate system, openmp performance, standalone mpi, big performance advantage, mpi | SPMD, Locality, Multiple data, Computer science, Parallel computing, Exploit, Computational science, Computational fluid dynamics, Scaling, Scalability, Computation | Journal |
| Volume | Issue | ISSN |
| --- | --- | --- |
| 38 | 9 | 0167-8191 |
| Citations | PageRank | References |
| --- | --- | --- |
| 8 | 1.00 | 19 |
| Authors |
| --- |
| 5 |
| Name | Order | Citations | PageRank |
| --- | --- | --- | --- |
| Amit Amritkar | 1 | 21 | 3.22 |
| Danesh Tafti | 2 | 24 | 2.64 |
| Rui Liu | 3 | 13 | 2.24 |
| Rick Kufrin | 4 | 17 | 1.80 |
| Barbara Chapman | 5 | 163 | 14.63 |