Communication and Memory Requirements as the Basis for Mapping Task and Data Parallel Programs

J. Subhlok and D. R. O'Hallaron
Conference Paper, Proceedings of ACM/IEEE Conference on Supercomputing (Supercomputing '94), pp. 330-339, November 1994

Abstract

For a wide variety of applications, both task and data parallelism must be exploited to achieve the best possible performance on a multicomputer. Recent research has underlined the importance of exploiting task and data parallelism in a single compiler framework, and such a compiler can map a single source program in many different ways onto a parallel machine. The tradeoffs between task and data parallelism are complex and depend on the characteristics of the program to be executed, most significantly the memory and communication requirements, and the performance parameters of the target parallel machine. In this paper, we present a framework to isolate and examine the specific characteristics of programs that determine the performance for different mappings. Our focus is on applications that process a stream of input, and whose computation structure is fairly static and predictable. We describe three such applications that were developed with our compiler: fast Fourier transforms, narrowband tracking radar, and multibaseline stereo. We examine the tradeoffs between various mappings for them and show how the framework is used to obtain efficient mappings.
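The mapping tradeoff described above can be made concrete with a toy cost model. The sketch below is hypothetical and not taken from the paper: it assumes a two-stage stream pipeline (such as an FFT stage followed by a postprocessing stage) and a simple per-stage time of work/p plus a communication overhead that grows with the number of processors p. A pure data-parallel mapping runs each stage over all P processors in sequence, while a task-parallel mapping pipelines the stages on disjoint processor groups; which mapping wins depends on the balance between computation and communication, which is the tradeoff the paper's framework examines.

```python
import math

def stage_time(work, p, alpha=1.0):
    """Hypothetical cost model: compute time work/p plus a
    communication overhead alpha*log2(p) that grows with the
    number of processors p used by the stage."""
    return work / p + alpha * math.log2(p)

def data_parallel_time(works, P):
    """Data-parallel mapping: each stage in turn uses all P
    processors, so the per-input time is the sum of stage times."""
    return sum(stage_time(w, P) for w in works)

def task_parallel_time(works, P):
    """Task-parallel mapping: stages are pipelined on disjoint
    groups of P/len(works) processors; the steady-state per-input
    time is limited by the slowest stage."""
    p_per_stage = P // len(works)
    return max(stage_time(w, p_per_stage) for w in works)

# Two equal stages of work 64 on P = 8 processors:
# data-parallel:  2 * (64/8 + log2(8)) = 22 time units per input
# task-parallel:  max over stages of 64/4 + log2(4) = 18 per input
print(data_parallel_time([64, 64], 8))  # 22.0
print(task_parallel_time([64, 64], 8))  # 18.0
```

With this particular overhead parameter the task-parallel mapping is faster, because halving the processor group also halves the communication overhead each stage pays; with a smaller alpha the data-parallel mapping wins instead. The names and the cost model here are illustrative assumptions, not the framework defined in the paper.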

BibTeX

@conference{Subhlok-1994-16029,
author = {J. Subhlok and D. R. O'Hallaron},
title = {Communication and Memory Requirements as the Basis for Mapping Task and Data Parallel Programs},
booktitle = {Proceedings of ACM/IEEE Conference on Supercomputing (Supercomputing '94)},
year = {1994},
month = {November},
pages = {330--339},
address = {Washington, DC},
}