Efficient parallel algorithms can be made robust

Paris C. Kanellakis, Alex A. Shvartsman

Research output: Contribution to journal › Article › peer-review

Abstract

The efficient parallel algorithms proposed for many fundamental problems, such as list ranking, integer sorting, and computing preorder numberings on trees, are very sensitive to processor failures. The requirement of efficiency (commonly formalized using Parallel-time × Processors as a cost measure) has led to the design of highly tuned PRAM algorithms which, given the additional constraint of simple processor failures, unfortunately become inefficient or even incorrect. We propose a new notion of robustness that combines efficiency with fault tolerance. For the common case of fail-stop errors, we develop a general and easy-to-implement technique for making many efficient parallel algorithms robust, e.g., algorithms for all the problems listed above. More specifically, for any dynamic pattern of fail-stop errors on a CRCW PRAM with at least one surviving processor, our method increases the cost of the original algorithm by at most a multiplicative log² factor. Our technique is based on a robust solution to the Write-All problem: using P processors, write 1's into all locations of an N-sized array. In addition, we show that any algorithm implementing a robust solution to Write-All with P = N incurs at least a log/log log multiplicative overhead for certain patterns of failures. However, by exploiting parallel slackness, we obtain an optimal-cost algorithm when {Mathematical expression}
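To make the Write-All problem concrete, the following is a small round-based toy simulation, not the paper's algorithm: P processors must set every cell of an N-sized array to 1 while an adversary crashes processors between rounds (fail-stop). In this naive strategy, processor `pid` writes cell `(pid + r) mod n` in round `r`, so any single survivor eventually covers the whole array; the names `write_all` and `crashes` are illustrative assumptions.

```python
def write_all(n, p, crashes):
    """Toy fail-stop Write-All simulation (illustrative only).

    n        -- array size N
    p        -- number of processors P
    crashes  -- dict mapping round r to the set of processor ids
                that crash before round r (the fail-stop pattern)

    Returns the array and the total work (processor steps charged),
    which is the Parallel-time x Processors cost actually incurred.
    """
    array = [0] * n
    alive = set(range(p))
    work = 0
    for r in range(n):  # n rounds suffice: one survivor covers all cells
        alive -= crashes.get(r, set())
        assert alive, "the model requires at least one surviving processor"
        for pid in alive:
            array[(pid + r) % n] = 1  # round-robin cell assignment
            work += 1
        if all(array):
            break
    return array, work
```

For example, `write_all(8, 4, {0: {1, 2, 3}})` crashes all but processor 0 at the start, which then needs all 8 rounds on its own. This naive scheme is robust but wasteful; the paper's contribution is achieving robustness within a log² multiplicative overhead of the fault-free cost.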

Original language: English (US)
Pages (from-to): 201-217
Number of pages: 17
Journal: Distributed Computing
Volume: 5
Issue number: 4
State: Published - Apr 1992
Externally published: Yes

Keywords

  • Computation complexity
  • Fault tolerance
  • Lower bounds
  • Parallel computation
  • Robust algorithms

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Hardware and Architecture
  • Computer Networks and Communications
  • Computational Theory and Mathematics
