Parallel Structures

All computing systems are constructed from interconnected components, and depending on the level of abstraction at which a system is viewed, these components could be transistors, gates, registers, arithmetic units, memories, or complete processors. At all levels of abstraction there are two fundamental ways in which components can be composed to create parallel computing structures: one involving temporal parallelism, the other spatial parallelism.

Temporal parallelism involves partitioning the processing task into a number of steps which, when applied in sequence to each unit of information (x1 - xm in the Temporal Parallelism diagram), produce the same results as the original undivided task. In other words, the task is partitioned in time, with each step of the task being applied to a different unit of information at any given moment. A typical example is "assembly line" manufacturing. The application of temporal parallelism in computing produces pipelined structures. The time taken to produce any one result using a pipeline is in practice slightly longer than in a non-pipelined system, but the rate at which results are produced increases in proportion to the number of steps into which the original task is divided.
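A minimal sketch of this idea in Go, assuming a task divided into three illustrative steps (the stage helper, the particular step functions, and the unit values x = 1..8 are inventions for the example, not taken from the original). Each stage runs concurrently and works on a different unit while the others work on theirs, so results emerge at the rate of the slowest single step:

```go
package main

import "fmt"

// stage applies one step of the task to each unit arriving on in,
// passing its result downstream on out.
func stage(f func(int) int, in <-chan int) <-chan int {
	out := make(chan int)
	go func() {
		defer close(out)
		for x := range in {
			out <- f(x)
		}
	}()
	return out
}

func main() {
	// Feed the units x1..xm into the first stage.
	src := make(chan int)
	go func() {
		defer close(src)
		for x := 1; x <= 8; x++ {
			src <- x
		}
	}()

	// The original task split into three sequential steps;
	// all three stages run concurrently on different units.
	s1 := stage(func(x int) int { return x + 1 }, src)
	s2 := stage(func(x int) int { return x * x }, s1)
	s3 := stage(func(x int) int { return x - 3 }, s2)

	for r := range s3 {
		fmt.Println(r)
	}
}
```

Note how the sketch mirrors the trade-off described above: any single unit crosses three channel hand-offs before its result appears, but once the pipeline is full, one result is delivered per step-time rather than per whole-task-time.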

Spatial parallelism operates in a different way. Here the component used to carry out the processing task is not subdivided but is instead replicated, so that each unit of information (x1 - xm in the Spatial Parallelism diagram) is processed by its own dedicated component. To exploit this form of parallelism, the units of information processed by the original (non-parallel) component must be partitionable; in other words, the task space must be parallelised. A typical example from the sphere of ordinary human activity is the familiar row of checkout desks found in supermarkets.
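A companion sketch in Go, under the same illustrative assumptions as before (the task (x+1)² - 3 and the unit values are hypothetical). Here the task is not divided at all; instead the component performing it is replicated once per unit, and every replica executes the complete task on its own unit:

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	xs := []int{1, 2, 3, 4, 5, 6, 7, 8} // the units x1..xm
	results := make([]int, len(xs))

	var wg sync.WaitGroup
	for i, x := range xs {
		wg.Add(1)
		// One replicated component per unit: each goroutine
		// performs the whole, undivided task on its own unit.
		go func(i, x int) {
			defer wg.Done()
			results[i] = (x+1)*(x+1) - 3
		}(i, x)
	}
	wg.Wait()

	fmt.Println(results)
}
```

The two sketches compute identical results; the difference lies entirely in how the work is laid out, across time in the first case and across space in the second.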

The amount of parallelism that can be exploited spatially depends only on the number of independent units of information to be processed, whereas the amount that can be exploited temporally depends on the divisibility of the task itself, that is, on the number of sequential steps into which it can be split.