Aviva Directory » Computers & Internet » Parallel Computing

Parallel computing is a form of computation in which numerous calculations are carried out at the same time, accomplished by dividing the workload between more than one processor.

The focus of this category is on parallel computing, which is sometimes referred to as parallel processing.

In parallel computing, one task is split into several subtasks, which are assigned to multiple processors and coordinated so that a result is obtained more rapidly.

In serial programming, a single processor (CPU) executes instructions one after another. This works fine for most purposes, but some operations consist of steps that are independent of one another and can therefore be executed simultaneously. Operating on a large matrix is a classic example: because each element can be processed independently, the elements of the matrix can be allocated to several processors, with the results available faster than if all of the operations had been performed serially.
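As an illustrative sketch (not from the directory itself), the following Python example uses the standard library's multiprocessing.Pool to divide a matrix among worker processes row by row; the function and variable names are invented for the example:

```python
from multiprocessing import Pool

def square_row(row):
    # Square every element of one matrix row; rows are independent,
    # so each one can be handed to a different processor.
    return [x * x for x in row]

if __name__ == "__main__":
    matrix = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

    # Serial: one CPU walks the rows one at a time.
    serial = [square_row(row) for row in matrix]

    # Parallel: the library coordinates the worker processes and
    # reassembles their results in the original order.
    with Pool(processes=3) as pool:
        parallel = pool.map(square_row, matrix)

    assert serial == parallel  # same answer either way
    print(parallel)  # [[1, 4, 9], [16, 25, 36], [49, 64, 81]]
```

The library handles splitting up the rows and collecting the results, so the parallel answer matches the serial one.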

Parallel computations can be performed on shared-memory systems with multiple CPUs, or on distributed-memory clusters composed of smaller shared-memory systems or single-CPU systems. The coordination of the concurrent work of the multiple processors, and the synchronization of the results, are handled through program calls to parallel libraries.

Groups of networked computers that share a common goal for their work are known as distributed systems, although such a system might also be described as a parallel system, since the processors in a distributed system run concurrently, in parallel. Parallel computing, then, is a form of distributed computing, while distributed computing may be considered a loosely coupled form of parallel computing.

Nevertheless, the two can be differentiated. In parallel computing, all processors have access to shared memory, which they use to exchange information, while in distributed computing each processor has its own private memory, and information is exchanged by passing messages between the processors.
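The distinction can be sketched with Python's standard library, where threads model shared memory and processes model private memory; the counter and function here are hypothetical illustrations, not anything from the article:

```python
import threading
import multiprocessing

counter = {"value": 0}

def bump():
    # Mutates the creating program's counter only when run in a
    # thread; a separate process works on its own private copy.
    counter["value"] += 1

if __name__ == "__main__":
    # Shared memory: the thread updates the same dict the main
    # program sees, like CPUs sharing one address space.
    t = threading.Thread(target=bump)
    t.start(); t.join()
    print(counter["value"])  # 1

    # Private memory: the child process bumps its own copy, so the
    # parent's counter is unchanged; to share the result, it would
    # have to be sent back explicitly (e.g. via a pipe or queue).
    p = multiprocessing.Process(target=bump)
    p.start(); p.join()
    print(counter["value"])  # still 1
```

The thread's update is visible immediately, while the process's update stays in its own address space, which is exactly the shared-memory versus private-memory split described above.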

Other forms of parallel computing include multicore computing, in which multiple execution units (cores) reside on the same chip. Symmetric multiprocessing (SMP) refers to a computer system with multiple identical processors that share memory and are connected through a bus; due to the limitations of bus architecture, SMP systems do not generally include more than thirty-two processors. Cluster computing refers to a group of loosely coupled computers that work together so closely that, in many respects, they can be regarded as a single computer, and large clusters operate as supercomputers.

Another type of supercomputer uses many networked processors in a system known as massively parallel processing (MPP). MPPs share many characteristics with clusters, but MPPs have specialized interconnect networks, whereas clusters use commodity hardware for networking. Grid computing relies on middleware, software that sits between the operating system and the application and manages network resources.

Different types of parallelism include bit-level parallelism, data parallelism, and function (task) parallelism.

In computing, word size refers to the maximum number of bits that a CPU can process at a time, and bit-level parallelism is based on increasing the processor word size; during the 1970s and 1980s, advances in computer architecture came largely from increases in bit-level parallelism. Data parallelism is the parallelization of computing across more than one processor, the focus being on distributing the data across the parallel computing nodes, each of which performs the same operation on its own portion. Function (task) parallelism, by contrast, distributes different tasks or functions, rather than portions of the data, across the nodes.
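A minimal Python sketch of the last two kinds, using the standard concurrent.futures module (all names here are assumptions made for the example):

```python
from concurrent.futures import ProcessPoolExecutor

def total(chunk):
    # Data parallelism: the SAME operation applied to different
    # slices of the data on different processors.
    return sum(chunk)

def smallest(data):
    return min(data)

def largest(data):
    return max(data)

if __name__ == "__main__":
    data = list(range(1, 101))
    chunks = [data[i:i + 25] for i in range(0, 100, 25)]

    with ProcessPoolExecutor() as pool:
        # Data parallelism: one function, many data partitions.
        partial_sums = list(pool.map(total, chunks))
        print(sum(partial_sums))  # 5050

        # Function (task) parallelism: DIFFERENT functions run
        # concurrently, here both over the same data.
        lo = pool.submit(smallest, data)
        hi = pool.submit(largest, data)
        print(lo.result(), hi.result())  # 1 100
```

In the data-parallel half each worker runs the same function on its own quarter of the data, while in the task-parallel half each worker runs a different function, which mirrors the two definitions above.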

Parallel computing allows its users to save time and to solve larger problems. A single computer is limited by its finite memory, a limit that may be overcome by combining the memory resources of multiple computers.

Resources listed in this category should relate to parallel computing.


