Explicit parallelism

{{Use mdy dates |date=February 2024}}
{{More citations needed |date=February 2024}}
In [[computer programming]], '''explicit parallelism''' is the representation of concurrent computations by means of primitives in the form of operators, function calls or special-purpose directives. Most parallel primitives are related to process synchronization, communication and process partitioning.<ref name="dij68" /> As they seldom contribute to actually carrying out the intended computation of the program, but rather structure it, their computational cost is often considered overhead.
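As an illustrative sketch (not drawn from any particular cited system), the following Python fragment shows all three kinds of primitives spelled out by the programmer: thread creation for partitioning, a shared queue for communication, and <code>join</code> for synchronization.

```python
import threading
import queue

def square_worker(numbers, results):
    # Communication primitive: results travel through a shared queue.
    for n in numbers:
        results.put(n * n)

data = list(range(8))
results = queue.Queue()

# Partitioning primitive: the programmer splits the data into two halves
# and creates one thread per half.
threads = [
    threading.Thread(target=square_worker, args=(data[:4], results)),
    threading.Thread(target=square_worker, args=(data[4:], results)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()  # Synchronization primitive: wait for both workers to finish.

squares = sorted(results.queue)
print(squares)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Note that none of these calls perform the squaring itself; they only structure the computation, which is why their cost is counted as overhead.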
 
The advantage of explicit [[parallel programming]] is increased programmer control over the parallel computation. A skilled parallel programmer may take advantage of explicit parallelism to produce efficient code for a given target computation environment. However, programming with explicit parallelism is often difficult, especially for non-computing specialists, because of the extra work and skill involved in planning the task division and synchronization of concurrent processes.
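A minimal Python sketch (chosen here purely for illustration) of the planning burden this involves: the programmer must divide the work manually and guard the shared result with an explicit lock, details an inexperienced parallel programmer can easily get wrong.

```python
import threading

total = 0
lock = threading.Lock()

def partial_sum(chunk):
    global total
    s = sum(chunk)  # each thread computes its own share of the work
    with lock:      # explicit synchronization guards the shared accumulator
        total += s

data = list(range(100))
mid = len(data) // 2  # manual task division into two chunks
threads = [threading.Thread(target=partial_sum, args=(c,))
           for c in (data[:mid], data[mid:])]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(total)  # 4950, the same result as a sequential sum(data)
```

Omitting the lock would introduce a race condition on <code>total</code>; discovering and preventing such errors is part of the extra work described above.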
 
In some instances, explicit parallelism may be avoided with the use of an optimizing compiler or runtime that automatically deduces the parallelism inherent to computations, known as [[implicit parallelism]].
 
==Programming languages that support explicit parallelism==