'''Massive parallelism''' is a term used in [[computer architecture]], [[reconfigurable computing]], [[application-specific integrated circuit]] (ASIC) and [[field-programmable gate array]] (FPGA) design. It refers to the presence of many independent [[arithmetic]] units or entire [[microprocessor]]s that run in parallel. Early examples of such systems are the [[Distributed Array Processor]], the [[Goodyear MPP]], and the [[Connection Machine]].
Today's most powerful [[supercomputer]]s are all ''MPP'' systems, such as the [[Earth Simulator]], [[Blue Gene]], [[ASCI White]], [[ASCI Red]], [[ASCI Purple]], and [[ASCI Thor's Hammer]].
In this class of computing, all of the processing elements are connected together to act as one very large computer. This is in contrast to [[distributed computing]], where massive numbers of separate computers are used to solve a single problem.
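As a minimal illustrative sketch (not drawn from any particular MPP system named above), the following [[CUDA]] program shows the basic idea: a single computation is spread across a very large number of lightweight processing elements, each performing the same arithmetic on its own slice of the data.

<syntaxhighlight lang="cuda">
#include <cstdio>

// Each GPU thread plays the role of one independent processing element,
// executing the same arithmetic on a different element of the data.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];   // one independent arithmetic operation per element
    }
}

int main() {
    const int n = 1 << 20;                 // one million elements
    size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch many threads at once: 256 per block, enough blocks to cover n.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);           // expect 3.0

    cudaFree(a);
    cudaFree(b);
    cudaFree(c);
    return 0;
}
</syntaxhighlight>

The example runs on a single device with shared memory; in a massively parallel system the same pattern is scaled up to thousands of processors connected by a dedicated interconnect, whereas a distributed-computing approach would partition the data across many separate, loosely coupled machines.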