In [[computer science]], '''array programming''' refers to solutions that allow the application of operations to an entire set of values at once. Such solutions are commonly used in [[computational science|scientific]] and engineering settings.
Modern programming languages that support array programming (also known as [[vector (data structure)|vector]] or [[multidimensional analysis|multidimensional]] languages) have been engineered specifically to generalize operations on [[scalar (computing)|scalar]]s to apply transparently to [[vector (geometric)|vector]]s, [[matrix (mathematics)|matrices]], and higher-dimensional arrays. These include [[APL (programming language)|APL]], [[J (programming language)|J]], [[Fortran]], [[MATLAB]], [[Analytica (software)|Analytica]], [[GNU Octave|Octave]], [[R (programming language)|R]], [[Cilk Plus]], [[Julia (programming language)|Julia]], [[Perl Data Language|Perl Data Language (PDL)]] and [[Raku (programming language)|Raku]]. In these languages, an operation that applies to entire arrays can be called a ''vectorized'' operation,<ref>{{cite journal |title=The NumPy array: a structure for efficient numerical computation |author=Stéfan van der Walt |author2=S. Chris Colbert |author3=Gaël Varoquaux |name-list-style=amp |journal=Computing in Science and Engineering |volume=13 |issue=2 |pages=22–30 |publisher=IEEE |year=2011 |doi=10.1109/mcse.2011.37|bibcode=2011CSE....13b..22V |arxiv=1102.1523 |s2cid=16907816 }}</ref> regardless of whether it is executed on a [[vector processor]], which implements vector instructions. Array programming primitives concisely express broad ideas about data manipulation. The level of concision can be dramatic in certain cases: it is not uncommon{{Example needed|date=September 2021}} to find array programming language [[one-liner program|one-liners]] that would require several pages of object-oriented code.
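For illustration, a minimal sketch in Python using the [[NumPy]] library (discussed under third-party libraries below); the array values here are assumptions chosen purely for the example:

<syntaxhighlight lang="python">
import numpy as np

# Sample 3x3 arrays (illustrative values only).
a = np.arange(9.0).reshape(3, 3)
b = np.ones((3, 3))

# One vectorized expression applies the arithmetic to every element;
# no explicit loops over rows and columns are written.
c = 2.0 * a + b

# Reductions are equally terse: elementwise product summed to a scalar.
total = (a * b).sum()
</syntaxhighlight>

The equivalent scalar-oriented code would require explicit nested loops over every index of the arrays.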
==Concepts of array==
==Uses==
Array programming is very well suited to [[implicit parallelization]], an active topic of research. Further, [[Intel]] and compatible CPUs developed and produced after 1997 contained various instruction set extensions, starting with [[MMX (instruction set)|MMX]] and continuing through [[3DNow!]] and [[SSSE3]], which include rudimentary [[Single instruction, multiple data|SIMD]] array capabilities. This has continued into the 2020s with instruction sets such as [[AVX-512]], making modern CPUs sophisticated vector processors. Array processing is distinct from [[parallel computing|parallel processing]] in that one physical processor performs operations on a group of items simultaneously, while parallel processing aims to split a larger problem into smaller ones ([[Multiple instruction, multiple data|MIMD]]) to be solved piecemeal by numerous processors. Processors with [[Multi-core processor|multiple cores]] and [[Graphics processing unit|GPU]]s with thousands of [[General-purpose computing on graphics processing units (software)|general computing cores]] are common as of 2023.
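A rough sketch of this distinction, assuming Python with NumPy and the standard <code>multiprocessing</code> module (the chunk count and input data are illustrative assumptions, not part of any particular language's specification):

<syntaxhighlight lang="python">
import numpy as np
from multiprocessing import Pool

def work(chunk):
    # Each worker process handles its own piece of the problem (MIMD style).
    return np.sqrt(chunk) * 2.0

if __name__ == "__main__":
    data = np.random.rand(1_000_000)

    # Array style: one expression over the whole array; the underlying
    # library is free to use SIMD vector instructions on a single core.
    array_style = np.sqrt(data) * 2.0

    # Parallel-processing style: split the array into chunks solved
    # piecemeal by separate worker processes, then reassemble the result.
    with Pool(4) as pool:
        parts = pool.map(work, np.array_split(data, 4))
    mimd_style = np.concatenate(parts)

    assert np.allclose(array_style, mimd_style)
</syntaxhighlight>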
==Languages==
==Third-party libraries==
The use of specialized and efficient libraries to provide more terse abstractions is also common in other programming languages. In [[C++]], several linear algebra libraries exploit the language's ability to [[operator overloading|overload operators]]. In some cases a very terse abstraction in those languages is explicitly influenced by the array programming paradigm, as in the [[NumPy]] extension library for [[Python (programming language)|Python]] and the [[Armadillo (C++ library)|Armadillo]] and [[Blitz++]] libraries for C++.<ref>{{cite web |title= Reference for Armadillo 1.1.8. Examples of Matlab/Octave syntax and conceptually corresponding Armadillo syntax. |url=
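A minimal sketch of the kind of terse, operator-based abstraction described above, using NumPy (the matrix and vector values are assumptions chosen for the example):

<syntaxhighlight lang="python">
import numpy as np

# Overloaded operators let library array types read like built-in scalars.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)   # solve the linear system A @ x = b
residual = A @ x - b        # overloaded @ (matrix product) and - (subtraction)
</syntaxhighlight>

Comparable C++ code with Armadillo or Blitz++ relies on the same idea: operator overloading lets whole-array expressions replace explicit element-by-element loops.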
==See also==