NAS Parallel Benchmarks

The '''NAS Parallel Benchmarks''' ('''NPB''') are a set of [[benchmark (computing)|benchmark]]s targeting the performance evaluation of highly parallel [[supercomputer]]s. They are developed and maintained by the [[NASA Advanced Supercomputing facility|NASA Advanced Supercomputing Division]] (NAS, formerly the [[NASA]] Numerical Aerodynamic Simulation Program) based at the [[NASA Ames Research Center]]. NAS solicits performance results for NPB from all sources.<ref name="npbweb">{{cite web
|title=NAS Parallel Benchmarks Changes
|url=http://www.nas.nasa.gov/Resources/Software/npb.html
}}</ref>
==History==
===Motivation===
Traditional benchmarks that existed before NPB, such as the [[Livermore loops]], the [[LINPACK|LINPACK Benchmark]] and the [[NAS Kernel Benchmark Program]], were usually specialized for vector computers. They generally suffered from inadequacies including parallelism-impeding tuning restrictions and insufficient problem sizes, which rendered them inappropriate for highly parallel systems. Equally unsuitable were full-scale application benchmarks, due to their high porting cost and the unavailability of automatic software parallelization tools.<ref name="rnr94007">{{Citation
|last1=Bailey|first1=D.
|last2=Barszcz|first2=E.
|contribution-url=http://www.nersc.gov/~simon/Papers/NASA/RNR-94-007.pdf
|title=NAS Technical Report RNR-94-007
|publisher=NASA Ames Research Center, Moffett Field, CA
|date=March 1994
}}</ref> As a result, NPB were released in 1991 to address the ensuing lack of benchmarks applicable to highly parallel machines.
* capability of accommodating new systems with increased power,
* and ready distributability.
In the light of these guidelines, it was deemed the only viable approach to use a collection of "paper and pencil" benchmarks that specified a set of problems only algorithmically and left most implementation details to the implementor's discretion under certain necessary limits.
 
NPB 1 defined eight benchmarks, each in two problem sizes. Sample code written in [[Fortran 77]] was supplied but was not intended for benchmarking purposes.<ref name="rnr94007"/>
 
===NPB 2===
Since its release, NPB 1 had displayed two major weaknesses. First, because of its "paper and pencil" style of specification, computer vendors usually tuned their implementations so heavily that their performance became difficult for scientific programmers to attain. Moreover, many of these implementations were proprietary and not publicly available, effectively concealing their optimization techniques. Second, the problem sizes of NPB 1 lagged behind the development of supercomputers as the latter continued to evolve.<ref name="nas95020">{{Citation
|last1=Bailey|first1=D.
|last2=Harris|first2=T.
|contribution-url=http://www.nas.nasa.gov/News/Techreports/1995/PDF/nas-95-020.pdf
|title=NAS Technical Report NAS-95-020
|publisher=NASA Ames Research Center, Moffett Field, CA
|date=December 1995
}}
</ref>
 
NPB 2, released in 1996<ref name=npb2.2>{{Citation
|last1=Saphir|first1=W.
|last2=van der Wijngaart|first2=R.
|last3=Woo|first3=A.
|last4=Yarrow|first4=M.
|contribution=New Implementations and Results for the NAS Parallel Benchmarks 2
|contribution-url=http://www.nas.nasa.gov/Resources/Software/npb_2.2.pdf
|publisher=NASA Ames Research Center, Moffett Field, CA
}}
</ref><ref name="nas02007">{{Citation
|last1=van der Wijngaart|first1=R.
|contribution=NAS Parallel Benchmarks Version 2.4
|contribution-url=http://www.nas.nasa.gov/News/Techreports/2002/PDF/nas-02-007.pdf
|title=NAS Technical Report NAS-02-007
|date=October 2002
|publisher=NASA Ames Research Center, Moffett Field, CA
}}
</ref>, came with source code implementations for five of the eight benchmarks defined in NPB 1, supplementing rather than replacing NPB 1. It updated the benchmarks with a new, up-to-date problem size and amended the rules for submitting benchmarking results. The new rules explicitly requested output files as well as modified source files and build scripts, to ensure public availability of the modifications and reproducibility of the results.<ref name=nas95020/>
 
==References==