During most of the [[1980]]s, hardware speeds were in the low single-digit [[megahertz]] range and [[memory (computer)|memory]] was measured in mere [[kilobytes]]. Under these constraints, video game programmers resorted to extreme measures to speed up the process of writing bitmaps onto the display. A sprite engine is [[hardwired]] into a computer or video game system's architecture. The [[central processing unit|central processor]] can instruct the engine to fetch source images and integrate them into the main screen using [[direct memory access]] channels. (This is related to what a [[genlock]] does with video sources and to a [[playfield (computer science)|playfield]].) Invoking sprite hardware, instead of using the processor alone, greatly improved graphics performance: because the processor is not occupied with the simple task of transferring data from one place to another, software can run faster; and because the hardware provides certain innate abilities, programs that used sprites were also smaller.
Separate locations in memory were used to hold the main display and the sprites, which were composited together into the display in two passes. This placed the sprites on the display without interfering with the "background" image, making them easy to move around the display. Examples of such systems include the [[Atari 8-bit]] machines (which referred to them as ''player/missile graphics'') and the [[Commodore 64]].
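The two-pass approach above can be illustrated with a minimal software model: the background and the sprite live in separate buffers, and compositing overlays the sprite onto a copy of the background, skipping a designated transparent color so the background shows through. This is a hypothetical sketch of the general technique, not the behavior of any particular chip; the function name, buffer layout, and color-key value are illustrative assumptions.

```python
def composite(background, sprite, x, y, transparent=0):
    """Overlay `sprite` onto a copy of `background` at pixel (x, y).

    Pixels equal to `transparent` are skipped, so the background shows
    through. The background buffer itself is never modified, which is
    what lets a hardware sprite move without redrawing the scene.
    (Illustrative model only; real sprite hardware does this per
    scanline, not on whole frame buffers.)
    """
    frame = [row[:] for row in background]  # copy: background stays intact
    for sy, row in enumerate(sprite):
        for sx, pixel in enumerate(row):
            if pixel == transparent:
                continue  # color key: let the background show through
            if 0 <= y + sy < len(frame) and 0 <= x + sx < len(frame[0]):
                frame[y + sy][x + sx] = pixel
    return frame

# A 6x4 background of color 1, and a 2x2 sprite using color 7,
# with 0 as the transparent color key.
background = [[1] * 6 for _ in range(4)]
sprite = [[0, 7],
          [7, 7]]
frame = composite(background, sprite, 2, 1)
```

Moving the sprite is then just calling `composite` again with new coordinates against the same untouched background, rather than repairing the pixels the sprite previously covered.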
* [[MOS Technology VIC-II]] and the [[ANTIC|Atari ANTIC]] were well-featured sprite-handling chips of the 8-bit era
* The [[Original Amiga chipset|Amiga custom chip set]] carried the torch on to the 16- and 32-bit systems.