A color scanned display was implemented in the late 1960s, called the [[Brookhaven National Laboratory|Brookhaven]] RAster Display (BRAD), which used a [[drum memory]] and a television monitor.<ref>{{citation |author1=D. Ophir |author2=S. Rankowitz |author3=B. J. Shepherd |author4=R. J. Spinrad |title=BRAD: The Brookhaven Raster Display |work=Communications of the ACM |volume=11 |number=6 |date=June 1968 |pages=415–416 |doi=10.1145/363347.363385|s2cid=11160780 |doi-access=free }}</ref> In 1969, A. Michael Noll of [[Bell Labs]] implemented a scanned display with a frame buffer, using [[magnetic-core memory]].<ref>{{cite journal |last=Noll |first=A. Michael |title=Scanned-Display Computer Graphics |journal=Communications of the ACM |volume=14 |number=3 |date=March 1971 |pages=145–150 |doi=10.1145/362566.362567|s2cid=2210619 |doi-access=free }}</ref> The Bell Labs system was later expanded to display an image with a color depth of three bits on a standard color TV monitor.
In the early 1970s, the development of [[MOS memory]] ([[metal–oxide–semiconductor]] memory) [[Integrated circuit|integrated-circuit]] chips, particularly [[large-scale integration|high-density]] [[DRAM]] (dynamic [[random-access memory]]) chips with at least 1{{nbsp}}[[kibibit|kb]] of memory, made it practical to create, for the first time, a [[digital memory]] system with framebuffers capable of holding a standard video image.<ref name="Shoup_SuperPaint"/><ref>{{cite conference |last1=Goldwasser |first1=S.M. |title=Computer Architecture For Interactive Display Of Segmented Imagery |conference=Computer Architectures for Spatially Distributed Data |date=June 1983 |publisher=[[Springer Science & Business Media]] |isbn=9783642821509}}</ref>
In 1974, [[Evans & Sutherland]] released the first commercial framebuffer, the Picture System,<ref>{{citation |title=Picture System |url=http://s3data.computerhistory.org/brochures/evanssutherland.3d.1974.102646288.pdf |publisher=Evans & Sutherland |access-date=2017-12-31}}</ref> costing about $15,000. It was capable of producing resolutions of up to 512 by 512 pixels in 8-bit [[grayscale]], and became a boon for graphics researchers who did not have the resources to build their own framebuffer. The [[New York Institute of Technology]] would later create the first 24-bit color system using three of the Evans & Sutherland framebuffers.<ref name="NYIT-History">{{cite web |url=https://www.cs.cmu.edu/~ph/nyit/masson/nyit.html |title=History of the New York Institute of Technology Graphics Lab |access-date=2007-08-31}}</ref> Each framebuffer was connected to an [[RGB color model|RGB]] color output (one for red, one for green and one for blue), with a Digital Equipment Corporation PDP 11/04 [[minicomputer]] controlling the three devices as one.
With a framebuffer, the electron beam (if the display technology uses one) is commanded to perform a [[raster scan]], the way a [[television]] renders a broadcast signal. The color information for each point thus displayed on the screen is pulled directly from the framebuffer during the scan, creating a set of discrete picture elements, i.e. pixels.
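A minimal sketch in C of this arrangement, assuming a hypothetical packed 24-bit RGB framebuffer of 640 × 480 pixels (the dimensions, memory layout and the <code>emit_pixel</code> callback are illustrative assumptions, not any particular device interface):

<syntaxhighlight lang="c">
#include <stdint.h>

/* Hypothetical dimensions and a packed 24-bit RGB framebuffer:
   one byte each for red, green and blue per pixel, stored row by row. */
#define WIDTH  640
#define HEIGHT 480

static uint8_t framebuffer[HEIGHT][WIDTH][3];

/* The display hardware effectively performs a loop like this once per
   refresh: it scans the rows top to bottom and, for each pixel on the
   current scan line, reads its color directly out of the framebuffer. */
void raster_scan(void (*emit_pixel)(uint8_t r, uint8_t g, uint8_t b))
{
    for (int y = 0; y < HEIGHT; y++) {        /* one scan line per row */
        for (int x = 0; x < WIDTH; x++) {     /* one pixel per column  */
            emit_pixel(framebuffer[y][x][0],
                       framebuffer[y][x][1],
                       framebuffer[y][x][2]);
        }
    }
}
</syntaxhighlight>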
Framebuffers differ significantly from the [[vector display]]s that were common prior to the advent of raster graphics (and, consequently, to the concept of a framebuffer). With a vector display, only the [[vertex (geometry)|vertices]] of the graphics primitives are stored. The [[Cathode ray|electron beam]] of the output display is then commanded to move from vertex to vertex, tracing a line across the area between these points.
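A vector display can be modelled, very loosely, as a display list holding only endpoints; the sketch below assumes a hypothetical list of vertices and a <code>draw_line</code> callback standing in for the beam-deflection hardware:

<syntaxhighlight lang="c">
#include <stddef.h>

/* Hypothetical display list for a vector display: only the endpoints of
   the primitives are stored, not the pixels along them. */
struct point { int x, y; };

static const struct point display_list[] = {
    {100, 100}, {300, 100}, {300, 200}, {100, 200}, {100, 100}  /* a box */
};

/* The refresh hardware walks the list, commanding the beam to trace a
   straight line from each vertex to the next. */
void vector_refresh(void (*draw_line)(struct point from, struct point to))
{
    size_t n = sizeof display_list / sizeof display_list[0];
    for (size_t i = 0; i + 1 < n; i++)
        draw_line(display_list[i], display_list[i + 1]);
}
</syntaxhighlight>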
Likewise, framebuffers differ from the technology used in early [[text mode]] displays, where a buffer holds codes for characters, not individual pixels. The video display device performs the same raster scan as with a framebuffer, but generates the pixels of each character in the buffer as it directs the beam.
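A rough illustration of this scheme, assuming a hypothetical 80 × 25 character buffer and an 8 × 8 font ROM (the sizes and names are assumptions for the example, not any specific hardware):

<syntaxhighlight lang="c">
#include <stdint.h>

/* Hypothetical 80x25 text-mode buffer: each cell stores a character code,
   not pixels. An 8x8 font ROM supplies the glyph bitmaps. */
#define COLS 80
#define ROWS 25

static uint8_t text_buffer[ROWS][COLS];   /* character codes              */
static uint8_t font_rom[256][8];          /* 8 bytes (8x8 bits) per glyph */

/* While scanning, the hardware looks up the character under the beam and
   expands one row of its glyph into pixels on the fly; no per-pixel
   framebuffer is ever stored. */
int text_mode_pixel(int x, int y)          /* returns 1 = lit, 0 = dark   */
{
    uint8_t code = text_buffer[y / 8][x / 8];   /* which character cell   */
    uint8_t bits = font_rom[code][y % 8];       /* one glyph scan line    */
    return (bits >> (7 - (x % 8))) & 1;         /* pick the pixel's bit   */
}
</syntaxhighlight>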