{{short description|Portion of random-access memory containing a bitmap that drives a video display}}
{{Use American English|date=November 2024}}
[[File:Sun sbus cgsix framebuffer.jpg|thumb|Sun TGX Framebuffer]]
== History ==
[[File:SWAC 003.jpg|thumb|Memory pattern on [[SWAC (computer)|SWAC]] Williams tube CRT in 1951]]
Computer researchers{{who|date=July 2017}} had long discussed the theoretical advantages of a framebuffer, but were unable to produce a machine with sufficient memory at an economically practicable cost.
A color scanned display was implemented in the late 1960s, called the Brookhaven RAster Display (BRAD), which used a [[drum memory]] and a television monitor.
In the early 1970s, the development of [[MOS memory]] ([[metal–oxide–semiconductor]] memory) [[Integrated circuit|integrated-circuit]] chips, particularly [[large-scale integration|high-density]] [[DRAM]] (dynamic [[random-access memory]]) chips with at least 1{{nbsp}}[[kibibit|kb]] memory, made it practical to create, for the first time, a [[digital memory]] system with framebuffers capable of holding a standard video image.<ref name="Shoup_SuperPaint"/><ref>{{cite conference |last1=Goldwasser |first1=S.M. |title=Computer Architecture For Interactive Display Of Segmented Imagery |conference=Computer Architectures for Spatially Distributed Data |date=June 1983 |publisher=[[Springer Science & Business Media]] |isbn=9783642821509 |pages=75–94 (81) |url=https://books.google.com/books?id=8MuoCAAAQBAJ&pg=PA81}}</ref> This led to the development of the [[SuperPaint]] system by [[Richard Shoup (programmer)|Richard Shoup]] at [[Xerox PARC]] in 1972.<ref name="Shoup_SuperPaint"/>
In 1974, [[Evans & Sutherland]] released the first commercial framebuffer, the Picture System,<ref>{{citation |title=Picture System |url=http://s3data.computerhistory.org/brochures/evanssutherland.3d.1974.102646288.pdf |publisher=Evans & Sutherland |access-date=2017-12-31}}</ref> costing about $15,000. It was capable of producing resolutions of up to 512 by 512 pixels in 8-bit [[grayscale]], and became a boon for graphics researchers who did not have the resources to build their own framebuffer. The [[New York Institute of Technology]] would later create the first 24-bit color system using three of the Evans & Sutherland framebuffers.<ref name="NYIT-History">{{cite web |url=https://www.cs.cmu.edu/~ph/nyit/masson/nyit.html |title=History of the New York Institute of Technology Graphics Lab |access-date=2007-08-31}}</ref> Each framebuffer was connected to an [[RGB]] color output (one for red, one for green and one for blue), with a Digital Equipment Corporation PDP 11/04 [[minicomputer]] controlling the three devices as one.
Framebuffers used in personal and home computing often had sets of defined ''modes'' under which the framebuffer could operate. These modes reconfigure the hardware to output different resolutions, color depths, memory layouts and [[refresh rate]] timings.
In the world of [[Unix]] machines and operating systems, such conveniences were usually eschewed in favor of directly manipulating the hardware settings. This manipulation was far more flexible in that any resolution, color depth and refresh rate was attainable, limited only by the memory available to the framebuffer.
An unfortunate side effect of this method was that the [[display device]] could be driven beyond its capabilities. In some cases, this resulted in hardware damage to the display.<ref>{{cite web |url=http://tldp.org/HOWTO/XFree86-Video-Timings-HOWTO/overd.html |title=XFree86 Video Timings HOWTO: Overdriving Your Monitor}}</ref> More commonly, it simply produced garbled and unusable output. Modern CRT monitors fix this problem through the introduction of protection circuitry. When the display mode is changed, the monitor attempts to obtain a signal lock on the new refresh frequency. If the monitor is unable to obtain a signal lock, it ignores the framebuffer signal rather than attempting to display it, which prevents damage.
LCD monitors tend to contain similar protection circuitry, but for different reasons. Since the LCD must digitally sample the display signal (thereby emulating an electron beam), any signal that is out of range cannot be physically displayed on the monitor.
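On Unix-like systems that expose the framebuffer as a device file, the current hardware settings can be inspected directly from a program. The following is a minimal sketch using the Linux ''fbdev'' interface; the device path <code>/dev/fb0</code> is an assumption and varies between systems.

<syntaxhighlight lang="c">
/* Minimal sketch: query the current framebuffer mode through the Linux
 * fbdev interface.  Assumes the first framebuffer is exposed as /dev/fb0. */
#include <fcntl.h>      /* open, O_RDONLY */
#include <linux/fb.h>   /* struct fb_var_screeninfo, FBIOGET_VSCREENINFO */
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>     /* close */

int main(void)
{
    int fd = open("/dev/fb0", O_RDONLY);
    if (fd < 0) {
        perror("open /dev/fb0");
        return 1;
    }

    struct fb_var_screeninfo vinfo;
    if (ioctl(fd, FBIOGET_VSCREENINFO, &vinfo) < 0) {
        perror("FBIOGET_VSCREENINFO");
        close(fd);
        return 1;
    }

    /* Resolution and color depth of the mode the hardware is currently in. */
    printf("%ux%u, %u bits per pixel\n",
           vinfo.xres, vinfo.yres, vinfo.bits_per_pixel);

    close(fd);
    return 0;
}
</syntaxhighlight>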
== Color palette ==
Framebuffers have traditionally supported a wide variety of color modes. Due to the expense of memory, most early framebuffers used 1-bit (2 colors per pixel), 2-bit (4 colors), 4-bit (16 colors) or 8-bit (256 colors) color depths. The problem with such small color depths is that a full range of colors cannot be produced. The solution to this problem was [[indexed color]], which adds a [[lookup table]] to the framebuffer. Each value stored in framebuffer memory acts as an index into this table. The lookup table serves as a palette with a limited number of different colors, each stored at the display's full color depth; the value stored for each pixel selects which palette entry is sent to the display.
Here is a typical indexed 256-color image and its palette (shown as a rectangle of swatches):
:{| style="border-style: none" border="0" cellpadding="0"
|}
In some designs it was also possible to write data to the lookup table (or switch between existing palettes) on the fly, allowing the picture to be divided into horizontal bars, each with its own palette, and thus to display an image with a far wider effective range of colors than the color depth would otherwise allow.
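As an illustration of the lookup step, the following minimal C sketch expands an 8-bit indexed pixel buffer through a color lookup table into full-depth RGB values; the type and function names are hypothetical, not taken from any particular hardware.

<syntaxhighlight lang="c">
#include <stddef.h>
#include <stdint.h>

/* One palette entry: the full-depth color selected by an 8-bit index. */
struct rgb {
    uint8_t r, g, b;
};

/* Expand an indexed framebuffer into true-color pixels.
 * indexed: one byte per pixel, each byte an index into the palette
 * palette: the lookup table (CLUT) with 256 full-depth entries
 * out:     receives one struct rgb per pixel
 */
void expand_indexed(const uint8_t *indexed, size_t pixel_count,
                    const struct rgb palette[256], struct rgb *out)
{
    for (size_t i = 0; i < pixel_count; i++) {
        out[i] = palette[indexed[i]];   /* one table lookup per pixel */
    }
}
</syntaxhighlight>

Hardware with a lookup table performs the equivalent of this loop during scan-out, which is why rewriting a palette entry, or switching palettes, immediately recolors every pixel that references it without touching framebuffer memory.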
== Memory access ==
== Page flipping ==
A frame buffer may be designed with enough memory to store two frames' worth of video data. In a technique known generally as [[double buffering]] or more specifically as [[page flipping]], the framebuffer uses half of its memory to display the current frame. While that memory is being displayed, the other half of memory is filled with data for the next frame. Once the secondary buffer is filled, the framebuffer is instructed to display the secondary buffer instead. The primary buffer becomes the secondary buffer, and the secondary buffer becomes the primary. This switch is often done after the [[vertical blanking interval]] to avoid [[screen tearing]] where half the old frame and half the new frame are shown together.
Page flipping has become a standard technique used by PC [[game programmer]]s.
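A minimal sketch of a page-flipping render loop follows; the routines <code>render_frame()</code>, <code>wait_for_vertical_blank()</code> and <code>set_scanout_buffer()</code> are hypothetical placeholders for whatever the platform or driver actually provides.

<syntaxhighlight lang="c">
#include <stdint.h>

#define WIDTH  640
#define HEIGHT 480

/* Two full frames of pixel storage: one being shown, one being drawn. */
static uint32_t buffers[2][WIDTH * HEIGHT];

/* Hypothetical platform services. */
extern void render_frame(uint32_t *buffer);        /* draw the next frame      */
extern void wait_for_vertical_blank(void);         /* wait for the VBI         */
extern void set_scanout_buffer(uint32_t *buffer);  /* point display at buffer  */

void render_loop(void)
{
    int front = 0;                 /* buffer currently being displayed */
    for (;;) {
        int back = 1 - front;      /* buffer hidden from the display   */

        render_frame(buffers[back]);        /* fill the hidden buffer        */

        wait_for_vertical_blank();          /* flip between refreshes        */
        set_scanout_buffer(buffers[back]);  /* to avoid tearing              */

        front = back;              /* the two buffers swap roles        */
    }
}
</syntaxhighlight>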
== Graphics accelerators ==
As the demand for better graphics increased, hardware manufacturers created a way to decrease the amount of [[CPU]] time required to fill the framebuffer. This is commonly called ''graphics acceleration''. Common graphics drawing commands (many of them geometric) are sent to the graphics accelerator in their raw form. The accelerator then [[Rasterisation|rasterizes]] the results of the command to the framebuffer. This method frees the CPU to do other work.
Early accelerators focused on improving the performance of 2D [[GUI]] systems. While retaining these 2D capabilities, most modern accelerators focus on producing 3D imagery in real time. A common design uses a [[graphics library]] such as [[OpenGL]] or [[Direct3D]] which interfaces with the graphics driver to translate received commands to instructions for the accelerator's [[graphics processing unit]] (GPU). The GPU uses those instructions to compute the rasterized results and the results are [[bit blit]]ted to the framebuffer. The framebuffer's signal is then produced in combination with built-in video overlay devices (usually used to produce the mouse cursor without modifying the framebuffer's data) and any final special effects that are produced by modifying the output signal. An example of such final special effects was the [[spatial anti-aliasing]] technique used by the [[3dfx Voodoo]] cards. These cards add a slight blur to the output signal that makes aliasing of the rasterized graphics much less obvious.
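As an illustration of this division of labor, the sketch below issues a single clear command instead of writing pixels from the CPU; the GPU rasterizes the result into the framebuffer and the buffer swap presents it. OpenGL with the GLFW library for window and context creation is used here only as one possible choice of API.

<syntaxhighlight lang="c">
/* Minimal sketch: the CPU issues drawing commands; the accelerator fills
 * the framebuffer.  Build with something like: cc demo.c -lglfw -lGL */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    GLFWwindow *window = glfwCreateWindow(640, 480, "Accelerated clear", NULL, NULL);
    if (!window) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(window);

    while (!glfwWindowShouldClose(window)) {
        /* One command replaces a CPU loop over every pixel of the frame. */
        glClearColor(0.0f, 0.2f, 0.4f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        glfwSwapBuffers(window);   /* present the finished frame */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
</syntaxhighlight>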
At one time there were many manufacturers of graphics accelerators, including: [[3dfx Interactive]]; [[ATI Technologies|ATI]]; [[Hercules Computer Technology|Hercules]]; [[Trident Microsystems|Trident]]; [[Nvidia]]; [[Radius (hardware company)|Radius]]; [[S3 Graphics]]; [[Silicon Integrated Systems|SiS]] and [[Silicon Graphics]]. {{As of|2015}}, the market for graphics accelerators for x86-based systems is dominated by Nvidia (which acquired 3dfx in 2002), [[AMD]] (which acquired ATI in 2006), and [[Intel]].
== Comparisons ==
With a framebuffer, the electron beam (if the display technology uses one) is commanded to perform a [[raster scan]], the way a [[television]] renders a broadcast signal. The color information for each point thus displayed on the screen is pulled directly from the framebuffer during the scan, creating a set of discrete picture elements, i.e., pixels.
Framebuffers differ significantly from the [[vector display]]s that were common prior to the advent of raster graphics (and, consequently, to the concept of a framebuffer). With a vector display, only the [[vertex (geometry)|vertices]] of the graphics primitives are stored. The [[electron beam]] of the output display is then commanded to move from vertex to vertex, tracing a line across the area between these points.
Likewise, framebuffers differ from the technology used in early [[text mode]] displays, where a buffer holds codes for characters, not individual pixels. The video display device performs the same raster scan as with a framebuffer, but generates the pixels of each character in the buffer as it directs the beam.
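The storage difference is easy to see in code. The sketch below models a character cell similar to the classic IBM PC color text mode, where the buffer stores a character code and an attribute byte per cell rather than the pixels of the glyph; the exact layout varies between display adapters.

<syntaxhighlight lang="c">
#include <stdint.h>

/* One cell of a character-mode display buffer: a character code plus an
 * attribute byte, rather than the individual pixels of the glyph. */
struct text_cell {
    uint8_t character;  /* code point in the adapter's character set       */
    uint8_t attribute;  /* low nibble: foreground color, high: background  */
};

/* An 80x25 text screen needs only 80 * 25 * 2 = 4,000 bytes, while even a
 * 1-bit-per-pixel 640x350 framebuffer of the same era needed 28,000 bytes. */
static struct text_cell text_screen[25][80];

void put_char(int row, int col, uint8_t ch, uint8_t attr)
{
    text_screen[row][col].character = ch;
    text_screen[row][col].attribute = attr;
}
</syntaxhighlight>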
== See also ==