[[File:Sun sbus cgsix framebuffer.jpg|thumb|Sun TGX Framebuffer]]
 
A '''framebuffer''' ('''frame buffer''', or sometimes '''framestore''') is a portion of [[random-access memory]] (RAM)<ref>{{cite web|url=http://www.webopedia.com/TERM/F/frame_buffer.html|title=What is frame buffer? A Webopedia Definition|work=webopedia.com|date=June 1998 }}</ref> containing a [[bitmap]] that drives a video display. It is a [[Data buffer|memory buffer]] containing data representing all the [[pixel]]s in a complete [[video frame]].<ref>{{cite web |url=http://www.sunhelp.org/faq/FrameBuffer.html#00 |title=Frame Buffer FAQ |access-date=14 May 2014 }}</ref> Modern [[video card]]s contain framebuffer circuitry in their cores. This circuitry converts an in-memory bitmap into a [[video signal]] that can be displayed on a computer monitor.
 
In [[computing]], a '''screen buffer''' is a part of [[computer memory]] used by a computer application for the representation of the content to be shown on the [[computer display]].<ref name="google">{{cite book|title=.NET Framework Solutions: In Search of the Lost Win32 API|author=Mueller, J.|date=2002|publisher=Wiley|isbn=9780782141344|url=https://books.google.com/books?id=XYQruTc6_44C|page=160|access-date=2015-04-21}}</ref> The screen buffer may also be called the '''video buffer''', the '''regeneration buffer''', or '''regen buffer''' for short.<ref name="smartcomputing">{{cite web|url=http://www.smartcomputing.com/editorial/dictionary/detail.asp?searchtype=2&DicID=10421&RefType=Dictionary&guid=|archive-url=https://web.archive.org/web/20120324192310/http://www.smartcomputing.com/editorial/dictionary/detail.asp?searchtype=2&DicID=10421&RefType=Dictionary&guid= |archive-date=2012-03-24 |url-status=dead|title=Smart Computing Dictionary Entry - video buffer|access-date=2015-04-21}}</ref> A screen buffer should be distinguished from [[video memory]]; to emphasize this distinction, the term '''off-screen buffer''' is also used.
 
The information in the buffer typically consists of color values for every pixel to be shown on the display. Color values are commonly stored in 1-bit [[binary image|binary]] (monochrome), 4-bit [[palette (computing)|palettized]], 8-bit palettized, 16-bit [[high color]] and 24-bit [[Color depth#True color .2824-bit.29|true color]] formats. An additional [[Alpha compositing|alpha channel]] is sometimes used to retain information about pixel transparency. The total amount of memory required for the framebuffer depends on the [[Display resolution|resolution]] of the output signal, and on the [[color depth]] or palette size.
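The relationship between resolution, color depth, and memory requirement is simple arithmetic. A minimal sketch (the function name is illustrative, not from any particular API):

```python
def framebuffer_size_bytes(width, height, bits_per_pixel):
    """Total framebuffer memory in bytes for a given resolution and color depth."""
    # Each pixel needs bits_per_pixel bits; round the total up to whole bytes.
    return (width * height * bits_per_pixel + 7) // 8

# A 1920x1080 display at 24-bit true color needs about 6 MB:
print(framebuffer_size_bytes(1920, 1080, 24))  # 6220800
```

So doubling the linear resolution quadruples the memory requirement, while moving from 8-bit palettized to 24-bit true color triples it.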
 
== History ==
In the early 1970s, the development of [[MOS memory]] ([[metal–oxide–semiconductor]] memory) [[Integrated circuit|integrated-circuit]] chips, particularly [[large-scale integration|high-density]] [[DRAM]] (dynamic [[random-access memory]]) chips with at least 1{{nbsp}}[[kibibit|kb]] memory, made it practical to create, for the first time, a [[digital memory]] system with framebuffers capable of holding a standard video image.<ref name="Shoup_SuperPaint"/><ref>{{cite conference |last1=Goldwasser |first1=S.M. |title=Computer Architecture For Interactive Display Of Segmented Imagery |conference=Computer Architectures for Spatially Distributed Data |date=June 1983 |publisher=[[Springer Science & Business Media]] |isbn=9783642821509 |pages=75–94 (81) |url=https://books.google.com/books?id=8MuoCAAAQBAJ&pg=PA81}}</ref> This led to the development of the [[SuperPaint]] system by [[Richard Shoup (programmer)|Richard Shoup]] at [[Xerox PARC]] in 1972.<ref name="Shoup_SuperPaint">{{cite web |url=http://accad.osu.edu/~waynec/history/PDFs/Annals_final.pdf |archive-url=https://web.archive.org/web/20040612215245/http://accad.osu.edu/~waynec/history/PDFs/Annals_final.pdf |archive-date=2004-06-12 |title=SuperPaint: An Early Frame Buffer Graphics System |author=Richard Shoup |publisher=IEEE |work=Annals of the History of Computing |year=2001 |url-status=dead }}</ref> Shoup was able to use the SuperPaint framebuffer to create an early digital video-capture system. By synchronizing the output signal to the input signal, Shoup was able to overwrite each pixel of data as it shifted in. Shoup also experimented with modifying the output signal using color tables. These color tables allowed the SuperPaint system to produce a wide variety of colors outside the range of the limited 8-bit data it contained. This scheme would later become commonplace in computer framebuffers.
 
In 1974, [[Evans & Sutherland]] released the first commercial framebuffer, the Picture System,<ref>{{citation |title=Picture System |url=http://s3data.computerhistory.org/brochures/evanssutherland.3d.1974.102646288.pdf |publisher=Evans & Sutherland |access-date=2017-12-31}}</ref> costing about $15,000. It was capable of producing resolutions of up to 512 by 512 pixels in 8-bit [[grayscale]], and became a boon for graphics researchers who did not have the resources to build their own framebuffer. The [[New York Institute of Technology]] would later create the first 24-bit color system using three of the Evans & Sutherland framebuffers.<ref name="NYIT-History">{{cite web |url=https://www.cs.cmu.edu/~ph/nyit/masson/nyit.html |title=History of the New York Institute of Technology Graphics Lab |access-date=2007-08-31}}</ref> Each framebuffer was connected to an [[RGB color model|RGB]] color output (one for red, one for green and one for blue), with a Digital Equipment Corporation PDP 11/04 [[minicomputer]] controlling the three devices as one.
 
In 1975, the UK company [[Quantel]] produced the first commercial full-color broadcast framebuffer, the Quantel DFS 3000. It was first used in TV coverage of the [[1976 Summer Olympics|1976 Montreal Olympics]] to generate a [[picture-in-picture]] inset of the Olympic flaming torch while the rest of the picture featured the runner entering the stadium.
 
The rapid improvement of integrated-circuit technology made it possible for many of the home computers of the late 1970s to contain low-color-depth framebuffers. Today, nearly all computers with graphical capabilities utilize a framebuffer for generating the video signal. [[Amiga]] computers, introduced in the 1980s, were designed with particular attention to graphics performance and included a unique [[Hold-And-Modify]] framebuffer mode capable of displaying 4096 colors.
As the demand for better graphics increased, hardware manufacturers created a way to decrease the amount of [[CPU]] time required to fill the framebuffer. This is commonly called ''graphics acceleration''. Common graphics drawing commands (many of them geometric) are sent to the graphics accelerator in their raw form. The accelerator then [[Rasterisation|rasterizes]] the results of the command to the framebuffer. This method frees the CPU to do other work.
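The rasterization step can be illustrated with a minimal software sketch of a "fill rectangle" command, the kind of geometric operation a 2D accelerator carries out in hardware (the function name and the linear row-major buffer layout are illustrative assumptions):

```python
def rasterize_rect(framebuffer, fb_width, x, y, w, h, color):
    """Rasterize a filled-rectangle drawing command into a linear framebuffer.

    This stands in for the work a graphics accelerator performs when it
    receives such a command, sparing the CPU the per-pixel fill loop.
    """
    for row in range(y, y + h):
        for col in range(x, x + w):
            framebuffer[row * fb_width + col] = color

fb = [0] * (8 * 8)                    # an 8x8 framebuffer, one value per pixel
rasterize_rect(fb, 8, 2, 2, 3, 3, 0xFFFFFF)
print(sum(1 for p in fb if p))        # 9 pixels were filled
```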
 
Early accelerators focused on improving the performance of 2D [[Graphical user interface|GUI]] systems. While retaining these 2D capabilities, most modern accelerators focus on producing 3D imagery in real time. A common design uses a [[graphics library]] such as [[OpenGL]] or [[Direct3D]] which interfaces with the graphics driver to translate received commands to instructions for the accelerator's [[graphics processing unit]] (GPU). The GPU uses those instructions to compute the rasterized results and the results are [[bit blit]]ted to the framebuffer. The framebuffer's signal is then produced in combination with built-in video overlay devices (usually used to produce the mouse cursor without modifying the framebuffer's data) and any final special effects that are produced by modifying the output signal. One example of such a final special effect was the [[spatial anti-aliasing]] technique used by the [[3dfx Voodoo]] cards. These cards add a slight blur to the output signal that makes aliasing of the rasterized graphics much less obvious.
 
At one time there were many manufacturers of graphics accelerators, including: [[3dfx Interactive]]; [[ATI Technologies|ATI]]; [[Hercules Computer Technology|Hercules]]; [[Trident Microsystems|Trident]]; [[Nvidia]]; [[Radius (hardware company)|Radius]]; [[S3 Graphics]]; [[Silicon Integrated Systems|SiS]] and [[Silicon Graphics]]. {{as of|2015}} the market for graphics accelerators for x86-based systems is dominated by Nvidia (which acquired 3dfx in 2002), [[AMD]] (which acquired ATI in 2006), and [[Intel]].
With a framebuffer, the electron beam (if the display technology uses one) is commanded to perform a [[raster scan]], the way a [[television]] renders a broadcast signal. The color information for each point thus displayed on the screen is pulled directly from the framebuffer during the scan, creating a set of discrete picture elements, i.e. pixels.
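A raster scan amounts to reading framebuffer memory in a fixed order: left to right along each scanline, top to bottom down the frame. A minimal sketch, assuming a linear row-major buffer:

```python
def scan_out(framebuffer, width, height):
    """Emit pixel values in raster order: left to right, top to bottom.

    During each scan, display hardware pulls the color of every pixel
    directly from framebuffer memory in exactly this order.
    """
    for y in range(height):
        for x in range(width):
            yield framebuffer[y * width + x]

fb = list(range(6))              # a tiny 3x2 framebuffer
print(list(scan_out(fb, 3, 2)))  # [0, 1, 2, 3, 4, 5]
```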
 
Framebuffers differ significantly from the [[vector display]]s that were common prior to the advent of raster graphics (and, consequently, to the concept of a framebuffer). With a vector display, only the [[vertex (geometry)|vertices]] of the graphics primitives are stored. The [[Cathode ray|electron beam]] of the output display is then commanded to move from vertex to vertex, tracing a line across the area between these points.
 
Likewise, framebuffers differ from the technology used in early [[text mode]] displays, where a buffer holds codes for characters, not individual pixels. The video display device performs the same raster scan as with a framebuffer, but generates the pixels of each character in the buffer as it directs the beam.
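The contrast can be sketched in a few lines: the text-mode buffer stores character codes, and pixel rows are generated from a font table only at scan time (the tiny 3×3 glyphs here are purely illustrative stand-ins for a character-generator ROM):

```python
# Hypothetical 3x3 glyph bitmaps standing in for a character-generator ROM.
FONT = {
    'H': ["#.#", "###", "#.#"],
    'I': ["###", ".#.", "###"],
}

def scan_text_buffer(text_buffer, glyph_height=3):
    """Produce pixel rows from a text-mode buffer during the raster scan.

    Unlike a framebuffer, the buffer holds character codes; the pixels of
    each character are generated from the font as each scanline is swept.
    """
    rows = []
    for scanline in range(glyph_height):
        rows.append("".join(FONT[ch][scanline] for ch in text_buffer))
    return rows

for row in scan_text_buffer("HI"):
    print(row)
```

The buffer here holds only two character codes, yet the scan produces a full grid of pixels; that indirection is what made text mode so memory-efficient on early hardware.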