en.wikipedia.org, 2022-12-29 | Main page
---------------------------------------------------------------------------------------
Saved from web.archive.org, with Lynx.
---------------------------------------------------------------------------------------

Framebuffer

   From Wikipedia, the free encyclopedia
   Sun TGX Framebuffer

   A framebuffer (frame buffer, or sometimes framestore) is a portion of
   random-access memory (RAM)^[1] containing a bitmap that drives a video
   display. It is a memory buffer containing data representing all the
   pixels in a complete video frame.^[2] Modern video cards contain
   framebuffer circuitry in their cores. This circuitry converts an
   in-memory bitmap into a video signal that can be displayed on a
   computer monitor.

   In computing, a screen buffer is a part of computer memory used by a
   computer application for the representation of the content to be shown
   on the computer display.^[3] The screen buffer may also be called the
   video buffer, the regeneration buffer, or regen buffer for short.^[4]
   Screen buffers should be distinguished from video memory. To this end,
   the term off-screen buffer is also used.

   The information in the buffer typically consists of color values for
   every pixel to be shown on the display. Color values are commonly
   stored in 1-bit binary (monochrome), 4-bit palettized, 8-bit
   palettized, 16-bit high color and 24-bit true color formats. An
   additional alpha channel is sometimes used to retain information about
   pixel transparency. The total amount of memory required for the
   framebuffer depends on the resolution of the output signal, and on the
   color depth or palette size.
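
   As a rough worked example of that relationship (not tied to any
   particular hardware), the required memory is simply width x height x
   bytes per pixel. The sketch below, in C, computes it for a few
   illustrative resolutions and depths:

   #include <stdint.h>
   #include <stdio.h>

   /* Sketch: framebuffer size = width * height * (bits per pixel / 8).
    * The example resolutions and color depths are illustrative only. */
   static uint64_t fb_bytes(uint32_t w, uint32_t h, uint32_t bpp)
   {
       return (uint64_t)w * h * bpp / 8;
   }

   int main(void)
   {
       /* 640x480 at 8 bpp (palettized):            307,200 bytes */
       printf("%llu\n", (unsigned long long)fb_bytes(640, 480, 8));
       /* 1024x768 at 16 bpp (high color):        1,572,864 bytes */
       printf("%llu\n", (unsigned long long)fb_bytes(1024, 768, 16));
       /* 1920x1080 at 32 bpp (true color+alpha): 8,294,400 bytes */
       printf("%llu\n", (unsigned long long)fb_bytes(1920, 1080, 32));
       return 0;
   }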

Contents

     * 1 History
     * 2 Display modes
     * 3 Color palette
     * 4 Memory access
     * 5 RAM on the video card
     * 6 Virtual framebuffers
     * 7 Page flipping
     * 8 Graphics accelerators
     * 9 Comparisons
     * 10 See also
     * 11 References
     * 12 External links

History

   Memory pattern on SWAC Williams tube CRT in 1951

   Computer researchers^[who?] had long discussed the theoretical
   advantages of a framebuffer, but were unable to produce a machine with
   sufficient memory at an economically practicable cost.^[citation
   needed]^[5] In 1947, the Manchester Baby computer used a Williams tube
   (later known as the Williams-Kilburn tube) to store 1,024 bits in
   cathode-ray tube (CRT) memory, the contents of which could be
   displayed on a second CRT.^[6]^[7] Other research labs were exploring
   these techniques, with MIT Lincoln Laboratory achieving a 4,096-point
   display in 1950.^[5]

   A color scanned display was implemented in the late 1960s, called the
   Brookhaven RAster Display (BRAD), which used a drum memory and a
   television monitor.^[8] In 1969, A. Michael Noll of Bell Labs
   implemented a scanned display with a frame buffer, using magnetic-core
   memory.^[9] Later on, the Bell Labs system was expanded to display an
   image with a color depth of three bits on a standard color TV monitor.

   In the early 1970s, the development of MOS memory
   (metal-oxide-semiconductor memory) integrated-circuit chips,
   particularly high-density DRAM (dynamic random-access memory) chips
   with at least 1 kilobit of memory, made it practical to create, for the first
   time, a digital memory system with framebuffers capable of holding a
   standard video image.^[10]^[11] This led to the development of the
   SuperPaint system by Richard Shoup at Xerox PARC in 1972.^[10] Shoup
   was able to use the SuperPaint framebuffer to create an early digital
   video-capture system. By synchronizing the output signal to the input
   signal, Shoup was able to overwrite each pixel of data as it shifted
   in. Shoup also experimented with modifying the output signal using
   color tables. These color tables allowed the SuperPaint system to
   produce a wide variety of colors outside the range of the limited 8-bit
   data it contained. This scheme would later become commonplace in
   computer framebuffers.

   In 1974, Evans & Sutherland released the first commercial framebuffer,
   the Picture System,^[12] costing about $15,000. It was capable of
   producing resolutions of up to 512 by 512 pixels in 8-bit grayscale,
   and became a boon for graphics researchers who did not have the
   resources to build their own framebuffer. The New York Institute of
   Technology would later create the first 24-bit color system using three
   of the Evans & Sutherland framebuffers.^[13] Each framebuffer was
   connected to an RGB color output (one for red, one for green and one
   for blue), with a Digital Equipment Corporation PDP-11/04 minicomputer
   controlling the three devices as one.

   In 1975, the UK company Quantel produced the first commercial
   full-color broadcast framebuffer, the Quantel DFS 3000. It was first
   used in TV coverage of the 1976 Montreal Olympics to generate a
   picture-in-picture inset of the flaming Olympic torch while the rest
   of the picture showed the runner entering the stadium.

   The rapid improvement of integrated-circuit technology made it possible
   for many of the home computers of the late 1970s to contain
   low-color-depth framebuffers. Today, nearly all computers with
   graphical capabilities utilize a framebuffer for generating the video
   signal. Amiga computers, created in the 1980s, featured special design
   attention to graphics performance and included a unique Hold-And-Modify
   framebuffer capable of displaying 4096 colors.

   Framebuffers also became popular in high-end workstations and arcade
   system boards throughout the 1980s. SGI, Sun Microsystems, HP, DEC and
   IBM all released framebuffers for their workstation computers in this
   period. These framebuffers were usually of a much higher quality than
   could be found in most home computers, and were regularly used in
   television, printing, computer modeling and 3D graphics. Framebuffers
   were also used by Sega for its high-end arcade boards, which were
   likewise of a higher quality than those found in home computers.

Display modes

   A Sun cgsix framebuffer

   Framebuffers used in personal and home computing often had sets of
   defined modes under which the framebuffer could operate. These modes
   reconfigured the hardware to output different resolutions, color
   depths, memory layouts and refresh-rate timings.

   In the world of Unix machines and operating systems, such conveniences
   were usually eschewed in favor of directly manipulating the hardware
   settings. This approach was far more flexible in that any resolution,
   color depth and refresh rate were attainable, limited only by the
   memory available to the framebuffer.

   An unfortunate side-effect of this method was that the display device
   could be driven beyond its capabilities. In some cases, this resulted
   in hardware damage to the display.^[14] More commonly, it simply
   produced garbled and unusable output. Modern CRT monitors fix this
   problem through the introduction of protection circuitry. When the
   display mode is changed, the monitor attempts to obtain a signal lock
   on the new refresh frequency. If the monitor is unable to obtain a
   signal lock, or if the signal is outside the range of its design
   limitations, the monitor will ignore the framebuffer signal and
   possibly present the user with an error message.

   LCD monitors tend to contain similar protection circuitry, but for
   different reasons. Since the LCD must digitally sample the display
   signal (thereby emulating an electron beam), any signal that is out of
   range cannot be physically displayed on the monitor.

Color palette

   Framebuffers have traditionally supported a wide variety of color
   modes. Due to the expense of memory, most early framebuffers used
   1-bit (2 colors), 2-bit (4 colors), 4-bit (16 colors) or 8-bit
   (256 colors) color depths. The problem with such small color depths is
   that a full range of colors cannot be produced. The solution to this
   problem was indexed color, which adds a lookup table to the
   framebuffer. Each value stored in framebuffer memory acts as an index
   into this table; the lookup table serves as a palette containing a
   limited number of distinct colors, while the framebuffer itself holds
   only the index of each pixel's color.

   Here is a typical indexed 256-color image and its palette (shown as a
   rectangle of swatches):

   [Image: adaptive 8-bit palette sample image and its palette]

   In some designs it was also possible to write data to the lookup table
   (or switch between existing palettes) on the fly, allowing the picture
   to be divided into horizontal bars, each with its own palette, and
   thus rendering an image with a far wider effective palette. For
   example, in an outdoor photograph, the picture could be divided into
   four bars: the top one with emphasis on sky tones, the next with
   foliage tones, the next with skin and clothing tones, and the bottom
   one with ground colors. This required each palette to have overlapping
   colors, but, done carefully, allowed great flexibility.
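
   The lookup step itself is small. The following sketch (the type and
   function names are hypothetical, not any particular hardware's
   interface) expands one scanline of 8-bit indices into 24-bit RGB
   through a 256-entry palette; loading a different palette before each
   horizontal band is essentially the bar-splitting trick described
   above:

   #include <stddef.h>
   #include <stdint.h>

   /* Hypothetical sketch of indexed color: each framebuffer byte is an
    * index into a small palette of full RGB entries, so an 8-bit buffer
    * can show any 256 colors chosen from the 24-bit gamut. */
   typedef struct { uint8_t r, g, b; } rgb24;

   void expand_indexed_scanline(const uint8_t *indices,   /* one byte per pixel */
                                const rgb24 palette[256], /* the lookup table   */
                                rgb24 *out, size_t width)
   {
       for (size_t x = 0; x < width; x++)
           out[x] = palette[indices[x]];   /* palette lookup per pixel */
   }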

Memory access

   While framebuffers are commonly accessed via a memory mapping directly
   to the CPU memory space, this is not the only method by which they may
   be accessed. Framebuffers have varied widely in the methods used to
   access memory. Some of the most common are:
     * Mapping the entire framebuffer to a given memory range.
     * Port commands to set each pixel, range of pixels or palette entry.
     * Mapping a memory range smaller than the framebuffer memory, then
       bank switching as necessary.

   The framebuffer organization may be packed pixel or planar. The
   framebuffer may be all points addressable or have restrictions on how
   it can be updated.
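
   As a hedged sketch of the first and third access methods listed
   above, the following shows how a pixel's byte offset is computed in a
   packed-pixel layout, and how that offset splits into a bank number
   and an offset within the mapped window when only a small bank is
   visible at a time (the 64 KiB bank size is a common but purely
   illustrative choice):

   #include <stdint.h>

   /* Sketch of packed-pixel addressing and bank switching.  The 64 KiB
    * bank size is illustrative only; real devices differ. */
   enum { BANK_SIZE = 64 * 1024 };

   struct fb_layout {
       uint32_t pitch;            /* bytes per scanline */
       uint32_t bytes_per_pixel;  /* e.g. 1, 2 or 4     */
   };

   /* Linear byte offset of pixel (x, y) in a packed-pixel framebuffer. */
   static uint32_t pixel_offset(const struct fb_layout *fb,
                                uint32_t x, uint32_t y)
   {
       return y * fb->pitch + x * fb->bytes_per_pixel;
   }

   /* With bank switching, only BANK_SIZE bytes are visible at once, so
    * a write first selects the bank containing the offset and then uses
    * the remainder as the address inside the mapped window. */
   static void split_banked(uint32_t offset, uint32_t *bank,
                            uint32_t *within)
   {
       *bank   = offset / BANK_SIZE;
       *within = offset % BANK_SIZE;
   }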

RAM on the video card

   See also: Video memory

   Video cards always have a certain amount of RAM. A small portion of
   this RAM is where the bitmap of image data is "buffered" for display.
   The term frame buffer is thus often used interchangeably to refer to
   this RAM.

   The CPU sends image updates to the video card. The video processor on
   the card forms a picture of the screen image and stores it in the frame
   buffer as a large bitmap in RAM. The bitmap in RAM is used by the card
   to continually refresh the screen image.^[15]

Virtual framebuffers

   Many systems attempt to emulate the function of a framebuffer device,
   often for reasons of compatibility. The two most common virtual
   framebuffers are the Linux framebuffer device (fbdev) and the X Virtual
   Framebuffer (Xvfb). Xvfb was added to the X Window System distribution
   to provide a method for running X without a graphical framebuffer. The
   Linux framebuffer device was developed to abstract the physical method
   for accessing the underlying framebuffer into a guaranteed memory map
   that is easy for programs to access. This increases portability, as
   programs are not required to deal with systems that have disjointed
   memory maps or require bank switching.
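
   A minimal sketch of what that memory map looks like to a program,
   assuming the usual /dev/fb0 device node and a 32-bits-per-pixel mode,
   with error handling abbreviated: it queries the display geometry with
   the FBIOGET_VSCREENINFO and FBIOGET_FSCREENINFO ioctls, maps the
   buffer, and writes a single pixel.

   #include <fcntl.h>
   #include <linux/fb.h>
   #include <stddef.h>
   #include <stdint.h>
   #include <sys/ioctl.h>
   #include <sys/mman.h>
   #include <unistd.h>

   int main(void)
   {
       int fd = open("/dev/fb0", O_RDWR);   /* device node is an assumption */
       if (fd < 0) return 1;

       struct fb_var_screeninfo var;        /* resolution, color depth      */
       struct fb_fix_screeninfo fix;        /* line length, buffer size     */
       ioctl(fd, FBIOGET_VSCREENINFO, &var);
       ioctl(fd, FBIOGET_FSCREENINFO, &fix);

       uint8_t *fb = mmap(NULL, fix.smem_len, PROT_READ | PROT_WRITE,
                          MAP_SHARED, fd, 0);
       if (fb == MAP_FAILED) { close(fd); return 1; }

       /* Put a white pixel in the middle of the screen (assumes 32 bpp). */
       size_t offset = (var.yres / 2) * fix.line_length
                     + (var.xres / 2) * (var.bits_per_pixel / 8);
       *(uint32_t *)(fb + offset) = 0x00FFFFFF;

       munmap(fb, fix.smem_len);
       close(fd);
       return 0;
   }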

Page flipping

   A frame buffer may be designed with enough memory to store two frames'
   worth of video data. In a technique known generally as double
   buffering or more specifically as page flipping, the framebuffer uses
   half of its memory to display the current frame. While that memory is
   being displayed, the other half of memory is filled with data for the
   next frame. Once the secondary buffer is filled, the framebuffer is
   instructed to display the secondary buffer instead. The primary buffer
   becomes the secondary buffer, and the secondary buffer becomes the
   primary. This switch is often done during the vertical blanking
   interval to avoid screen tearing, where half of the old frame and half
   of the new frame are shown together.
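
   A generic sketch of the idea follows; the "display start" selector is
   modelled as a plain variable here, whereas real hardware exposes it
   as a register (or, for example, through fbdev's panning ioctl):

   #include <stdint.h>

   /* Page-flipping sketch: video memory holds two full frames, and a
    * "display start" selector chooses which frame is scanned out. */
   #define WIDTH  640
   #define HEIGHT 480

   static uint32_t vram[2][WIDTH * HEIGHT]; /* two frames' worth of pixels */
   static int visible = 0;                  /* frame currently scanned out */

   static void draw_frame(uint32_t *buf, uint32_t color)
   {
       for (int i = 0; i < WIDTH * HEIGHT; i++)
           buf[i] = color;                  /* stand-in for real rendering */
   }

   void render_one_frame(uint32_t color)
   {
       int hidden = 1 - visible;            /* the back buffer             */
       draw_frame(vram[hidden], color);     /* draw into the hidden frame  */
       /* wait_for_vertical_blank();           flip during blanking to     */
       visible = hidden;                    /* avoid tearing               */
   }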

   Page flipping has become a standard technique used by PC game
   programmers.

Graphics accelerators

   See also: Video card and Graphics processing unit

   As the demand for better graphics increased, hardware manufacturers
   created a way to decrease the amount of CPU time required to fill the
   framebuffer. This is commonly called graphics acceleration. Common
   graphics drawing commands (many of them geometric) are sent to the
   graphics accelerator in their raw form. The accelerator then rasterizes
   the results of the command to the framebuffer. This method frees the
   CPU to do other work.

   Early accelerators focused on improving the performance of 2D GUI
   systems. While retaining these 2D capabilities, most modern
   accelerators focus on producing 3D imagery in real time. A common
   design uses a graphics library such as OpenGL or Direct3D which
   interfaces with the graphics driver to translate received commands to
   instructions for the accelerator's graphics processing unit (GPU). The
   GPU uses those instructions to compute the rasterized results and the
   results are bit blitted to the framebuffer. The framebuffer's signal is
   then produced in combination with built-in video overlay devices
   (usually used to produce the mouse cursor without modifying the
   framebuffer's data) and any final special effects that are produced by
   modifying the output signal. An example of such final special effects
   was the spatial anti-aliasing technique used by the 3dfx Voodoo cards.
   These cards added a slight blur to the output signal that made
   aliasing of the rasterized graphics much less obvious.

   At one time there were many manufacturers of graphics accelerators,
   including: 3dfx Interactive; ATI; Hercules; Trident; Nvidia; Radius; S3
   Graphics; SiS and Silicon Graphics. As of 2015^[update] the market for
   graphics accelerators for x86-based systems is dominated by Nvidia
   (which acquired 3dfx in 2002), AMD (which acquired ATI in 2006), and
   Intel.

Comparisons

   With a framebuffer, the electron beam (if the display technology uses
   one) is commanded to perform a raster scan, the way a television
   renders a broadcast signal. The color information for each point thus
   displayed on the screen is pulled directly from the framebuffer during
   the scan, creating a set of discrete picture elements, i.e. pixels.

   Framebuffers differ significantly from the vector displays that were
   common prior to the advent of raster graphics (and, consequently, to
   the concept of a framebuffer). With a vector display, only the vertices
   of the graphics primitives are stored. The electron beam of the output
   display is then commanded to move from vertex to vertex, tracing a line
   across the area between these points.

   Likewise, framebuffers differ from the technology used in early text
   mode displays, where a buffer holds codes for characters, not
   individual pixels. The video display device performs the same raster
   scan as with a framebuffer, but generates the pixels of each character
   in the buffer as it directs the beam.

See also

     * Bit plane
     * Scanline rendering
     * Swap chain
     * Tile-based video game
     * Tiled rendering

References

    1. ^ "What is frame buffer? A Webopedia Definition". webopedia.com.
       June 1998.
    2. ^ "Frame Buffer FAQ". Retrieved 14 May 2014.
    3. ^ Mueller, J. (2002). .NET Framework Solutions: In Search of the
       Lost Win32 API. Wiley. p. 160. ISBN 9780782141344. Retrieved
       2015-04-21.
    4. ^ "Smart Computing Dictionary Entry - video buffer". Archived from
       the original on 2012-03-24. Retrieved 2015-04-21.
    5. ^ ^a ^b Gaboury, J. (2018-03-01). "The random-access image: Memory
       and the history of the computer screen". Grey Room. 70 (70): 24-53.
       doi:10.1162/GREY_a_00233. hdl:21.11116/0000-0001-FA73-4.
       ISSN 1526-3819. S2CID 57565564.
    6. ^ Williams, F. C.; Kilburn, T. (March 1949). "A storage system for
       use with binary-digital computing machines". Proceedings of the IEE
       - Part III: Radio and Communication Engineering. 96 (40): 81-.
       doi:10.1049/pi-3.1949.0018.
    7. ^ "Kilburn 1947 Report Cover Notes (Digital 60)".
       curation.cs.manchester.ac.uk. Retrieved 2019-04-26.
    8. ^ D. Ophir; S. Rankowitz; B. J. Shepherd; R. J. Spinrad (June
       1968), "BRAD: The Brookhave Raster Display", Communications of the
       ACM, vol. 11, no. 6, pp. 415-416, doi:10.1145/363347.363385,
       S2CID 11160780
    9. ^ Noll, A. Michael (March 1971). "Scanned-Display Computer
       Graphics". Communications of the ACM. 14 (3): 145-150.
       doi:10.1145/362566.362567. S2CID 2210619.
   10. ^ ^a ^b Richard Shoup (2001). "SuperPaint: An Early Frame Buffer
       Graphics System" (PDF). Annals of the History of Computing. IEEE.
       Archived from the original (PDF) on 2004-06-12.
   11. ^ Goldwasser, S.M. (June 1983). Computer Architecture For
       Interactive Display Of Segmented Imagery. Computer Architectures
       for Spatially Distributed Data. Springer Science & Business Media.
       pp. 75-94 (81). ISBN 9783642821509.
   12. ^ Picture System (PDF), Evans & Sutherland, retrieved 2017-12-31
   13. ^ "History of the New York Institute of Technology Graphics Lab".
       Retrieved 2007-08-31.
   14. ^ "XFree86 Video Timings HOWTO: Overdriving Your Monitor".
       http://tldp.org/HOWTO/XFree86-Video-Timings-HOWTO/overd.html
   15. ^ "An illustrated Guide to the Video Cards". karbosguide.com.

     * Alvy Ray Smith (May 30, 1997). "Digital Paint Systems: Historical
       Overview" (PDF). Microsoft Tech Memo 14. Archived from the
       original (PDF) on February 7, 2012.
     * Wayne Carlson (2003). "Hardware advancements". A Critical History
       of Computer Graphics and Animation. The Ohio State University.
       Archived from the original on 2012-03-14.
     * Alvy Ray Smith (2001). "Digital Paint Systems: An Anecdotal and
       Historical Overview" (PDF). IEEE Annals of the History of
       Computing. Archived from the original (PDF) on 2012-02-05.

External links

     * Interview with NYIT researcher discussing the 24-bit system
     * History of Sun Microsystems' Framebuffers

   Retrieved from
   "https://en.wikipedia.org/w/index.php?title=Framebuffer&oldid=1116846963"

   Categories:
     * Computer graphics
     * Computer memory
     * Image processing
     * User interfaces



     * This page was last edited on 18 October 2022, at 17:02 (UTC).
     * Text is available under the Creative Commons Attribution-ShareAlike
       License 3.0; additional terms may apply. By using this site, you
       agree to the Terms of Use and Privacy Policy. Wikipedia(R) is a
       registered trademark of the Wikimedia Foundation, Inc., a
       non-profit organization.
