{{Short description|Generating images by computer software}}
{{Multiple issues|
{{refimprove|date=July 2007}}
{{confusing|date=June 2008}}
{{update|date=July 2025}}
}}
[[File:Software renderer embedded.gif|thumb|right|Software renderer running on a device without a [[Graphics processing unit|GPU]]]]
'''Software rendering''' is the process of generating an image from a model by means of computer software. In the context of [[rendering (computer graphics)|computer graphics rendering]], software rendering refers to a rendering process that is not dependant upon [[graphics hardware]] [[Application-specific integrated circuit|ASICs]], such as a [[graphics card]]. The rendering takes place entirely in the [[Central processing unit|CPU]]. Rendering everything with the (general-purpose) CPU has the main advantage that it is not restricted to the (limited) capabilities of graphics hardware, but the disadvantage that more semiconductors are needed to obtain the same speed.▼
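The core idea can be illustrated with a short, self-contained C sketch (the image size and output file name are arbitrary choices for this example): the CPU computes every pixel of the output image and writes it into an ordinary memory buffer, with no graphics hardware involved.

<syntaxhighlight lang="c">
#include <stdio.h>
#include <stdlib.h>

/* A software renderer's output is just an array of pixels filled in by the CPU. */
#define WIDTH  256
#define HEIGHT 256

int main(void) {
    unsigned char *framebuffer = malloc(WIDTH * HEIGHT * 3); /* RGB, 8 bits per channel */
    if (!framebuffer) return 1;

    /* "Render" a simple gradient: every pixel value is computed in ordinary C code. */
    for (int y = 0; y < HEIGHT; y++) {
        for (int x = 0; x < WIDTH; x++) {
            unsigned char *p = framebuffer + (y * WIDTH + x) * 3;
            p[0] = (unsigned char)x;   /* red varies horizontally */
            p[1] = (unsigned char)y;   /* green varies vertically */
            p[2] = 128;                /* constant blue           */
        }
    }

    /* Write the buffer out as a binary PPM image; an interactive renderer
       would instead copy the buffer to the screen every frame. */
    FILE *f = fopen("out.ppm", "wb");
    if (!f) { free(framebuffer); return 1; }
    fprintf(f, "P6\n%d %d\n255\n", WIDTH, HEIGHT);
    fwrite(framebuffer, 1, WIDTH * HEIGHT * 3, f);
    fclose(f);
    free(framebuffer);
    return 0;
}
</syntaxhighlight>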
Rendering is used in architecture, simulators, video games, movies and television visual effects and design visualization. Rendering is the last step in an animation process, and gives the final appearance to the models and animation with visual effects such as shading, texture-mapping, shadows, reflections and motion blur.<ref>{{Cite web|url=http://usa.autodesk.com/adsk/servlet/item?id=17940930&siteID=123112|title=LIVE Design - Interactive Visualizations {{!}} Autodesk|access-date=2016-08-20}}</ref> Rendering can be split into two main categories: [[real-time rendering]] (also known as online rendering), and pre-rendering (also called offline rendering). Real-time rendering is used to interactively render a scene, as in [[3D computer game]]s, where each frame must generally be rendered within a few milliseconds. Offline rendering is used to create realistic images and movies, where each frame can take hours or days to complete, or for debugging of complex graphics code by programmers.
==Real-time software rendering==
For real-time rendering the focus is on performance. The earliest texture mapped real-time software renderers for PCs used many tricks to create the illusion of 3D geometry ([[true 3D]] was limited to flat or [[Gouraud shading|Gouraud-shaded]] [[polygon]]s employed mainly in [[flight simulator]]s.) ''[[Ultima Underworld]]'', for example, allowed a limited form of looking up and down, slanted floors, and rooms over rooms, but resorted to [[sprite (computer graphics)|sprites]] for all detailed objects. The technology used in these games is currently categorized as [[2.5D]].
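The kind of trick these 2.5D engines relied on can be sketched with the classic raycasting projection below. This is an illustrative example only, not the specific technique of any particular game; the map, resolution, field of view, and the abstract <code>drawWallSlice</code> callback are all assumptions made for the sketch.

<syntaxhighlight lang="c">
#include <math.h>

#define SCREEN_W 320
#define SCREEN_H 200
#define MAP_SIZE 8

/* Hard-coded 2D map: 1 = wall, 0 = empty (purely illustrative). */
static const int map[MAP_SIZE][MAP_SIZE] = {
    {1,1,1,1,1,1,1,1},
    {1,0,0,0,0,0,0,1},
    {1,0,0,0,0,1,0,1},
    {1,0,0,0,0,1,0,1},
    {1,0,1,1,0,0,0,1},
    {1,0,0,0,0,0,0,1},
    {1,0,0,0,0,0,0,1},
    {1,1,1,1,1,1,1,1},
};

/* March along the ray in small steps until a wall cell (or the map edge) is hit. */
static double castRay(double px, double py, double angle) {
    double dx = cos(angle), dy = sin(angle), dist = 0.0;
    while (dist < MAP_SIZE) {
        int mx = (int)(px + dx * dist), my = (int)(py + dy * dist);
        if (mx < 0 || my < 0 || mx >= MAP_SIZE || my >= MAP_SIZE || map[my][mx]) break;
        dist += 0.01;
    }
    return dist;
}

/* For each screen column, project the 2D wall distance into a vertical slice:
   nearer walls produce taller slices, giving the illusion of 3D from a 2D map. */
void renderColumns(double px, double py, double playerAngle,
                   void (*drawWallSlice)(int column, int top, int bottom)) {
    const double fov = M_PI / 3.0;                    /* 60 degree field of view */
    for (int col = 0; col < SCREEN_W; col++) {
        double rayAngle = playerAngle + (col - SCREEN_W / 2.0) / SCREEN_W * fov;
        double dist = castRay(px, py, rayAngle);
        dist *= cos(rayAngle - playerAngle);          /* remove fisheye distortion */
        int sliceHeight = (int)(SCREEN_H / (dist + 1e-6));
        if (sliceHeight > SCREEN_H) sliceHeight = SCREEN_H;
        int top = (SCREEN_H - sliceHeight) / 2;
        drawWallSlice(col, top, top + sliceHeight);
    }
}
</syntaxhighlight>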
One of the first games architecturally similar to modern 3D titles, allowing full [[six degrees of freedom|6DoF]], was ''[[Descent (video game)|Descent]]'', which featured [[3D model]]s entirely made from bitmap [[texture mapping|textured]] triangular polygons. [[Voxel]]-based graphics also gained popularity for fast and relatively detailed terrain rendering, as in ''[[Delta Force (video game)|Delta Force]]'', but popular [[fixed-function]] hardware eventually made its use impossible. ''[[Quake (video game)|Quake]]'' features an efficient software renderer by [[Michael Abrash]] and [[John D. Carmack|John Carmack]]. With its popularity, ''Quake'' and other polygonal 3D games of that time helped the sales of [[Video card|graphics cards]], and more games started using hardware [[API]]s like [[DirectX]] and [[OpenGL]]. Though software rendering fell off as a primary rendering technology, many games well into the 2000s still had a software renderer as a fallback; ''[[Unreal (1998 video game)|Unreal]]'' and ''[[Unreal Tournament]]'', for instance, feature software renderers able to produce enjoyable quality and performance on CPUs of that period.
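What distinguishes such true-3D engines from 2.5D tricks is that every vertex of every polygon undergoes a perspective projection. A minimal sketch of that step is shown below; the structure names and focal-length convention are illustrative and not taken from ''Descent'', ''Quake'', or any other engine.

<syntaxhighlight lang="c">
/* Perspective projection of a camera-space vertex onto the screen. A software
   renderer performs this for the three vertices of each triangle and then
   rasterizes and texture-maps the resulting 2D triangle on the CPU. */

typedef struct { float x, y, z; } Vec3;
typedef struct { float x, y;    } Vec2;

Vec2 projectVertex(Vec3 v, float focalLength, int width, int height) {
    Vec2 s;
    /* The perspective divide: points farther away (larger z) move toward the
       screen centre, which is what makes the geometry look three-dimensional. */
    s.x = width  * 0.5f + focalLength * v.x / v.z;
    s.y = height * 0.5f - focalLength * v.y / v.z;
    return s;
}
</syntaxhighlight>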
In the [[video game console]] and [[arcade game]] markets, the evolution of 3D was more abrupt, as they had always relied heavily on single-purpose chipsets. 16-bit consoles gained RISC accelerator cartridges in games such as ''[[Star Fox (1993 video game)|Star Fox]]'' and ''[[Virtua Racing]]'', which implemented software rendering through tailored instruction sets. The [[Atari Jaguar|Jaguar]] and [[3DO Interactive Multiplayer|3DO]] were the first consoles to ship with 3D hardware, but it was not until the [[PlayStation (console)|PlayStation]] that such features came to be used in most games.
Games for children and casual gamers (who use outdated systems or systems primarily meant for office applications) during the late 1990s to early 2000s typically used a software renderer as a fallback. For example, ''[[Toy Story 2: Buzz Lightyear to the Rescue]]'' offers a choice between hardware and software rendering before the game starts, while others like ''[[Half-Life_(video_game)|Half-Life]]'' default to software mode and can be adjusted to use OpenGL or DirectX in the Options menu. Some 3D modeling software also features software renderers for visualization.
Even for high-end graphics, the "art" of software rendering has not completely died out. While early graphics cards were much faster than software renderers, and originally had better quality and more features, they restricted the developer to fixed-function pixel processing. A need soon arose to diversify the look of games. Software rendering has no such restrictions, because an arbitrary program is executed. Graphics cards therefore reintroduced this programmability by executing small programs per [[vertex (geometry)|vertex]] and per [[pixel]]/[[fragment (computer graphics)|fragment]], also known as [[shaders]]. Shader languages, such as [[High Level Shader Language]] (HLSL) for DirectX or the [[OpenGL Shading Language]] (GLSL), are [[C (programming language)|C]]-like programming languages for shaders and have come to show some resemblance to arbitrary-function software rendering.
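In a software renderer, the equivalent of a pixel (fragment) shader is simply an ordinary function invoked for every pixel. The sketch below uses hypothetical names to illustrate the parallel with GPU shaders; it is not part of any real API.

<syntaxhighlight lang="c">
#include <stdint.h>

/* A software "pixel shader" is just a function pointer called once per pixel. */
typedef uint32_t (*PixelShader)(float u, float v);   /* takes coordinates, returns ARGB */

/* Example shader: a procedural checkerboard pattern. */
static uint32_t checkerShader(float u, float v) {
    int check = ((int)(u * 8) + (int)(v * 8)) & 1;
    return check ? 0xFFFFFFFFu : 0xFF202020u;
}

/* Run the shader over every pixel of a buffer, analogous to what a GPU does
   per fragment during rasterization. */
void shadeBuffer(uint32_t *pixels, int width, int height, PixelShader shader) {
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
            pixels[y * width + x] = shader((float)x / width, (float)y / height);
}
</syntaxhighlight>

A call such as <code>shadeBuffer(buffer, 640, 480, checkerShader)</code> would fill the buffer with the pattern; swapping in a different function changes the look of every pixel, which is exactly the flexibility that hardware shaders later reintroduced.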
Since the adoption of graphics hardware as the primary means for real-time rendering, CPU performance has continued to grow steadily. This has allowed new software rendering technologies to emerge. Although largely overshadowed by the performance of hardware rendering, some modern real-time software renderers manage to combine a broad feature set and reasonable performance (for a software renderer) by making use of specialized [[dynamic compilation]] and advanced instruction set extensions like [[Streaming SIMD Extensions|SSE]]. Although the dominance of hardware rendering over software rendering is nowadays undisputed because of unparalleled performance, features, and continuing innovation, some believe that CPUs and [[GPU]]s will converge one way or another and that the line between software and hardware rendering will fade.<ref>{{Cite web|last=Valich|first=Theo|date=11 March 2008|title=Tim Sweeney, Part 2: "DirectX 10 is the last relevant graphics API" {{!}} TG Daily|url=https://www.tgdaily.com/business/tim-sweeney-part-2-directx-10-is-the-last-relevant-graphics-api/|url-status=live|archive-url=https://web.archive.org/web/20160304145146/http://www.tgdaily.com/business-and-law-features/36410-tim-sweeney-part-2-directx-10-is-the-last-relevant-graphics-api|archive-date=4 March 2016}}</ref>
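The following sketch shows, in generic terms, how SIMD extensions such as SSE let a software renderer process several pixels per instruction. It is an illustration of the idea, not code taken from any particular renderer; here sixteen 8-bit colour channels (four RGBA pixels) are brightened with a single saturating-add instruction.

<syntaxhighlight lang="c">
#include <emmintrin.h>   /* SSE2 intrinsics */
#include <stddef.h>
#include <stdint.h>

/* Brighten an 8-bit-per-channel pixel buffer; 'count' is the number of bytes. */
void brighten_sse2(uint8_t *pixels, size_t count, uint8_t amount) {
    __m128i add = _mm_set1_epi8((char)amount);
    size_t i = 0;
    for (; i + 16 <= count; i += 16) {
        __m128i px = _mm_loadu_si128((const __m128i *)(pixels + i));
        px = _mm_adds_epu8(px, add);                 /* saturating add, 16 bytes at once */
        _mm_storeu_si128((__m128i *)(pixels + i), px);
    }
    for (; i < count; i++) {                         /* scalar tail for leftover bytes */
        unsigned v = pixels[i] + amount;
        pixels[i] = (uint8_t)(v > 255 ? 255 : v);
    }
}
</syntaxhighlight>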
===Software fallback===
* [[RAD Game Tools]]' Pixomatic, sold as middleware intended for static linking inside D3D 7–9 client software.
* [[TransGaming Inc.#SwiftShader|SwiftShader]], a library sold as middleware intended for bundling with D3D9 & OpenGL ES 2 client software.
* The swrast, softpipe, & LLVMpipe renderers inside [[Mesa (computer graphics)|Mesa]] work as a shim at the system level to emulate an OpenGL 1.4–3.2 hardware device; a way to force this path is sketched after this list. The lavapipe renderer also featured in Mesa provides software rendering for the [[Vulkan]] API.
* [[Windows Advanced Rasterization Platform|WARP]], provided since Windows Vista by Microsoft, which works at the system level to provide fast D3D 9.1 and above emulation. This is in addition to the extremely slow software-based reference rasterizer Microsoft has always provided to developers.
* The Apple software renderer in [[Core OpenGL|CGL]], provided in Mac OS X by Apple, which works at the system level to provide fast OpenGL 1.1–4.1 emulation.
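On systems using Mesa, the software rasterizers listed above can typically be forced, even when a working GPU driver is present, by setting the <code>LIBGL_ALWAYS_SOFTWARE</code> environment variable before the OpenGL context is created. A minimal C sketch is shown below; the surrounding window and context creation code (for example with SDL or GLFW) is omitted, and <code>setenv</code> is the POSIX call.

<syntaxhighlight lang="c">
#include <stdlib.h>

int main(void) {
    /* Ask Mesa to use a software rasterizer (e.g. LLVMpipe) instead of a GPU
       driver. This must be set before the OpenGL context is created, and has
       no effect on non-Mesa drivers. */
    setenv("LIBGL_ALWAYS_SOFTWARE", "1", 1);

    /* ... create a window and OpenGL context as usual; rendering now runs on
       the CPU through Mesa's software path ... */
    return 0;
}
</syntaxhighlight>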
==Pre-rendering==
In contrast to real-time rendering, performance is only a secondary priority with pre-rendering. It is used mainly in the film industry to create high-quality renderings of lifelike scenes. Many [[special effects]] in today's movies are entirely or partially created by computer graphics. For example, the character of [[Gollum]] in the [[Peter Jackson]] ''[[The Lord of the Rings (film series)|The Lord of the Rings]]'' films is made completely of [[computer-generated imagery]] (CGI). CGI has also become popular for [[animation|animated]] films; most notably, [[Pixar]] has produced a series of movies such as ''[[Toy Story]]'' and ''[[Finding Nemo]]'', and the [[Blender Foundation]] the world's first [[Blender (software)#Open projects|open movie]], ''[[Elephants Dream]]''.
Because of the need for very high quality and a diversity of effects, offline rendering requires a lot of flexibility. Even though commercial real-time graphics hardware has become increasingly programmable, film-quality renderers have traditionally run entirely in software on general-purpose CPUs, where the length and complexity of shaders and scene data are not limited by the capabilities of a graphics chip.
==See also==
* [[3D computer graphics]]
* [[Headless software]]
* [[Z-buffering]]
==References==
{{Reflist}}
{{DEFAULTSORT:Software Rendering}}
{{computer graphics}}
[[Category:3D rendering]]