Software rendering: Difference between revisions

{{refimprove|date=July 2007}}
{{confusing|date=June 2008}}
{{update|date=July 2025}}
}}
 
But even for high-end graphics, the 'art' of software rendering has not completely died out. Although early graphics cards were much faster than software renderers, and originally had better quality and more features, they restricted the developer to 'fixed-function' pixel processing. A need quickly arose to diversify the look of games. Software rendering has no such restrictions, because an arbitrary program is executed. Graphics cards therefore reintroduced this programmability, by executing small programs per [[vertex (geometry)|vertex]] and per [[pixel]]/[[fragment (computer graphics)|fragment]], known as [[shaders]]. Shader languages, such as [[High Level Shader Language]] (HLSL) for DirectX or the [[OpenGL Shading Language]] (GLSL), are [[C (programming language)|C]]-like programming languages for shaders and begin to resemble (arbitrary-function) software rendering.
 
Since the adoption of graphics hardware as the primary means for real-time rendering, CPU performance has continued to grow steadily. This has allowed new software rendering technologies to emerge. Although largely overshadowed by the performance of hardware rendering, some modern real-time software renderers combine a broad feature set with reasonable performance (for a software renderer) by making use of specialized [[dynamic compilation]] and advanced instruction set extensions such as [[Streaming SIMD Extensions|SSE]]. Although the dominance of hardware rendering over software rendering is today undisputed because of its unparalleled performance, features, and continuing innovation, some believe that CPUs and [[GPU]]s will converge one way or another and that the line between software and hardware rendering will fade.<ref>{{Cite web|last=Valich|first=Theo|date=11 March 2008|title=Tim Sweeney, Part 2: "DirectX 10 is the last relevant graphics API" {{!}} TG Daily|url=https://www.tgdaily.com/business/tim-sweeney-part-2-directx-10-is-the-last-relevant-graphics-api/|url-status=live|archive-url=https://web.archive.org/web/20160304145146/http://www.tgdaily.com/business-and-law-features/36410-tim-sweeney-part-2-directx-10-is-the-last-relevant-graphics-api|archive-date=March 4, 2016|access-date=2016-11-07|website=TG Daily}}</ref>
 
===Software fallback===
 
==Pre-rendering==
In contrast to real-time rendering, performance is only a secondary concern in pre-rendering. It is used mainly in the film industry to create high-quality renderings of lifelike scenes. Many [[special effects]] in today's films are created entirely or partially with computer graphics. For example, the character of [[Gollum]] in [[Peter Jackson]]'s ''[[The Lord of the Rings (film series)|The Lord of the Rings]]'' films consists entirely of [[computer-generated imagery]] (CGI). CGI is also increasingly popular in [[animation|animated]] films: most notably, [[Pixar]] has produced a series of films such as ''[[Toy Story]]'' and ''[[Finding Nemo]]'', and the [[Blender Foundation]] the world's first [[Blender (software)#Open projects|open movie]], ''[[Elephants Dream]]''.
 
Because of the need for very high quality and a diversity of effects, offline rendering requires a great deal of flexibility. Even though commercial real-time graphics hardware is becoming more capable and more programmable by the day, most [[photorealistic rendering|photorealistic]] CGI still requires software rendering. Pixar's [[RenderMan (software)|RenderMan]], for example, allows shaders of unlimited length and complexity, demanding a general-purpose processor. Older hardware is also incapable of high-realism techniques such as [[Ray tracing (graphics)|ray tracing]] and [[global illumination]].