[[File:Physically Based Rendering Sample 2.png|thumb|upright=1.2|A [[diamond plate]] texture rendered close-up using physically based rendering principles. [[Specular highlight#Microfacets|Microfacet]] abrasions cover the material, giving it a rough, realistic look even though the material is a [[metal]]. [[Specular highlight]]s are high and realistically modeled at the appropriate edge of the tread using a [[normal mapping|normal map]].]]
 
'''Physically based rendering''' ('''PBR''') is a [[computer graphics]] approach that seeks to [[3D rendering|render]] images in a way that models the behavior of light and surfaces according to real-world [[optics]]. It is often referred to as "Physically Based Lighting" or "Physically Based Shading". Many PBR pipelines aim to achieve [[photorealism]]. Feasible and fast [[approximation]]s of the [[bidirectional reflectance distribution function]] and the [[rendering equation]] are of central mathematical importance in this field. [[Photogrammetry]] may be used to help discover and encode accurate optical properties of materials. PBR principles may be implemented in real-time applications using [[shader]]s, or in offline applications using [[Ray tracing (graphics)|ray tracing]] or [[Path tracing|path tracing]].
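As an illustration of such an approximation, many real-time PBR pipelines evaluate the specular part of the BRDF with a Cook–Torrance-style model that combines a GGX normal distribution, a Smith-style geometry term, and Schlick's approximation of the Fresnel factor. A minimal sketch in Python follows; the function name and scalar parameterization are illustrative and not taken from any particular renderer:

```python
import math

def ggx_specular(n_dot_l, n_dot_v, n_dot_h, v_dot_h, roughness, f0):
    """Sketch of a Cook-Torrance specular BRDF term using a GGX
    normal distribution, a Schlick-GGX (Smith) geometry term, and
    Schlick's Fresnel approximation. All dot products are assumed
    to be positive (light and view above the surface)."""
    a = roughness * roughness
    a2 = a * a
    # GGX / Trowbridge-Reitz normal distribution function D
    d = a2 / (math.pi * (n_dot_h * n_dot_h * (a2 - 1.0) + 1.0) ** 2)
    # Schlick-GGX geometry term G (Smith method, direct-lighting k)
    k = (roughness + 1.0) ** 2 / 8.0
    g = (n_dot_v / (n_dot_v * (1.0 - k) + k)) * \
        (n_dot_l / (n_dot_l * (1.0 - k) + k))
    # Schlick's approximation of the Fresnel reflectance F
    f = f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5
    # Cook-Torrance normalization
    return (d * g * f) / (4.0 * n_dot_v * n_dot_l)
```

In a full renderer this term is combined with a diffuse lobe and evaluated per light; the sketch only shows the shape of the approximation.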
 
==History==
 
Starting in the 1980s, a number of rendering researchers worked on establishing a solid theoretical basis for rendering, including physical correctness. Much of this work was done at the [[Cornell University]] Program of Computer Graphics; a 1997 paper from that lab<ref name=":0">{{cite journal |last1=Greenberg |first1=Donald P. |title=A framework for realistic image synthesis |journal=Communications of the ACM |date=1 August 1999 |volume=42 |issue=8 |pages=44–53 |doi=10.1145/310930.310970 |url=https://www.graphics.cornell.edu/pubs/1997/GTS+97.pdf |url-status=live |archiveurl=https://web.archive.org/web/20180924033321/http://www.graphics.cornell.edu/pubs/1997/GTS+97.pdf |archivedate=24 September 2018 |access-date=27 November 2017 }}</ref> describes the work done at Cornell in this area to that point.
 
"Physically Based Shading" was introduced by [[Yoshiharu Gotanda]] in the course [https://renderwonk.com/publications/s2010-shading-course/ Physically-Based Shading Models in Film and Game Production] at SIGGRAPH 2010, and was followed by the course [https://blog.selfshadow.com/publications/ Physically Based Shading in Theory and Practice], organised by [[Stephen Hill (programmer)|Stephen Hill]] and [[Stephen McAuley]] between 2012 and 2020.
The phrase "Physically Based Rendering" was more widely popularized by [[Matt Pharr]], Greg Humphreys, and [[Pat Hanrahan]] in their book of the same name from 2004, a seminal work in modern computer graphics that won its authors a Technical Achievement [[Academy Award]] for [[special effects]].<ref name=":1">{{Cite book |last=Pharr |first=Matt |title=Physically Based Rendering: From Theory to Implementation |last2=Humphreys |first2=Greg |last3=Hanrahan |first3=Pat |publisher=Morgan Kaufmann |year=2004 |isbn=9780080538969 |edition=1st}}</ref> The book is now in its fourth edition.<ref name=":2">{{Cite book |last=Pharr |first=Matt |title=Physically Based Rendering: From Theory to Implementation |last2=Jakob |first2=Wenzel |last3=Humphreys |first3=Greg |publisher=The MIT Press |year=2023 |isbn=9780262048026 |edition=4th}}</ref>
 
The first successful, yet partial, implementation of physically based rendering in a video game can be found in the 2013 title [[Remember Me (video game)|Remember Me]], which, despite being built on a game engine that did not natively support the technology ([[Unreal Engine#Unreal Engine 3|Unreal Engine 3]]), was modified to accommodate the feature.<ref name=":3" /> Though a moderate approach to PBR, its accuracy was further refined in subsequent titles such as [[Ryse: Son of Rome]] and [[Killzone Shadow Fall]], released the same year, continuing through the PBR advancements of the 2020s.<ref name=":4" /><ref name=":5" />
 
==Process==
* [[OGRE]]
* [[Autodesk Maya|Maya]]
* [[LightWave 3D|LightWave]]
* [[Babylon.js]]
* Bevy
* [[Blender (software)|Blender]]
* [[Cinema 4D]]
* [[Godot (game engine)]]
* [[Houdini (software)|Houdini]] (SideFX)
* [[iClone]]
* [[JMonkey Engine|jME]]
* [[Microstation]]
* [[Rhinoceros 3D]]
* [[Roblox Studio]]
* [[Second Life]]
* [[Sketchfab]]
* [[Stride (game engine)|Stride]]
* [[Three.js]]
* [[Unigine]]
* [[Source 2 (game engine)]]
* [[Unity (game engine)|Unity]]
* [[Unreal Engine]]
* [[VTK]]
* [[Webots]]
A typical application provides an intuitive [[graphical user interface]] that allows artists to define and layer materials with arbitrary properties and to assign them to a given 2D or 3D object, recreating the appearance of any synthetic or organic material. Environments can be defined with procedural shaders or textures, as well as procedural geometry, meshes, or [[point cloud]]s.<ref name=":4">{{Cite web|url=https://help.sketchfab.com/hc/en-us/articles/209143806-Point-Clouds|title=Point Clouds|website=Sketchfab Help Center|language=en-US|access-date=2018-05-29}}</ref> Where possible, all changes are made visible in real time, allowing for quick iteration. Sophisticated applications allow savvy users to write custom shaders in a [[shading language]] such as [[HLSL]] or [[GLSL]]. Increasingly, however, node-based material editors, which offer a graph-based workflow with native support for important concepts such as light position, levels of reflection and emission, and metallicity, along with a wide range of other mathematical and optical functions, are replacing hand-written shaders for all but the most complex applications.
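The metallicity parameter mentioned above is commonly handled with the metallic–roughness workflow, in which the reflectance at normal incidence (often called F0) is blended between a fixed dielectric value of roughly 4% and the surface's base color as metallicity rises. A minimal sketch in Python; the 4% constant is a common convention rather than a universal rule, and the function name is illustrative:

```python
def base_reflectance(albedo, metallic, dielectric_f0=0.04):
    """Blend per-channel reflectance at normal incidence (F0)
    between a fixed dielectric value (~4%) and the albedo color,
    as done in the common metallic-roughness material workflow.
    albedo is an (r, g, b) tuple; metallic is in [0, 1]."""
    return tuple(dielectric_f0 * (1.0 - metallic) + c * metallic
                 for c in albedo)
```

For a pure dielectric (metallic = 0) this yields the fixed 4% reflectance regardless of albedo, while for a pure metal (metallic = 1) the reflectance takes on the tinted albedo color, which is why metals show colored specular highlights.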