History of computer animation: Difference between revisions

 
===The University of Utah===
[[University of Utah|Utah]] was a major center for computer animation in this period. The computer science faculty was founded by [[David C. Evans (computer scientist)|David Evans]] in 1965, and many of the basic techniques of 3-D computer graphics were developed here in the early 1970s with [[DARPA|ARPA]] funding (''Advanced Research Projects Agency''). Research results included Gouraud, Phong, and Blinn shading, texture mapping, [[hidden-surface determination|hidden surface]] algorithms, curved [[subdivision surface|surface subdivision]], real-time line-drawing and raster image display hardware, and early virtual reality work.<ref>[http://www.cs.utah.edu/gdc/history/ Utah – Computer Graphics history] (retrieved 2012/04/22)</ref> In the words of Robert Rivlin in his 1986 book ''The Algorithmic Image: Graphic Visions of the Computer Age'', "almost every influential person in the modern computer-graphics community either passed through the University of Utah or came into contact with it in some way".<ref>The algorithmic image: graphic visions of the computer age, ''Harper & Row Publishers, Inc.'' New York, NY, USA 1986. {{ISBN|0914845802}}</ref>
 
==== Shaded 3D graphics ====
[[File:1967 512x512 Cube Rendering at Univ of Utah.png|thumb|An image of a cube generated at the University of Utah in 1967]]
In the mid-1960s, one of the most difficult problems in computer graphics was the [[Hidden-line removal|"hidden-line" problem]] – how to render a 3D model while properly removing the lines that should not be visible to the observer.<ref>{{Cite book |url=https://bitsavers.org/magazines/Datamation/196605.pdf |title=Datamation |date=May 1966 |pages=22–29}}</ref> One of the first successful approaches to this was published at the 1967 [[Fall Joint Computer Conference]] by Chris Wylie, Gordon Romney, David Evans, and Alan Erdahl, and demonstrated shaded 3D objects such as cubes and [[Tetrahedron|tetrahedra]].<ref>{{Cite book |last1=Wylie |first1=Chris |last2=Romney |first2=Gordon |last3=Evans |first3=David |last4=Erdahl |first4=Alan |chapter=Half-tone perspective drawings by computer |date=1967-11-14 |title=Proceedings of the November 14-16, 1967, fall joint computer conference on - AFIPS '67 (Fall) |chapter-url=https://dl.acm.org/doi/10.1145/1465611.1465619 |___location=New York, NY, USA |publisher=Association for Computing Machinery |pages=49–58 |doi=10.1145/1465611.1465619 |isbn=978-1-4503-7896-3}}</ref> An improved version of this algorithm was demonstrated in 1968, including shaded renderings of 3D text, spheres, and buildings.<ref>{{Citation |last1=Romney |first1=Gordon W. |title=Real-time display of computer generated half-tone perspective pictures |date=1998-07-01 |work=Seminal graphics: pioneering efforts that shaped the field, Volume 1 |volume=1 |pages=283–288 |url=https://dl.acm.org/doi/10.1145/280811.281011 |access-date= |place=New York, NY, USA |publisher=Association for Computing Machinery |doi=10.1145/280811.281011 |isbn=978-1-58113-052-2 |last2=Watkins |first2=Gary S. |last3=Evans |first3=David C.}}</ref>
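The 1967 scanline method itself was more elaborate, but the visibility question it tackled (keep, for every pixel, only the surface nearest the viewer) can be sketched with a generic depth-buffer loop. This is an illustrative reconstruction in Python, not Wylie et al.'s actual algorithm:

```python
# Generic depth-buffer sketch: resolve visibility per pixel by keeping the
# nearest surface.  Illustrative only; not the 1967 scanline algorithm.

def render_depth_buffer(width, height, surfaces):
    """surfaces: list of (depth_fn, color) pairs, where depth_fn(x, y)
    returns the surface's depth at that pixel, or None if it does not
    cover the pixel."""
    far = float("inf")
    zbuf = [[far] * width for _ in range(height)]
    image = [[None] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            for depth_fn, color in surfaces:
                z = depth_fn(x, y)
                if z is not None and z < zbuf[y][x]:  # nearer surface wins
                    zbuf[y][x] = z
                    image[y][x] = color
    return image
```

Because each pixel is resolved independently of its neighbours, the approach trades memory for simplicity compared with line-based hidden-line removal.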
 
A shaded 3D computer animation of a colored [[Soma cube]] exploding into pieces was created at the University of Utah as part of Gordon Romney's 1969 PhD dissertation, along with shaded renderings of 3D text, 3D graphs, trucks, ships, and buildings.<ref>{{Cite book |last=Gordon W. Romney |url=https://archive.org/details/computerassisted0000unse_p7o2 |title=Computer Assisted Assembly and Rendering of Solids |date=August 1969 |publisher=University of Utah, Computer Science Dept. |others=Internet Archive}}</ref> The dissertation also coined the term "rendering" in reference to computer drawings of 3D objects. Another 3D shading algorithm was implemented by [[John Warnock]] for his 1969 dissertation.<ref>{{Cite thesis |last=Warnock |first=John Edward |title=A hidden surface algorithm for computer generated halftone pictures |date=June 1969 |degree=PhD |publisher=The University of Utah |url=https://dl.acm.org/doi/book/10.5555/905316 |doi=}}</ref>
[[File:1970 Church Rendering by Watkins at Univ of Utah.png|thumb|A color image of a church generated by the Watkins algorithm at the University of Utah in 1970]]
A real-time shading algorithm was developed by Gary Watkins for his 1970 PhD dissertation, and was the basis of the [[Gouraud shading]] technique, developed the following year.<ref>{{Cite book |last=Watkins |first=Gary |url=https://bitsavers.org/pdf/univOfUtah/UTECH-CSc-70-101_Watkins_Dissertation_Jun70.pdf |title=A real-time visible surface algorithm |date=June 1970 |publisher=The University of Utah}}</ref><ref>{{Cite thesis |last=Gouraud |first=Henri |title=Computer display of curved surfaces |date=1971 |degree=PhD |publisher=The University of Utah |url=https://dl.acm.org/doi/book/10.5555/905323 |doi=}}</ref> Robert Mahl's 1970 dissertation at the University of Utah described smooth shading of [[quadric surface]]s.<ref>{{Cite thesis |last=Mahl |first=Robert |url=https://collections.lib.utah.edu/details?id=704102 |title=Visible surface algorithms for quadric patches |date=December 1970 |publisher=The University of Utah}}</ref>
 
Further innovations in shaded 3D graphics at the University of Utah included a more realistic shading technique by [[Bui Tuong Phong]] for his dissertation in 1973 and texture mapping by [[Edwin Catmull]] for his 1974 dissertation.<ref>{{Cite book |last=Phong |first=Bui Tuong |url=https://collections.lib.utah.edu/details?id=712686 |title=Illumination of computer generated images |date=July 1973 |publisher=The University of Utah}}</ref><ref>{{Cite thesis |last=Catmull |first=Edwin Earl |url=https://collections.lib.utah.edu/details?id=2111909 |title=A subdivision algorithm for computer display of curved surfaces |date=December 1974 |publisher=The University of Utah}}</ref>
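Phong's model adds a specular highlight term to simple diffuse shading. A minimal per-point sketch in Python (the coefficients and exponent below are illustrative choices, not values from the dissertation):

```python
def phong_intensity(normal, light_dir, view_dir, shininess=32, kd=0.7, ks=0.3):
    """Phong reflection at one surface point: diffuse plus specular.
    All direction arguments are unit-length 3-tuples pointing away from
    the surface; kd and ks are illustrative material coefficients."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    n_dot_l = dot(normal, light_dir)
    diffuse = max(n_dot_l, 0.0)
    # Mirror the light direction about the normal: r = 2(n.l)n - l
    reflect = tuple(2 * n_dot_l * n - l for n, l in zip(normal, light_dir))
    specular = max(dot(reflect, view_dir), 0.0) ** shininess
    return kd * diffuse + ks * specular
```

The `shininess` exponent controls how tightly the highlight is focused; larger values give the small, sharp highlights characteristic of polished surfaces.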
 
==== Virtual reality ====
Around 1972, a [[virtual reality headset]] known as the "Sorcerer's Apprentice" became operational at the University of Utah, which used [[head tracking]] and a device similar to [[Massachusetts Institute of Technology|MIT]]'s Lincoln Wand to track the user's hand in 3D space.<ref>{{Cite thesis |url=https://collections.lib.utah.edu/details?id=706529 |title=Graphical man/machine communications |date=December 1972 |publisher=The University of Utah |last1=Evans |first1=David }}</ref> This headset, like Ivan Sutherland's [[The Sword of Damocles (virtual reality)|"Sword of Damocles"]], was capable of simple, unshaded [[Wire-frame model|wireframe]] 3D graphics; however, the Sorcerer's Apprentice added the capability to create and manipulate 3D objects in real-time through the hand tracking device, termed the "wand". Commands to be performed by the 3D wand could be chosen by pointing the wand at a physical wall chart.<ref>{{Cite thesis |last=Vickers |first=Donald Lee |url=https://collections.lib.utah.edu/details?id=705942 |title=Sorcerer's apprentice: head-mounted display and wand |date=July 1974 |publisher=The University of Utah}}</ref>
 
==== Character rigging and keyframing ====
An important innovation in computer animation at the University of Utah was the program "KEYFRAME", which allowed a user to pose and [[Key frame|keyframe]] a [[Character rigging|rigged]] humanoid 3D character, create [[walk cycle]]s and other movements, and [[Lip sync|lip-sync]] the character, all through a [[Computer mouse|mouse]]-based [[Graphical user interface|graphical interface]], and then render a shaded animation of the character performing the chosen walk cycle, hand movement, or other animation. This program, as well as one for creating a 3D animation of a football match, was created by Barry Wessler for his 1973 PhD dissertation.<ref>{{Cite book |last=Wessler |first=Barry David |url=https://collections.lib.utah.edu/details?id=712684 |title=Computer-assisted visual communication |date=July 1973 |publisher=The University of Utah}}</ref> The capabilities of "KEYFRAME" were demonstrated in a short film, ''Not Just Reality'', which featured walk cycles, lip syncing, facial expressions, and further movement of a shaded humanoid 3D character.<ref>{{Cite AV media |url=https://www.youtube.com/watch?v=0sl72MD6Ycc |title=Not Just Reality |date=2023-03-19 |last=jellyvista |access-date=2025-01-06 |via=YouTube}}</ref>
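At its core, keyframing generates the inbetween poses by interpolating between stored key poses. A minimal linear sketch in Python (the joint names are hypothetical, and Wessler's program offered far richer controls than this):

```python
def interpolate_pose(key_a, key_b, t):
    """Linearly blend two keyframe poses, each a dict of joint name ->
    angle in degrees.  t = 0.0 gives key_a, t = 1.0 gives key_b."""
    return {joint: (1 - t) * key_a[joint] + t * key_b[joint]
            for joint in key_a}

# Two hypothetical keyframes of a walk cycle, two joints each:
start = {"hip": 0.0, "knee": 10.0}
end = {"hip": 30.0, "knee": 50.0}
midway = interpolate_pose(start, end, 0.5)  # halfway pose between the keys
```

Rendering one interpolated pose per frame turns a handful of artist-posed keys into a continuous motion.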
 
===Evans and Sutherland===
 
===National Film Board of Canada===
The [[National Film Board of Canada]], already a world center for animation art, also began experimenting with computer techniques in 1969.<ref>"Retired NRC Scientists Burtnyk and Wein honoured as Fathers of Computer Animation Technology in Canada". ''Sphere'' (National Research Council of Canada) 4. 1996. (Retrieved April 20, 2011).</ref> The best known of these early pioneers was artist [[Peter Foldes]], who completed ''Metadata'' in 1971. This film comprised drawings animated by gradually changing from one image to the next, a technique known as "interpolating" (also known as "inbetweening" or "morphing"), which had also featured in a number of earlier art examples during the 1960s.<ref name="NFBC-NRC">From [http://design.osu.edu/carlson/history/tree/nfbc.html "The Film Animator Today: Artists Without A Canvas"] {{Webarchive|url=https://web.archive.org/web/20120402221929/http://design.osu.edu/carlson/history/tree/nfbc.html |date=April 2, 2012 }} (retrieved April 22, 2012)</ref> In 1974, Foldes completed ''[[Hunger (1974 film)|Hunger / La Faim]]'', one of the first films to show solid filled (raster scanned) rendering; it was awarded the Jury Prize in the short film category at the [[1974 Cannes Film Festival]] and received an Academy Award nomination. Foldes and the National Film Board of Canada employed pioneering keyframe computer technology developed at the [[National Research Council Canada|National Research Council]] of Canada (NRC) by scientist Nestor Burtnyk in 1969.
Burtnyk and his collaborator Marceli Wein received the Academy Award in 1997 in recognition of their role in the field.<ref>{{Cite news |last=Deachman |first=Bruce |date=August 31, 2018 |title=And the Oscar goes to...: Ottawa scientists were pioneers in animation technology |url=https://ottawacitizen.com/news/local-news/and-the-oscar-goes-to-ottawa-scientists-were-pioneers-in-animation-technology |access-date=April 20, 2025 |work=Ottawa Citizen}}</ref> The NRC team also contributed high-profile animation sequences to the celebrated BBC documentary series ''The Ascent of Man'' (1973).<ref>{{Cite web |last=National Research Council staff |date=October 20, 2015 |title=Computer Animation - An Oscar Winning Performance |url=https://ingeniumcanada.org/channel/innovation/computer-animation-oscar-winning-performance |access-date=April 20, 2025 |website=Ingenium Channel}}</ref>
 
===Atlas Computer Laboratory and Antics===
The first use of 3-D wireframe imagery in mainstream cinema was in the sequel to ''Westworld'', ''[[Futureworld]]'' (1976), directed by Richard T. Heffron. This featured a computer-generated hand and face created by University of Utah graduate students [[Edwin Catmull]] and [[Fred Parke]], which had initially appeared in their 1972 experimental short ''[[A Computer Animated Hand]]''.<ref name="sltrib">{{cite news|url=http://www.sltrib.com/sltrib/mobile/53193670-90/film-catmull-computer-animation.html.csp|title=Pixar founder's Utah-made ''Hand'' added to National Film Registry|work=[[The Salt Lake Tribune]]|date=December 28, 2011|access-date=January 8, 2012}}</ref> The same film also featured snippets from the 1974 experimental short ''Faces and Body Parts''. The [[Academy Awards|Academy Award]]-winning 1975 short animated film ''[[Great (1975 film)|Great]]'', about the life of the [[Victorian era|Victorian]] engineer [[Isambard Kingdom Brunel]], contains a brief sequence of a rotating wireframe model of Brunel's final project, the iron steam ship [[SS Great Eastern]]. The third film to use this technology was ''[[Star Wars (film)|Star Wars]]'' (1977), written and directed by [[George Lucas]], with wireframe imagery in the scenes with the Death Star plans, the targeting computers in the [[X-wing]] fighters, and the ''[[Millennium Falcon]]'' spacecraft.
 
The [[Walt Disney Productions|Walt Disney]] film ''[[The Black Hole (1979 film)|The Black Hole]]'' (1979, directed by Gary Nelson) used wireframe rendering to depict the titular black hole, using equipment from Disney's engineers. In the same year, the science-fiction horror film ''[[Alien (film)|Alien]]'', directed by [[Ridley Scott]], also used wire-frame model graphics, in this case to render the navigation monitors in the spaceship. The footage was produced by Colin Emmett at the Atlas Computer Laboratory.<ref>[http://www.chilton-computing.org.uk/acl/applications/animation/p014.htm "My Work on the Alien", Bryan Wyvill] (retrieved June 30, 2012)</ref>
 
===Nelson Max===
 
===JPL and Jim Blinn===
Bob Holzman of [[NASA]]'s [[Jet Propulsion Laboratory]] in California established JPL's Computer Graphics Lab in 1977 as a group with technology expertise in visualizing data returned from NASA missions. On the advice of Ivan Sutherland, Holzman hired a graduate student from Utah named [[Jim Blinn]].<ref>{{Cite journal |last=Holzman |first=Robert E. |date=1986-07-01 |title=Atoms to astronomy: Computer graphics at the Jet Propulsion Laboratory |url=https://doi.org/10.1007/BF01900326 |journal=The Visual Computer |language=en |volume=2 |issue=3 |pages=159–163 |doi=10.1007/BF01900326 |s2cid=2265857 |issn=1432-2315|url-access=subscription }}</ref><ref>Sutherland once allegedly commented that "There are about a dozen great computer graphics people, and Jim Blinn is six of them."</ref> Blinn had worked with imaging techniques at Utah, and developed them into a system for NASA's visualization tasks. He produced a series of widely seen "fly-by" simulations, including the [[Voyager program|Voyager]], [[Pioneer program|Pioneer]] and [[Galileo (spacecraft)|Galileo]] spacecraft fly-bys of Jupiter, Saturn and their moons. He also worked with [[Carl Sagan]], creating animations for his ''[[Cosmos: A Personal Voyage]]'' TV series. Blinn developed many influential new modelling techniques, and wrote papers on them for the [[IEEE]] (Institute of Electrical and Electronics Engineers), in its journal ''Computer Graphics and Applications''. These included environment mapping, improved highlight modelling, "blobby" modelling, simulation of wrinkled surfaces, and simulation of clouds and dusty surfaces.
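One of those contributions, Blinn's improved highlight model, replaced Phong's mirror-reflection vector with a "half-vector" midway between the light and view directions. A minimal sketch of just that specular term in Python (the exponent is an illustrative choice):

```python
import math

def blinn_specular(normal, light_dir, view_dir, shininess=32):
    """Blinn's half-vector specular term: the highlight peaks where the
    surface normal lines up with the vector halfway between the light
    and view directions.  All directions are unit-length 3-tuples."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    half = tuple(l + v for l, v in zip(light_dir, view_dir))
    length = math.sqrt(dot(half, half))
    if length == 0.0:  # light and view exactly opposite; no highlight
        return 0.0
    half = tuple(h / length for h in half)
    return max(dot(normal, half), 0.0) ** shininess
```

The half-vector avoids computing a reflection vector per pixel, which made the model cheaper than Phong's in common cases.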
 
Later in the 1980s, Blinn developed CGI animations for an [[Annenberg Foundation|Annenberg/CPB]] TV series, ''[[The Mechanical Universe]]'', which consisted of over 500 scenes for 52 half-hour programs describing physics and mathematics concepts for college students. This he followed with production of another series devoted to mathematical concepts, called ''[[Project Mathematics!]]''.<ref>[http://design.osu.edu/carlson/history/tree/jpl.html Jet Propulsion Lab (JPL) by Wayne Carlson] {{Webarchive|url=https://web.archive.org/web/20150724105628/http://design.osu.edu/carlson/history/tree/jpl.html |date=July 24, 2015 }} (retrieved July 3, 2012)</ref>
 
===Osaka University===
In 1982, Japan's [[Osaka University]] developed the [[Supercomputing in Japan|LINKS-1 Computer Graphics System]], a [[supercomputer]] that used up to 257 [[Zilog Z8000|Zilog Z8001]] [[microprocessor]]s to render realistic [[3D computer graphics|3D]] [[computer graphics]]. According to the Information Processing Society of Japan: "The core of 3D image rendering is calculating the luminance of each pixel making up a rendered surface from the given viewpoint, [[Computer graphics lighting|light source]], and object position. The LINKS-1 system was developed to realize an image rendering methodology in which each pixel could be parallel processed independently using [[Ray tracing (graphics)|ray tracing]]. By developing a new software methodology specifically for high-speed image rendering, LINKS-1 was able to rapidly render highly realistic images." It was "used to create the world's first 3D [[planetarium]]-like video of the entire [[Universe|heavens]] that was made completely with computer graphics. The video was presented at the [[Fujitsu]] pavilion at the 1985 International Exposition in [[Tsukuba, Ibaraki|Tsukuba]]."<ref>{{Cite web | url=http://museum.ipsj.or.jp/en/computer/other/0013.html | title=LINKS-1 Computer Graphics System-Computer Museum}}</ref> The LINKS-1 was the world's most powerful computer as of 1984.<ref>{{cite book | last=Defanti | first=Thomas A. | title=Advances in Computers | chapter=The Mass Impact of Videogame Technology | publisher=Elsevier | volume=23 | date=1984 | isbn=978-0-12-012123-6 | doi=10.1016/s0065-2458(08)60463-5 | doi-access=free | url=http://www.vasulka.org/archive/Writings/VideogameImpact.pdf#page=29 | access-date=2025-08-30 | pages=93–140}}</ref>
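The per-pixel independence the quote describes is what makes ray tracing easy to parallelize: each pixel's luminance can be computed with no reference to any other pixel. A toy sketch of that structure in Python (the one-disc "scene" is invented, and LINKS-1 of course used dedicated hardware and software, not anything like this code):

```python
from multiprocessing import Pool

def trace_pixel(args):
    """Compute one pixel entirely on its own.  The toy 'scene' is a disc
    of radius 0.5 centred in a [-1, 1] x [-1, 1] viewport."""
    x, y, width, height = args
    u = 2 * x / (width - 1) - 1
    v = 2 * y / (height - 1) - 1
    return 1.0 if u * u + v * v <= 0.25 else 0.0

def render(width, height, workers=4):
    """Farm the pixels out to worker processes; any partition of the
    pixel list works, because no pixel depends on another."""
    coords = [(x, y, width, height) for y in range(height) for x in range(width)]
    with Pool(workers) as pool:
        return pool.map(trace_pixel, coords)
```

The same structure scales from a handful of processes on one machine to the hundreds of processors LINKS-1 applied to the problem.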
 
===3-D Fictional Animated Films at the University of Montreal===
 
===Sun Microsystems, Inc===
The [[Sun Microsystems]] company was founded in 1982 by [[Andy Bechtolsheim]] with other fellow graduate students at [[Stanford University]]. Bechtolsheim originally designed the SUN computer as a personal [[Computer-aided design|CAD]] workstation for the Stanford University Network (hence the acronym "SUN"). It was designed around the Motorola 68000 processor with the Unix operating system and virtual memory, and, like SGI, had an embedded frame buffer.<ref>[ftp://reports.stanford.edu/pub/cstr/reports/csl/tr/82/229/CSL-TR-82-229.pdf "The SUN Workstation Architecture"]{{dead link|date=May 2025|bot=medic}}{{cbignore|bot=medic}}, Andreas Bechtolsheim, Forest Baskett, Vaughan Pratt, March 1982, ''Stanford University Computer Systems Laboratory Technical Report No. 229'' (retrieved July 28, 2009).</ref> Later developments included computer servers and workstations built on its own RISC-based processor architecture and a suite of software products such as the Solaris operating system, and the Java platform. By the 1990s, Sun workstations were popular for rendering in 3-D CGI filmmaking—for example, [[Disney]]-[[Pixar]]'s 1995 movie ''[[Toy Story]]'' used a [[render farm]] of 117 Sun workstations.<ref>[[Toy Story#Animation|Animation and Rendering on ''Toy Story'']]</ref> Sun was a proponent of [[Open system (computing)|open systems]] in general and [[Unix]] in particular, and a major contributor to [[open source software]].<ref>[http://www.stanford.edu/group/wellspring/sun_spotlight.html "Wellspring of Innovation: Sun Microsystems Spotlight"] {{Webarchive|url=https://web.archive.org/web/20090517063315/http://www.stanford.edu/group/wellspring/sun_spotlight.html |date=May 17, 2009 }} Stanford.edu (retrieved July 28, 2009).</ref>
 
===National Film Board of Canada===
The following years saw a greatly increased uptake of digital animation techniques, with many new studios going into production, and existing companies making a transition from traditional techniques to CGI. Between 1995 and 2005 in the US, the average effects budget for a wide-release feature film leapt from $5&nbsp;million to $40&nbsp;million. According to Hutch Parker, President of Production at [[20th Century Fox]], {{As of|2005|lc=on}}, "50 percent of feature films have significant effects. They're a character in the movie." However, CGI-heavy films have made up for the expenditure by grossing over 20% more than their live-action counterparts, and by the early 2000s computer-generated imagery had become the dominant form of special effects.<ref>[https://www.wired.com/wired/archive/13.02/fxgods.html "F/X Gods" by Anne Thompson, Wired.com] (retrieved August 3, 2012).</ref>
 
[[Warner Bros. Animation|Warner Bros]]' 1999 film ''[[The Iron Giant]]'' was the first traditionally animated feature to have a major character, the title character, rendered fully in CGI.<ref>{{Citation|title=The Iron Giant (1999) – IMDb|url=http://www.imdb.com/title/tt0129167/trivia|access-date=March 30, 2021}}</ref>
 
===Motion-capture===
Computer-based motion-capture started as a [[photogrammetric]] analysis tool in [[biomechanics]] research in the 1970s and 1980s.<ref>{{cite journal| doi=10.1016/j.cub.2005.08.016 | pmid=16111929 | volume=15 | issue=16 | title=Mechanics of animal movement | year=2005 | journal=Current Biology | pages=R616–R619 | last1 = Alexander | first1 = R. McNeill| s2cid=14032136 | doi-access=free }}</ref> A performer wears markers near each joint to identify the motion by the positions or angles between the markers. Many different types of markers can be used—lights, reflective markers, LEDs, infra-red, inertial, mechanical, or wireless RF—and may be worn as a form of suit, or attached directly to a performer's body. Some systems include details of face and fingers to capture subtle expressions, a technique often referred to as "[[performance-capture]]". The computer records the data from the markers, and uses it to animate digital character models in 2-D or 3-D computer animation; in some cases this can include camera movement as well. In the 1990s, these techniques became widely used for visual effects.
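Recovering a pose from the recorded markers is geometry on their positions; for example, the angle at a joint can be taken from three surrounding markers. A minimal sketch in Python (the marker layout is hypothetical):

```python
import math

def joint_angle(marker_a, marker_b, marker_c):
    """Angle at marker_b (say, an elbow) formed by markers a-b-c, in
    degrees, computed from their captured (x, y, z) positions."""
    ab = [p - q for p, q in zip(marker_a, marker_b)]
    cb = [p - q for p, q in zip(marker_c, marker_b)]
    dot = sum(p * q for p, q in zip(ab, cb))
    mag = (math.sqrt(sum(p * p for p in ab))
           * math.sqrt(sum(p * p for p in cb)))
    return math.degrees(math.acos(dot / mag))

# Shoulder, elbow, and wrist markers in a straight line: a fully
# extended arm reads as 180 degrees.
straight = joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0))
```

Repeating this per frame yields the joint-angle curves that drive the digital character's skeleton.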
 
Video games also began to use motion-capture to animate in-game characters. As early as 1988, an early form of motion-capture was used to animate the [[2D computer graphics|2-D]] main character of the [[Martech]] video game ''[[Vixen (video game)|Vixen]]'', which was performed by model [[Corinne Russell]].<ref>{{cite magazine|magazine=[[Retro Gamer]]|title=Martech Games – The Personality People|page=51|issue=133|first=Graeme|last=Mason|url=https://issuu.com/michelfranca/docs/retro_gamer____133}}</ref> Motion-capture was later notably used to animate the [[3D computer graphics|3-D]] character models in the [[Sega Model 2]] [[arcade game]] ''[[Virtua Fighter 2]]'' in 1994.<ref>{{cite web|last=Wawro|first=Alex|title=Yu Suzuki Recalls Using Military Tech to Make Virtua Fighter 2 |url=https://www.gamedeveloper.com/business/yu-suzuki-recalls-using-military-tech-to-make-i-virtua-fighter-2-i-|website=[[Gamasutra]]|access-date=August 18, 2016|date=October 23, 2014}}</ref> In 1995, examples included the [[Atari Jaguar]] CD-based game ''[[Highlander: The Last of the MacLeods]]'',<ref>[http://www.atarimax.com/freenet/freenet_material/6.16and32-BitComputersSupportArea/8.OnlineMagazines/showarticle.php?569 ''Atari Explorer Online''], Vol 04 Iss 09, January 1, 1996 (retrieved August 12, 2012).</ref><ref>[http://radoff.com/blog/2008/08/22/anatomy-of-an-mmorpg/ Jon Radoff, "Anatomy of an MMORPG"] {{webarchive|url=https://web.archive.org/web/20091213053756/http://radoff.com/blog/2008/08/22/anatomy-of-an-mmorpg/ |date=December 13, 2009 }} (retrieved August 12, 2012).</ref> and the arcade [[fighting game]] ''[[Soul Edge]]'', which was the first video game to use [[Motion capture#Passive markers|passive optical]] motion-capture technology.<ref>{{Cite web | url=http://www.motioncapturesociety.com/resources/industry-history | title=History of Motion Capture | access-date=September 14, 2014 | archive-url=https://web.archive.org/web/20120514044040/http://www.motioncapturesociety.com/resources/industry-history | archive-date=May 14, 2012 | url-status=dead }}</ref>
 
Another breakthrough where a cinema film used motion-capture was creating hundreds of digital characters for the film ''[[Titanic (1997 film)|Titanic]]'' in 1997. The technique was used extensively in 1999 to create Jar-Jar Binks and other digital characters in ''[[Star Wars: Episode I – The Phantom Menace]]''.
 
* [[Wavefront Technologies|Wavefront]] followed the success of ''Personal Visualiser'' with the release of ''Dynamation'' in 1992, a powerful tool for interactively creating and modifying realistic, natural images of dynamic events. In 1993, Wavefront acquired Thomson Digital Images (TDI), with their innovative product ''Explore'', a tool suite that included ''3Design'' for modelling, ''Anim'' for animation, and ''Interactive Photorealistic Renderer'' (IPR) for rendering. In 1995, Wavefront was bought by [[Silicon Graphics]], and merged with [[Alias Systems Corporation|Alias]].<ref>[http://design.osu.edu/carlson/history/lesson8.html#wavefront "Commercial animation software companies – Wavefront"] {{Webarchive|url=https://web.archive.org/web/20140618212520/http://design.osu.edu/carlson/history/lesson8.html#wavefront |date=June 18, 2014 }}, Wayne Carlson, Ohio State University (retrieved September 3, 2012).</ref>
* [[Alias Systems Corporation|Alias Research]] continued the success of ''[[PowerAnimator]]'' with movies like ''[[Terminator 2: Judgment Day]]'', ''[[Batman Returns]]'' and ''[[Jurassic Park (film)|Jurassic Park]]'', and in 1993 started development of new entertainment software, later named ''[[Autodesk Maya|Maya]]''. Alias found customers in animated film, TV series, visual effects, and video games, including many prominent studios, such as [[Industrial Light & Magic]], [[Pixar]], [[Sony Pictures Imageworks]], [[Walt Disney]], and [[Warner Bros.]]. Other Alias products were developed for applications in architecture and engineering. In 1995, SGI purchased both Alias Research and Wavefront in a 3-way deal, and the merged company [[Alias Systems Corporation|Alias Wavefront]] was launched.<ref>[http://design.osu.edu/carlson/history/lesson8.html#aliasresearch "Commercial animation software companies – Alias Research"] {{Webarchive|url=https://web.archive.org/web/20140618212520/http://design.osu.edu/carlson/history/lesson8.html#aliasresearch |date=June 18, 2014 }}, Wayne Carlson, Ohio State University (retrieved September 3, 2012).</ref>
* [[Alias Systems Corporation|Alias Wavefront]]'s new mission was to focus on developing the world's most advanced tools for the creation of digital content. ''[[PowerAnimator]]'' continued to be used for visual effects and movies (such as ''[[Toy Story]]'', ''[[Casper (film)|Casper]]'', and ''[[Batman Forever]]''), and also for video games. Further development of the ''Maya'' software went ahead, adding new features such as motion-capture, facial animation, motion blur, and "time warp" technology. [[Computer-aided design|CAD]] industrial design products like ''[[Autodesk AliasStudio|AliasStudio]]'' and ''Alias Designer'' became standardized on Alias|Wavefront software. In 1998, Alias|Wavefront launched ''[[Autodesk Maya|Maya]]'' as its new 3-D flagship product, and this soon became the industry's most important animation tool. ''Maya'' was the merger of three packages—Wavefront's ''Advanced Visualizer'', Alias's ''Power Animator'', and TDI's ''Explore''. In 2003 the company was renamed simply "Alias". In 2004, SGI sold the business to a private investment firm, and it was later renamed to [[Alias Systems Corporation]]. In 2006, the company was bought by [[Autodesk]].<ref>[http://design.osu.edu/carlson/history/lesson8.html#aw "Commercial animation software companies – Alias|Wavefront"] {{Webarchive|url=https://web.archive.org/web/20140618212520/http://design.osu.edu/carlson/history/lesson8.html#aw |date=June 18, 2014 }}, Wayne Carlson, Ohio State University (retrieved September 3, 2012).</ref><ref>[https://web.archive.org/web/20040622205615/http://www.aliaswavefront.com/eng/about/history/index.shtml "About Alias"] at ''Wayback Machine'' (retrieved September 3, 2012).</ref>
* [[Softimage (company)|Softimage]] developed further features for ''Creative Environment'', including the ''Actor Module'' (1991) and ''Eddie'' (1992), including tools such as inverse kinematics, enveloping, metaclay, flock animation, and many others. Softimage customers include many prominent production companies, and Softimage has been used to create animation for hundreds of major feature films and games. In 1994, [[Microsoft]] acquired Softimage, and renamed the package ''[[Softimage 3D]]'', releasing a [[Windows NT]] port two years later.<ref>[http://www.microsoft.com/presspass/press/1996/jan96/3danimpr.mspx "3D – press release"] {{Webarchive|url=https://web.archive.org/web/20111229141042/http://www.microsoft.com/presspass/press/1996/jan96/3danimpr.mspx |date=December 29, 2011 }}, ''Microsoft'', 1996-1 (retrieved July 7, 2012).</ref><ref>[https://www.nytimes.com/1994/02/15/business/company-news-an-acquisition-by-microsoft.html "COMPANY NEWS; An Acquisition By Microsoft"], ''The New York Times'', February 15, 1994 (retrieved July 7, 2012).</ref> In 1998, after helping to port the products to Windows and financing the development of ''[[Autodesk Softimage|Softimage]]'' and ''Softimage|DS'', Microsoft sold the Softimage unit to [[Avid Technology]], who was looking to expand its visual effect capabilities. Then, in 2008, Autodesk acquired the brand and the animation assets of Softimage from Avid, thereby ending Softimage Co. as a distinct entity. 
The video-related assets of Softimage, including ''Softimage|DS'' (now ''Avid|DS'') continue to be owned by Avid.<ref>[http://www.prnewswire.co.uk/cgi/news/release?id=35215 "Pr Newswire Uk: Avid Technology To Acquire Softimage Subsidiary Of Microsoft Corporation"], ''Prnewswire.co.uk'' (retrieved July 7, 2012).</ref><ref>[http://design.osu.edu/carlson/history/lesson8.html#softimage "Commercial animation software companies – Softimage"] {{Webarchive|url=https://web.archive.org/web/20140618212520/http://design.osu.edu/carlson/history/lesson8.html#softimage |date=June 18, 2014 }}, Wayne Carlson, Ohio State University (retrieved September 3, 2012).</ref>
* [[Poser (software)|Poser]] is another DIY 3-D graphics program especially aimed at user-friendly animation of [[wikt:soft|soft]] objects
* [[Pointstream Software]] is a professional [[optical flow]] program that uses a [[pixel]] as its basic primitive, usually tracked across a [[multi-camera setup]]. It was developed by [[Arius3D]], makers of the [[Cartesian coordinate system|XYZ]] [[RGB]] [[3D scanner|scanner]], and was used in the production of the ''Matrix'' sequels
* [[Adobe Substance]] is software that allows artists to create 3-D assets, models, materials, patterns, and lighting.
 
==CGI in the 2010s==
| url = http://www.debevec.org
| access-date = July 31, 2013}}
</ref> The end result, ''Digital Ira'',<ref name="Deb2013"/> both precomputed and [[real-time computer graphics|rendered in real time]] on state-of-the-art [[graphics processing unit]]s, looks fairly realistic. Techniques previously confined to high-end virtual cinematography systems are rapidly moving into video games and [[leisure]] [[Application software|applications]].
 
==Further developments==
[[Category:Computer-related introductions in 1960]]
[[Category:Computer animation| ]]
[[Category:History of animation|computer animation]]
[[Category:History of computing|computer animation]]
[[Category:New media]]
[[Category:Multimedia]]