<ref>[https://ismm.ircam.fr/people/ Past and current members of the Sound Music Movement Interaction team at IRCAM]</ref>
 
Under the supervision of [https://www.ircam.fr/person/frederic-bevilacqua/ Frederic Bevilacqua], he began experimenting with applying [[Artificial Intelligence]] to human movement. He contributed to the creation of [http://ismm.ircam.fr/gesture-follower/ Gesture Follower], software that analyses the body movements of [[Performing_arts|performers]] and dancers through [[Motion_detection|motion sensors]] in order to control sound and visual media in [[real-time]]. Different gestures could trigger different media, and the manner in which a gesture was performed could slow down or speed up the media's playback.
<ref>[https://www.upf.edu/web/mtg/news/-/asset_publisher/WM181VyAQipW/content/id/8743476/maximized#.YASOX-lKhTY Seminar by Bruno Zamborlin on gesture interaction], Music Technology Group, Pompeu Fabra University, 5 October 2011</ref>