Under the supervision of [https://www.ircam.fr/person/frederic-bevilacqua/ Frederic Bevilacqua], he began experimenting with [[Artificial Intelligence]] applied to human movement. He contributed to the creation of [http://ismm.ircam.fr/gesture-follower/ Gesture Follower], a software system that analyses the body movements of performers and dancers through [[Motion_detection|motion sensors]] in order to control sound and visual media in [[real-time]]. Different gestures could trigger different media, as well as slow down or speed up their playback depending on how the gestures were performed.
<ref>[https://www.upf.edu/web/mtg/news/-/asset_publisher/WM181VyAQipW/content/id/8743476/maximized#.YASOX-lKhTY Seminar by Bruno Zamborlin on Gesture interaction], Music Technology Group, University of Pompeu Fabra, 5 October 2011</ref>
In 2011 Zamborlin relocated to London, where in 2013 he was awarded a joint [[Doctor_of_Philosophy|PhD]] in AI between [[Goldsmiths,_University_of_London|Goldsmiths, University of London]] and [[IRCAM|IRCAM - Centre Pompidou]]/[[Pierre_and_Marie_Curie_University|Pierre and Marie Curie University, Paris 6]], with the title [https://edb.upmc.fr/projet-recherche-doctoraux/afficher/4184 'Designing for Appropriation of Digital Music Technology']. The thesis focuses on the concept of Interactive Machine Learning.