
This is an old revision of this page, as edited by Generalartificialintelligence (talk | contribs) at 19:55, 17 January 2021 (Early life & education).
Bruno Zamborlin
Bruno Zamborlin in 2018
Born: August 1983, Vicenza, Italy
Alma mater: Goldsmiths, University of London
Occupations: Chief executive officer, entrepreneur, artist
Known for: Mogees, HyperSurfaces


Bruno Zamborlin, PhD (born 1983 in Vicenza) is an Italian AI researcher and entrepreneur based in London, UK. He is considered a pioneer in the field of human–computer interaction.[1][2][3] His work focuses on converting physical objects into touch-sensitive, interactive surfaces using vibration sensors and artificial intelligence.

In 2013 he founded Mogees Limited, a London-based start-up whose products enable users to transform everyday objects into musical instruments and games using a vibration sensor and a mobile phone. In 2017 he founded HyperSurfaces, a London- and Los Angeles-based company whose technology converts physical surfaces of any material, shape and form into data-enabled, interactive surfaces using a vibration sensor and a coin-sized chipset.

As an artist, he produces art installations around the world and occasionally performs with the UK-based electronic music duo Plaid (Warp Records).

Early life & education

Zamborlin was born in Vicenza, Italy. After studying for a degree in Mathematics split between the University of Padua (Italy) and LIACS (Leiden Institute of Advanced Computer Science, Netherlands), he completed a master's degree in Computer Science at the University of Bologna (Italy), which was awarded summa cum laude.

In 2008 Zamborlin moved to Paris, where he lived for three years. During this time he worked at IRCAM (Institute for Research and Coordination in Acoustics/Music) – Centre Pompidou as a member of the Sound Music Movement Interaction team.[4]

Under the supervision of Frédéric Bevilacqua, he started experimenting with applying Artificial Intelligence to human movement. He contributed to the creation of Gesture Follower, a software tool that analyses the body movements of performers and dancers through motion sensors in order to control sound and visual media in real time. Different gestures can trigger different media, and the way a gesture is performed can slow down or speed up the media's playback.[5]
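The core idea behind such a system — estimating in real time how far a live gesture has progressed through a pre-recorded template, so that media playback can be synchronised to the performer — can be sketched as follows. This is a deliberately naive illustration using a nearest-frame search; the actual Gesture Follower is based on probabilistic models, and all names here are invented for the example.

```python
import numpy as np

class ToyGestureFollower:
    """Toy follower: given a pre-recorded gesture template (one sensor
    frame per time step), estimate in real time how far a live gesture
    has progressed, so media playback can track the performer.
    Illustrative only -- the real Gesture Follower uses probabilistic
    models rather than this nearest-frame search."""

    def __init__(self, template: np.ndarray, search: int = 5):
        self.template = template  # shape (T, n_features)
        self.search = search      # how many frames ahead we may jump
        self.position = 0         # current index within the template

    def follow(self, frame: np.ndarray) -> float:
        """Consume one live sensor frame and return progress in [0, 1]."""
        lo = self.position
        hi = min(len(self.template), self.position + self.search + 1)
        # Match the live frame against a small look-ahead window of the
        # template; moving only forward keeps the estimate monotonic.
        dists = np.linalg.norm(self.template[lo:hi] - frame, axis=1)
        self.position = lo + int(np.argmin(dists))
        return self.position / (len(self.template) - 1)
```

A gesture performed twice as fast as the template advances the progress estimate two template frames per live frame; the rate of change of that progress is what a system like this can use to speed up or slow down playback.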

In 2011 Zamborlin relocated to London, where in 2013 he was awarded a joint PhD in AI between Goldsmiths, University of London and IRCAM – Centre Pompidou/Pierre and Marie Curie University (Paris 6), with the thesis 'Designing for Appropriation of Digital Music Technology'. The thesis focuses on the concept of interactive machine learning [6] applied to digital musical instruments and the performing arts. The jury panel included Reactable inventor Sergi Jordà.

Business career

Mogees

Zamborlin founded Mogees Limited in 2013 in London, with IRCAM amongst the early investors in the company.

Mogees is a product that enables users to transform physical objects into musical instruments and games using a vibration sensor and a series of apps for smartphones and desktop computers.[7][8][9][10][11]

After a successful crowdfunding campaign on Kickstarter in 2014,[12] Mogees reportedly sold more than 100,000 units worldwide and has been used by artists such as Rodrigo y Gabriela,[13] Jean-Michel Jarre[14] and Plaid.[15]

HyperSurfaces

In 2017 Zamborlin founded HyperSurfaces together with computational artist Parag K Mital.[16] HyperSurfaces is a technology that converts surfaces of any material, shape and size into data-enabled interactive objects, employing a vibration sensor and proprietary AI algorithms running on a coin-sized chipset.[17] The vibrations generated by people's interactions with a surface are converted into an electric signal by a piezoelectric sensor and analysed in real time by the AI algorithms running on the chipset. Whenever the AI recognises in the vibration signal one of the events predefined by the user beforehand, a corresponding notification message is sent in real time.
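The pipeline described above — piezo signal in, feature extraction, matching against pre-trained event classes, notification out — can be sketched in outline. This is a minimal illustration and assumes nothing about HyperSurfaces' actual algorithms: a nearest-centroid classifier over two toy features (signal energy and zero-crossing rate) stands in for the proprietary AI, and all names are invented for the example.

```python
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Reduce one window of sensor signal to a tiny feature vector:
    energy and zero-crossing rate (illustrative stand-ins for real
    vibration-analysis features)."""
    energy = float(np.mean(window ** 2))
    zcr = float(np.mean(np.abs(np.diff(np.sign(window)))) / 2)
    return np.array([energy, zcr])

class EventDetector:
    """Nearest-centroid classifier: one feature centroid per event
    class, trained beforehand (i.e. supervised learning), small enough
    to run on an embedded chip without any cloud service."""

    def __init__(self):
        self.centroids = {}  # event name -> feature centroid

    def train(self, event: str, windows: list):
        """Learn one event class from example signal windows."""
        feats = np.array([extract_features(w) for w in windows])
        self.centroids[event] = feats.mean(axis=0)

    def detect(self, window: np.ndarray, threshold: float = 0.1):
        """Return the best-matching predefined event, or None if no
        class is close enough -- the 'send a notification' trigger."""
        feats = extract_features(window)
        best, best_dist = None, threshold
        for event, centroid in self.centroids.items():
            dist = float(np.linalg.norm(feats - centroid))
            if dist < best_dist:
                best, best_dist = event, dist
        return best
```

Training happens offline on labelled examples ("knock", "swipe", …); at run time only the cheap `detect` step executes per window, which is what makes this style of classifier plausible on a coin-sized chipset.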

The technology can be applied to multiple industry sectors, ranging from button-less human–computer interaction applications for automotive and smart homes to the Internet of Things.[18][19]

Because the AI algorithms employed by HyperSurfaces run locally on a chipset, without the need to access cloud-based services, they are considered part of the field of edge AI. And because the AI is trained beforehand to recognise the events its users are interested in, the HyperSurfaces algorithms belong to the field of supervised machine learning.

Art installations and music videos

Selected talks

Selected awards

Patents and academic publications

  • United States pending US10817798B2, Bruno Zamborlin & Carmine Emanuele Cella, "Method to recognize a gesture and corresponding device", published 2016-04-27, assigned to Mogees Limited 
  • GB Pending WO/2019/086862, Bruno Zamborlin; Conor Barry & Alessandro Saccoia et al., "A user interface for vehicles", published 2019-05-09, assigned to Mogees Limited 
  • GB Pending WO/2019/086863, Bruno Zamborlin; Conor Barry & Alessandro Saccoia et al., "Trigger for game events", published 2019-05-09, assigned to Mogees Limited 
  • Bevilacqua, Frédéric; Zamborlin, Bruno; Sypniewski, Anthony; Schnell, Norbert; Guédy, Fabrice; Rasamimanana, Nicolas (2010). "Continuous Realtime Gesture Following and Recognition". Lecture Notes in Computer Science. 5934: 73–84. doi:10.1007/978-3-642-12553-9_7. Retrieved 17 January 2021.
  • Rasamimanana, Nicolas; Bevilacqua, Frédéric; Schnell, Norbert; Guédy, Fabrice; Flety, Emmanuel; Maestracci, Come; Zamborlin, Bruno (January 2010). "Modular musical objects towards embodied control of digital music". TEI '11: Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction: 9–12. doi:10.1145/1935701.1935704. Retrieved 17 January 2021.
  • Bevilacqua, Frédéric; Schnell, Norbert; Rasamimanana, Nicolas; Zamborlin, Bruno; Guedy, Fabrice (2011). "Online Gesture Analysis and Control of Audio Processing". Springer Tracts in Advanced Robotics. 74: 127–142. doi:10.1007/978-3-642-22291-7_8. Retrieved 17 January 2021.
  • Zamborlin, Bruno; D'Inverno, Mark (2014-01-15). "Fluid gesture interaction design: Applications of continuous recognition for the design of modern gestural interfaces". ACM Transactions on Interactive Intelligent Systems: 22. doi:10.1145/2543921. Retrieved 17 January 2021.
  • Leslie, Grace; Jodlowski, Pierre (2010-06-15). "A Collaborative, Interactive Sound Installation". Proceedings of the International Computer Music Conference (ICMC). Retrieved 17 January 2021.
  • Kimura, Mari; Rasamimanana, Nicolas; Bevilacqua, Frédéric; Zamborlin, Bruno; Schnell, Norbert; Flety, Emmanuel (2012). "Extracting Human Expression For Interactive Composition with the Augmented Violin". International Conference on New Interfaces for Musical Expression (NIME). Retrieved 17 January 2021.
  • Ferretti, Stefano; Roccetti, Marco; Zamborlin, Bruno (2009-01-13). "On SPAWC: Discussion on a Musical Signal Parser and Well-Formed Composer". 6th IEEE Consumer Communications and Networking Conference. doi:10.1109/CCNC.2009.4784966. Retrieved 17 January 2021.
  • Zamborlin, Bruno; Partesana, Giorgio; Liuni, Marco (2011-05-15). "(LAND)MOVES". Conference on New Interfaces for Musical Expression, NIME: 537–538. Retrieved 17 January 2021.

References:

  1. ^ Vdovin, Marsha (23 June 2014). "An Interview with Bruno Zamborlin". Cycling74. San Francisco. Retrieved 17 January 2019.
  2. ^ Tardif, Antoine (29 December 2020). "Bruno Zamborlin, CEO and Chief Scientist at Hypersurfaces – Interview Series". unite.ai.
  3. ^ "Bruno Zamborlin, PhD Feature". Coruzant Technologies. 1 July 2020.
  4. ^ Past and current members of the Sound Music Movement Interaction team at IRCAM
  5. ^ Seminar by Bruno Zamborlin on Gesture interaction, Music Technology Group, University of Pompeu Fabra, 5 October 2011
  6. ^ Interactive machine learning: experimental evidence for the human in the algorithmic loop. Holzinger, A., Plass, M., Kickmeier-Rust, M. et al. Interactive machine learning: experimental evidence for the human in the algorithmic loop. Appl Intell 49, 2401–2414 (2019)
  7. ^ Nagle, Paul (March 2016). "Mogees: Resynthesis App & Sensor For iOS & Mac". Sound on Sound. Retrieved 2 October 2020.
  8. ^ Solon, Olivia (1 April 2012). "Mogees Project Turns Any Surface Into a Gestural Musical Interface". Wired.com. Retrieved 2 October 2020.
  9. ^ O'Hear, Steve (25 May 2017). "Mogees picks up seed funding to put audio-based gesture recognition tech into new devices". TechCrunch. Retrieved 2 October 2020.
  10. ^ Madelaine, Nicolas (22 August 2016). "Mogees, ou la réalité virtuelle sonore pour tous". Les Echos. Retrieved 2 October 2020.
  11. ^ Rociola, Arcangelo (30 September 2014). "Mogees: an Italian startup that is making the whole world play music (from trees to DJs)". StartupItalia. Retrieved 10 October 2020.
  12. ^ "Kickstarter success for gadget that turns everyday objects into instruments". Fact Magazine. 5 March 2014. Retrieved 15 October 2020.
  13. ^ Rodrigo y Gabriela's website
  14. ^ Bruno Zamborlin and Mogees on Jean Michel Jarre website
  15. ^ Plaid and Bruno Zamborlin, ELEX music video
  16. ^ Parag K Mital's website
  17. ^ O'Hear, Steve (20 November 2018). "HyperSurfaces turns any surface into a user interface using vibration sensors and AI". Techcrunch. Retrieved 17 January 2021.
  18. ^ Ridden, Paul (20 November 2018). "HyperSurfaces uses AI to make object interfacing more natural". NewsAtlas. Retrieved 17 January 2021.
  19. ^ "HyperSurfaces – Seamlessly Merging The Physical And Data Worlds". TechCompanyNews. 26 November 2018. Retrieved 17 January 2021.