{{Infobox person
| name = Bruno Zamborlin
| image = BrunozamborlinPhotograph.jpg
| alt = Bruno Zamborlin in 2018
| birth_date = {{birth year and age|1983|08}}
| birth_place = [[Vicenza]], [[Italy]]
| alma_mater = [[Goldsmiths,_University_of_London|Goldsmiths, University of London]]
| occupation = Chief Executive Officer <br> Entrepreneur <br> Artist
| known_for = [https://www.mogees.co.uk/ Mogees], [https://www.hypersurfaces.com/ HyperSurfaces]
}}
 
 
'''Bruno Zamborlin''' (born 1983 in [[Vicenza]]) is an Italian [[Artificial intelligence|AI]] researcher and [[entrepreneur]] based in [[London]], [[UK]]. He is considered a pioneer in the field of [[Human–computer_interaction|human–computer interaction]]. His work focuses on converting physical objects into touch-sensitive, interactive surfaces using vibration sensors and [[artificial intelligence]].
<ref>{{cite news
| last = Vdovin
| first = Marsha
| date = 23 June 2014
| title = An Interview with Bruno Zamborlin
| url = https://cycling74.com/articles/an-interview-with-bruno-zamborlin
| work = Cycling74
| ___location = San Francisco
| access-date = 17 January 2019
}}</ref>
<ref>{{cite news
| last = Tardif
| first = Antoine
| date = 29 December 2020
| title = Bruno Zamborlin, CEO and Chief Scientist at Hypersurfaces – Interview Series
| url = https://www.unite.ai/bruno-zamborlin-ceo-and-chief-scientist-at-hypersurfaces-interview-series/
| work = unite.ai
}}</ref>
<ref>{{cite news
| author =<!--Staff writer(s)/no by-line.-->
| title = Bruno Zamborlin, PhD Feature
| url = https://coruzant.com/feature/bruno-zamborlin-phd/
| work = Coruzant Technologies
| date = 1 July 2020
}}</ref>
 
In 2013 he founded [https://www.mogees.co.uk/ Mogees Limited], a London-based start-up whose products enable users to transform everyday objects into musical instruments and games using a vibration sensor and a mobile phone.
In 2017 he founded [https://www.hypersurfaces.com/ HyperSurfaces], a company based in London and Los Angeles whose technology converts physical surfaces of any material, shape and form into data-enabled, interactive surfaces using a vibration sensor and a coin-sized chipset.
As an artist, he produces art installations around the world and performs with UK-based electronic music duo [[Plaid_(band)|Plaid]] ([[Warp_(record_label)|Warp Records]]).
 
== Early life & education ==
Zamborlin was born in Vicenza, Italy.
He studied for a degree in Mathematics jointly at the [[University of Padua]] (Italy) and [http://liacs.leidenuniv.nl/ LIACS] (Leiden Institute of Advanced Computer Science, Netherlands), then completed a master's degree in Computer Science at the [[University of Bologna]] (Italy), graduating summa cum laude.
In 2008 Zamborlin moved to [[Paris]], where he lived for three years, working at [[IRCAM]] (Institute for Research and Coordination in Acoustics/Music) – [[Centre Pompidou]] as a member of the [http://ismm.ircam.fr/people/ Sound Music Movement Interaction] team.
 
Under the supervision of [https://www.ircam.fr/person/frederic-bevilacqua/ Frédéric Bevilacqua], he started experimenting with the use of [[artificial intelligence]] and human movement. He contributed to the creation of [http://ismm.ircam.fr/gesture-follower/ Gesture Follower], a software tool that analyses the body movements of performers and dancers through [[Motion_detection|motion sensors]] in order to control sound and visual media in [[Real-time computing|real time]]. Different gestures could trigger different media, as well as slow down or speed up their playback depending on how the gestures were performed.
In 2011 Zamborlin relocated to London, where in 2013 he was awarded a joint [[Doctor_of_Philosophy|PhD]] in AI between [[Goldsmiths,_University_of_London|Goldsmiths, University of London]] and [[IRCAM|IRCAM – Centre Pompidou]]/[[Pierre_and_Marie_Curie_University|Pierre and Marie Curie University, Paris 6]], with the thesis [https://edb.upmc.fr/projet-recherche-doctoraux/afficher/4184 'Designing for Appropriation of Digital Music Technology']. The thesis focuses on the concept of interactive machine learning
<ref name="iml">Holzinger, A., Plass, M., Kickmeier-Rust, M. et al. [https://link.springer.com/article/10.1007/s10489-018-1361-5 Interactive machine learning: experimental evidence for the human in the algorithmic loop]. Appl Intell 49, 2401–2414 (2019)</ref>
applied to digital musical instruments and performing arts.
The jury panel included the [[Reactable]]'s inventor [[Sergi_Jorda|Sergi Jordà]].
 
==Business career==
 
===Mogees===
Zamborlin founded Mogees Limited in 2013 in [[London]], with [[IRCAM]] amongst the early investors in the company.
 
Mogees is a product that enables users to transform physical objects into musical instruments and games using a [[Piezoelectric_sensor|vibration sensor]] and a series of apps for [[smartphones]] and [[Desktop_computer|desktop computers]].
<ref>{{cite news
| last = Nagle
| first = Paul
| date = March 2016
| title = Mogees: Resynthesis App & Sensor For iOS & Mac
| url = https://www.soundonsound.com/reviews/mogees
| work = Sound on Sound
| access-date = 2 October 2020
}}</ref>
<ref>{{cite news
| last = Solon
| first = Olivia
| date = 1 April 2012
| title = Mogees Project Turns Any Surface Into a Gestural Musical Interface
| url = https://www.wired.com/2012/01/mogees/
| work = Wired.com
| access-date = 2 October 2020
}}</ref>
<ref>{{cite news
| last = O'Hear
| first = Steve
| date = 25 May 2017
| title = Mogees picks up seed funding to put audio-based gesture recognition tech into new devices
| url = https://techcrunch.com/2017/05/25/mogees-seed/
| work = TechCrunch
| access-date = 2 October 2020
}}</ref>
<ref>{{cite news
| last = Madelaine
| first = Nicolas
| date = 22 August 2016
| title = Mogees, ou la réalité virtuelle sonore pour tous
| url = https://www.lesechos.fr/2016/08/mogees-ou-la-realite-virtuelle-sonore-pour-tous-215245
| work = Les Echos
| access-date = 2 October 2020
}}</ref>
<ref>{{cite news
| last = Rociola
| first = Arcangelo
| date = 30 September 2014
| title = Mogees: an Italian’s startup that is making the whole world play music (from trees to DJ’s)
| url = https://startupitalia.eu/146989-20140930-mogees-italians-startup-making-whole-world-play-music-trees-djs
| work = StartupItalia
| access-date = 10 October 2020
}}</ref>
 
After a successful [[crowdfunding]] campaign on [[Kickstarter]] in 2014,<ref>{{cite news
| author =<!--Staff writer(s)/no by-line.-->
| date = 5 March 2014
| title = Kickstarter success for gadget that turns everyday objects into instruments
| url = https://www.factmag.com/2014/03/05/mogees-kickstarter-success-for-gadget-that-turns-everyday-objects-into-instruments/
| work = Fact Magazine
| access-date = 15 October 2020
}}</ref> Mogees reportedly sold 100,000 units and has been used by artists such as
[[Rodrigo_y_Gabriela|Rodrigo y Gabriela]],<ref name="ryg">[http://www.rodgab.com/rod-gab-mogees/ Rodrigo y Gabriela's website]</ref> [[Jean-Michel_Jarre|Jean-Michel Jarre]]<ref name="jmj">[http://vientdemee.blogspot.com/2014/06/bruno-zamborlin-mogees.html Bruno Zamborlin and Mogees on Jean Michel Jarre's website]</ref> and [[Plaid]].<ref name="elex">[https://www.youtube.com/watch?v=GPMqAEIBfJM&t=2s&ab_channel=brunozamborlin Plaid and Bruno Zamborlin, ELEX music video]</ref>
 
===HyperSurfaces===
In 2017 Zamborlin founded HyperSurfaces together with [[Computer_art|computational artist]] [https://scholar.google.com/citations?user=OwW_posAAAAJ&hl=en Parag K Mital].<ref name="pkm">[http://pkmital.com/home/ Parag K Mital's website]</ref>
HyperSurfaces is a technology that converts surfaces of any material, shape and size into data-enabled, interactive surfaces, employing a vibration sensor and proprietary [[AI]] algorithms running on a coin-sized [[chipset]].<ref>{{cite news
| last = O'Hear
| first = Steve
| date = 20 November 2018
| title = HyperSurfaces turns any surface into a user interface using vibration sensors and AI
| url = https://techcrunch.com/2018/11/20/hypersurfaces/
| work = TechCrunch
| access-date = 17 January 2021
}}</ref>
The vibrations generated by people's interactions with the surface are converted into an electrical signal by a [[Piezoelectric_sensor|piezoelectric sensor]] and analysed in [[Real-time computing|real time]] by the AI algorithms running on the chipset.
Whenever the AI recognises one of the user-predefined events in the vibration signal, a corresponding notification message is sent in real time.
 
The technology can be applied to multiple industry sectors, ranging from button-less [[Human–computer_interaction|human-computer interaction]] applications for automotive and [[Home_automation|smart home]] to the [[Internet_of_things|Internet of Things]].
<ref>{{cite news
| last = Ridden
| first = Paul
| date = 20 November 2018
| title = HyperSurfaces uses AI to make object interfacing more natural
| url = https://newatlas.com/hypersurfaces-ai-vibration-sensor-interface/57316/
| work = NewsAtlas
| access-date = 17 January 2021
}}</ref>
<ref>{{cite news
| author =<!--Staff writer(s)/no by-line.-->
| date = 26 November 2018
| title = HyperSurfaces – Seamlessly Merging The Physical And Data Worlds
| url = https://www.techcompanynews.com/hypersurfaces-seamlessly-merging-the-physical-and-data-worlds/
| work = TechCompanyNews
| access-date = 17 January 2021
}}</ref>
 
Because the AI algorithms employed by HyperSurfaces run locally on a chipset, without the need to access cloud-based services, they are considered part of the field of [[Edge_computing|edge AI]].
Because the algorithms are trained in advance to recognise the events their users are interested in, they also belong to the field of [[Supervised_learning|supervised machine learning]].
 
==Art installations and music videos==
 
* [https://www.youtube.com/watch?v=6rCTjBUNX9U&ab_channel=Playtronica Airplane as a musical instrument], Moscow Science Festival, June 2015
* [https://www.youtube.com/watch?v=DJ3h6J0CNZA&ab_channel=Mogees An entire public square in Milan transformed into a giant musical instrument], installation for [[Audi]], Salone del Mobile, May 2018
* [https://www.youtube.com/watch?v=EQhWGjdskDs&t=34s&ab_channel=yuvalgerstein Post Post], with Yuval Gerstein & Shuli Oded using Mogees
* [https://www.youtube.com/watch?v=o95Momw3-vA&ab_channel=brunozamborlin Diana - performable sound sculpture], MTV days, February 2015. Music by [[Plaid]]
* [https://www.youtube.com/watch?v=c5g8mzvKuA8&t=1s&ab_channel=brunozamborlin Secrets of Nature], London, March 2015
* [https://www.youtube.com/watch?v=bUtTOYakQ_o&ab_channel=MazdaUK Make beautiful music], Mazda 3 commercial, December 2014
 
==Selected talks==
* [https://www.ted.com/talks/bruno_zamborlin_how_vibration_can_turn_any_object_into_data_enabled_interface How vibration can turn any object into a data enabled interface], TEDx San Francisco, 3 October 2019
* [https://www.youtube.com/watch?v=RGbyFxw3-pY&ab_channel=TEDxTalks Transforming everyday objects into musical instruments], TEDx Brussels, 14 November 2012
* [https://www.youtube.com/watch?v=DSDsKn-7jRQ&ab_channel=TEDxTalks The Joy of Creating Music with Augmented Reality], TEDx Milan, 3 November 2017 (Italian language)
* [https://www.youtube.com/watch?v=iIUU_V8bL1s&t=119s&ab_channel=SennheiserMOMENTUM Testimonial for Sennheiser's 'Catch the Momentum' campaign]
* [https://www.youtube.com/watch?v=ysBt--l4Gxw&feature=youtu.be&ab_channel=brunozamborlin Bruno Zamborlin talks about Mogees and Plaid on Wired UK]
 
==Selected awards==
* [http://jsm2012.irisa.fr/toppage.php?page=prixjc IRISA Prix Jeune Chercheur], 13 October 2012
* [http://www.nemode.ac.uk/?page_id=814 NeMoDe, New Economic Models in the Digital Economy], 25 October 2012
 
==Patents and academic publications==
* {{cite patent
| country = United States
| number = US10817798B2
| status = granted
| title = Method to recognize a gesture and corresponding device
| pubdate = 2016-04-27
| pridate = 2016-04-27
| inventor = Bruno Zamborlin
| invent1 = Baptiste Caramiaux
| invent2 = Carmine Emanuele Cella
| assign1 = Mogees Limited
| class = G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
| url = https://patents.google.com/patent/US10817798B2/en
}}
* {{cite patent
| country = GB
| number = WO/2019/086862
| status = Pending
| title = A user interface for vehicles
| pubdate = 2019-05-09
| pridate = 2017-10-31
| inventor = Bruno Zamborlin
| invent1 = Parag Mital
| invent2 = Conor Barry
| invent3 = Alessandro Saccoia
| invent4 = Baptiste Caramiaux
| assign1 = Mogees Limited
| class = G06F 3/01
| url = https://patentscope.wipo.int/search/en/detail.jsf?docId=WO2019086862
}}
* {{cite patent
| country = GB
| number = WO/2019/086863
| status = Pending
| title = Trigger for game events
| pubdate = 2019-05-09
| pridate = 2017-10-31
| inventor = Bruno Zamborlin
| invent1 = Parag Mital
| invent2 = Conor Barry
| invent3 = Alessandro Saccoia
| invent4 = Baptiste Caramiaux
| assign1 = Mogees Limited
| class = A63F 13/215
| url = https://patentscope.wipo.int/search/en/detail.jsf?docId=WO2019086863
}}
* {{cite journal
| last1 = Bevilacqua
| first1 = Frédéric
| last2 = Zamborlin
| first2 = Bruno
| last3 = Sypniewski
| first3 = Anthony
| last4 = Schnell
| first4 = Norbert
| last5 = Guédy
| first5 = Fabrice
| last6 = Rasamimanana
| first6 = Nicolas
| date = 2010
| title = Continuous Realtime Gesture Following and Recognition
| url = https://link.springer.com/chapter/10.1007/978-3-642-12553-9_7
| journal = Lecture Notes in Computer Science
| volume = 5934
| issue =
| pages = 73–84
| doi = 10.1007/978-3-642-12553-9_7
| access-date = 17 January 2021
}}
* {{cite journal
| last1 = Rasamimanana
| first1 = Nicolas
| last2 = Bevilacqua
| first2 = Frédéric
| last3 = Schnell
| first3 = Norbert
| last4 = Guédy
| first4 = Fabrice
| last5 = Flety
| first5 = Emmanuel
| last6 = Maestracci
| first6 = Come
| last7 = Zamborlin
| first7 = Bruno
| date = January 2010
| title = Modular musical objects towards embodied control of digital music
| url = https://dl.acm.org/doi/abs/10.1145/1935701.1935704
| journal = TEI '11: Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction
| pages = 9–12
| doi = 10.1145/1935701.1935704
| access-date = 17 January 2021
}}
* {{cite journal
| last1 = Bevilacqua
| first1 = Frédéric
| last2 = Schnell
| first2 = Norbert
| last3 = Rasamimanana
| first3 = Nicolas
| last4 = Zamborlin
| first4 = Bruno
| last5 = Guedy
| first5 = Fabrice
| date = 2011
| title = Online Gesture Analysis and Control of Audio Processing
| url = https://link.springer.com/chapter/10.1007/978-3-642-22291-7_8
| journal = Springer Tracts in Advanced Robotics
| volume = 74
| pages = 127–142
| doi = 10.1007/978-3-642-22291-7_8
| access-date = 17 January 2021
}}
* {{cite journal
| last1 = Zamborlin
| first1 = Bruno
| last2 = Bevilacqua
| first2 = Frédéric
| last3 = Gillies
| first3 = Marco
| last4 = D'Inverno
| first4 = Mark
| date = 2014-01-15
| title = Fluid gesture interaction design: Applications of continuous recognition for the design of modern gestural interfaces
| url = https://dl.acm.org/doi/abs/10.1145/2543921
| journal = ACM Transactions on Interactive Intelligent Systems
| pages = 22
| doi = 10.1145/2543921
| access-date = 17 January 2021
}}
* {{cite journal
| last1 = Leslie
| first1 = Grace
| last2 = Zamborlin
| first2 = Bruno
| last3 = Schnell
| first3 = Norbert
| last4 = Jodlowski
| first4 = Pierre
| date = 2010-06-15
| title = A Collaborative, Interactive Sound Installation
| url = https://www.researchgate.net/publication/261215607_Grainstick_A_Collaborative_Interactive_Sound_Installation
| journal = Proceedings of the International Computer Music Conference (ICMC)
| access-date = 17 January 2021
}}
* {{cite journal
| last1 = Kimura
| first1 = Mari
| last2 = Rasamimanana
| first2 = Nicolas
| last3 = Bevilacqua
| first3 = Frédéric
| last4 = Zamborlin
| first4 = Bruno
| last5 = Schnell
| first5 = Norbert
| last6 = Flety
| first6 = Emmanuel
| date = 2012
| title = Extracting Human Expression For Interactive Composition with the Augmented Violin
| url = https://hal.archives-ouvertes.fr/hal-01161009/
| journal = International Conference on New Interfaces for Musical Expression (NIME)
| access-date = 17 January 2021
}}
* {{cite journal
| last1 = Ferretti
| first1 = Stefano
| last2 = Roccetti
| first2 = Marco
| last3 = Zamborlin
| first3 = Bruno
| date = 2009-01-13
| title = On SPAWC: Discussion on a Musical Signal Parser and Well-Formed Composer
| url = https://ieeexplore.ieee.org/abstract/document/4784966
| journal = 6th IEEE Consumer Communications and Networking Conference
| doi = 10.1109/CCNC.2009.4784966
| access-date = 17 January 2021
}}
* {{cite journal
| last1 = Zamborlin
| first1 = Bruno
| last2 = Partesana
| first2 = Giorgio
| last3 = Liuni
| first3 = Marco
| date = 2011-05-15
| title = (LAND)MOVES
| url = https://hal.archives-ouvertes.fr/hal-01161298/document
| journal = Conference on New Interfaces for Musical Expression, NIME
| pages = 537–538
| access-date = 17 January 2021
}}
 
==References==
{{reflist}}