Brain–computer interface

{{Use dmy dates|date=December 2022}}
{{Short description|Direct communication pathway between an enhanced or wired brain and an external device}}
[[File:Photograph-by-mikeCaiChen.jpg|alt=Participant in a brain–computer interface study being connected to a computer|thumb|Participant in a brain–computer interface study being connected to a computer]]
[[File:BrainGate.jpg|thumb|Dummy unit illustrating the design of a [[BrainGate]] interface]]
 
 
====Donoghue, Schwartz, and Andersen====
[[File:164_Angell_Street.jpg|thumb|BCIs are a core focus of the [[Carney Institute for Brain Science]] at [[Brown University]]. ]]
Other laboratories that have developed BCIs and algorithms that decode neuron signals include [[John Donoghue (neuroscientist)|John Donoghue]] at the [[Carney Institute for Brain Science]] at [[Brown University]], Andrew Schwartz at the [[University of Pittsburgh]], and [[Richard A. Andersen (neuroscientist)|Richard Andersen]] at [[Caltech]]. These researchers produced working BCIs using recorded signals from far fewer neurons than Nicolelis (15–30 neurons versus 50–200 neurons).
 
 
BCI systems can potentially be used to encode signals from the periphery. These sensory BCI devices enable real-time, behaviorally-relevant decisions based upon closed-loop neural stimulation.<ref>{{cite journal | vauthors = Richardson AG, Ghenbot Y, Liu X, Hao H, Rinehart C, DeLuccia S, Torres Maldonado S, Boyek G, Zhang M, Aflatouni F, Van der Spiegel J, Lucas TH | display-authors = 6 | title = Learning active sensing strategies using a sensory brain-machine interface | journal = Proceedings of the National Academy of Sciences of the United States of America | volume = 116 | issue = 35 | pages = 17509–17514 | date = August 2019 | pmid = 31409713 | pmc = 6717311 | doi = 10.1073/pnas.1909953116 | bibcode = 2019PNAS..11617509R | doi-access = free }}</ref>
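The closed-loop principle described above can be illustrated with a deliberately simplified sketch: a peripheral sensor reading is encoded as a stimulation rate, and a behavioral decision follows from the encoded percept. The sensor values, gain, and threshold below are invented for illustration and do not come from the cited study.

```python
# Toy closed-loop sensory BCI: sensor reading -> stimulation pulse rate ->
# behavioral decision. All numbers here are illustrative assumptions.

def encode_stimulation(sensor_value, gain=20.0):
    """Map a normalized sensor reading in [0, 1] to a pulse rate in Hz."""
    clipped = max(0.0, min(1.0, sensor_value))
    return clipped * gain

def decide(sensor_value, threshold_hz=10.0):
    """Behavioral decision driven by the stimulation the subject receives:
    strong encoded input triggers 'approach', weak input keeps 'search'."""
    if encode_stimulation(sensor_value) >= threshold_hz:
        return "approach"
    return "search"

# Sweep three sensor readings through the loop.
actions = [decide(v) for v in (0.2, 0.5, 0.9)]
```

In a real system the decision would feed back into the animal's behavior, which in turn changes the next sensor reading; this sketch only shows a single pass through the loop.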
 
====The BCI Award====
The [[Annual BCI Research Award|BCI Research Award]] is awarded annually in recognition of innovative research. Each year, a renowned research laboratory is asked to judge projects. The jury consists of BCI experts recruited by that laboratory. The jury selects twelve nominees, then chooses a first, second, and third-place winner, who receive awards of $3,000, $2,000, and $1,000, respectively.{{cn|date=April 2025}}
 
==Human research==
====Communication====
In May 2021, a Stanford University team reported a successful proof-of-concept test that enabled a quadriplegic participant to produce English sentences at about 86 characters per minute and 18 words per minute. The participant imagined moving his hand to write letters, and the system performed handwriting recognition on electrical signals detected in the motor cortex, utilizing [[Hidden Markov models]] and [[recurrent neural networks]].<ref>{{cite journal | vauthors = Willett FR, Avansino DT, Hochberg LR, Henderson JM, Shenoy KV | title = High-performance brain-to-text communication via handwriting | journal = Nature | volume = 593 | issue = 7858 | pages = 249–254 | date = May 2021 | pmid = 33981047 | pmc = 8163299 | doi = 10.1038/s41586-021-03506-2 | bibcode = 2021Natur.593..249W }}</ref><ref>{{cite book | vauthors = Willett FR |title=Brain-Computer Interface Research: A State-of-the-Art Summary 10|chapter=A High-Performance Handwriting BCI|date=2021 |pages=105–109| veditors = Guger C, Allison BZ, Gunduz A |series=SpringerBriefs in Electrical and Computer Engineering|place=Cham|publisher=Springer International Publishing|language=en|doi=10.1007/978-3-030-79287-9_11|isbn=978-3-030-79287-9 |s2cid=239736609}}</ref>
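In simplified form, such a decoder maps each window of neural features to the nearest learned letter template. The templates, noise model, and nearest-template rule below are illustrative stand-ins for the trained HMM/RNN decoder, not the published system; the alphabet and feature dimensionality are invented.

```python
# Toy imagined-handwriting decoder: per-letter "neural templates" are
# learned (here: randomly generated), and each noisy trial is classified
# by nearest template. A stand-in for the real HMM/RNN pipeline.
import random

random.seed(0)
ALPHABET = "abc"   # invented mini-alphabet
DIM = 8            # invented feature dimensionality

# Hypothetical per-letter firing-rate templates.
templates = {ch: [random.gauss(0.0, 1.0) for _ in range(DIM)]
             for ch in ALPHABET}

def simulate_trial(letter, noise=0.3):
    """Noisy observation of the template for one imagined letter."""
    return [x + random.gauss(0.0, noise) for x in templates[letter]]

def classify(features):
    """Nearest-template decoder over squared Euclidean distance."""
    def dist(template):
        return sum((a - b) ** 2 for a, b in zip(features, template))
    return min(ALPHABET, key=lambda ch: dist(templates[ch]))

# Decode a short imagined "sentence" trial by trial.
decoded = "".join(classify(simulate_trial(ch)) for ch in "abcab")
```

The published system additionally exploited temporal structure (a language model over character sequences); this sketch classifies each trial independently.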
Researchers at [[University of California, San Francisco|UCSF]] have published a series of reports since initiating a brain-computer interface (BCI) study. In 2021, they reported that a paralyzed man with [[Dysarthria|anarthria]] was able to communicate fifteen words per minute using an implanted device that monitored nerve cells controlling the muscles of the vocal tract.<ref>{{Cite journal |last1=Moses |first1=David A. |last2=Metzger |first2=Sean L. |last3=Liu |first3=Jessie R. |last4=Anumanchipalli |first4=Gopala K. |last5=Makin |first5=Joseph G. |last6=Sun |first6=Pengfei F. |last7=Chartier |first7=Josh |last8=Dougherty |first8=Maximilian E. |last9=Liu |first9=Patricia M. |last10=Abrams |first10=Gary M. |last11=Tu-Chan |first11=Adelyn |last12=Ganguly |first12=Karunesh |last13=Chang |first13=Edward F. |date=2021-07-14 |title=Neuroprosthesis for Decoding Speech in a Paralyzed Person with Anarthria |journal=New England Journal of Medicine |volume=385 |issue=3 |pages=217–227 |doi=10.1056/NEJMoa2027540 |issn=0028-4793 |pmc=8972947 |pmid=34260835}}</ref><ref>{{Cite journal |last1=Maiseli |first1=Baraka |last2=Abdalla |first2=Abdi T. |last3=Massawe |first3=Libe V. |last4=Mbise |first4=Mercy |last5=Mkocha |first5=Khadija |last6=Nassor |first6=Nassor Ally |last7=Ismail |first7=Moses |last8=Michael |first8=James |last9=Kimambo |first9=Samwel |date=2023-08-04 |title=Brain–computer interface: trend, challenges, and threats |journal=Brain Informatics |volume=10 |issue=1 |pages=20 |doi=10.1186/s40708-023-00199-3 |doi-access=free |issn=2198-4026 |pmc=10403483 |pmid=37540385}}</ref> In 2022, they announced that the implant could also be used to spell out words and entire sentences without speaking aloud.
In 2024, the same UCSF team reported the first bilingual speech neuroprosthesis.<ref>{{Cite journal |last=Matsiko |first=Amos |date=2024-08-21 |title=Bilingual speech neuroprosthesis |url=https://www.science.org/doi/10.1126/scirobotics.ads4122 |journal=Science Robotics |volume=9 |issue=93 |pages=eads4122 |doi=10.1126/scirobotics.ads4122|url-access=subscription }}</ref><ref>{{Cite journal |last1=Silva |first1=Alexander B. |last2=Liu |first2=Jessie R. |last3=Metzger |first3=Sean L. |last4=Bhaya-Grossman |first4=Ilina |last5=Dougherty |first5=Maximilian E. |last6=Seaton |first6=Margaret P. |last7=Littlejohn |first7=Kaylo T. |last8=Tu-Chan |first8=Adelyn |last9=Ganguly |first9=Karunesh |last10=Moses |first10=David A. |last11=Chang |first11=Edward F. |date=August 2024 |title=A bilingual speech neuroprosthesis driven by cortical articulatory representations shared between languages |journal=Nature Biomedical Engineering |language=en |volume=8 |issue=8 |pages=977–991 |doi=10.1038/s41551-024-01207-5 |pmid=38769157 |pmc=11554235 |issn=2157-846X}}</ref><ref>{{Cite web |date=2024-05-28 |title=Bilingual AI brain implant helps stroke survivor communicate in Spanish and English |url=https://www.nbcnews.com/news/latino/bilingual-ai-brain-implant-spanish-english-stroke-patient-rcna154295 |access-date=2025-06-23 |website=NBC News |language=en}}</ref>
 
 
==== Electrocorticography ====
[[Electrocorticography]] (ECoG) measures brain electrical activity from beneath the skull in a way similar to non-invasive electroencephalography, using electrodes embedded in a thin plastic pad placed above the cortex, beneath the [[dura mater]].<ref>{{cite book | last1=Serruya | first1=Mijail | last2=Donoghue | first2=John | chapter = Chapter III: Design Principles of a Neuromotor Prosthetic Device | title = Neuroprosthetics: Theory and Practice | veditors = Horch KW, Dhillon GS | publisher = Imperial College Press | year=2004 |pages=1158–1196 | doi=10.1142/9789812561763_0040 | archive-url=https://web.archive.org/web/20050404155139/http://donoghue.neuro.brown.edu/pubs/2003-SerruyaDonoghue-Chap3-preprint.pdf | archive-date=4 April 2005 |chapter-url=https://donoghue.neuro.brown.edu/pubs/2003-SerruyaDonoghue-Chap3-preprint.pdf}}</ref> ECoG technologies were first trialled in humans in 2004 by Eric Leuthardt and Daniel Moran from [[Washington University in St. Louis]]. In a later trial, the researchers enabled a teenage boy to play [[Space Invaders]].<ref>{{cite web | url = http://news-info.wustl.edu/news/page/normal/7800.html | title = Teenager moves video icons just by imagination | work = Press release | publisher = Washington University in St Louis | date = 9 October 2006 }}</ref> This research indicates that ECoG-based control is rapid and requires minimal training, balancing signal fidelity against level of invasiveness.{{refn|group=note|These electrodes had not been implanted in the patient with the intention of developing a BCI. 
The patient had had severe [[epilepsy]] and the electrodes were temporarily implanted to help his physicians localize seizure foci; the BCI researchers simply took advantage of this.<ref>{{cite journal | vauthors = Schalk G, Miller KJ, Anderson NR, Wilson JA, Smyth MD, Ojemann JG, Moran DW, Wolpaw JR, Leuthardt EC | display-authors = 6 | title = Two-dimensional movement control using electrocorticographic signals in humans | journal = Journal of Neural Engineering | volume = 5 | issue = 1 | pages = 75–84 | date = March 2008 | pmid = 18310813 | pmc = 2744037 | doi = 10.1088/1741-2560/5/1/008 | bibcode = 2008JNEng...5...75S }}</ref>}}
 
Signals can be either subdural or epidural, but are not taken from within the brain [[parenchyma]]. Patients are required to have invasive monitoring for localization and resection of an epileptogenic focus.{{Citation needed|date=May 2024}}
 
In 2004, Thomas DeMarse at the [[University of Florida]] used a culture of 25,000 neurons taken from a rat's brain to fly an [[F-22]] fighter jet in an [[aircraft simulator]]. After collection, the cortical neurons were cultured in a [[petri dish]] and reconnected themselves to form a living neural network. The cells were arranged over a grid of 60 electrodes and used to control the [[Aircraft principal axes|pitch]] and [[Aircraft principal axes|yaw]] functions of the simulator. The study focused on understanding how the human brain performs and learns computational tasks at a cellular level.<ref>{{Cite news |url=http://www.cnn.com/2004/TECH/11/02/brain.dish/ |title='Brain' in a dish flies flight simulator |work=CNN |date=4 November 2004}}</ref>
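The control mapping in such a setup can be caricatured as pooling electrode activity into pitch and yaw commands. The grid split, baseline firing rate, and weighting below are invented for illustration and are not taken from DeMarse's apparatus.

```python
# Toy mapping from a 60-electrode grid of firing rates (Hz) to simulator
# controls: differential activity between grid halves steers yaw, and
# deviation from an assumed 10 Hz baseline drives pitch.

def grid_to_controls(rates, baseline_hz=10.0):
    """rates: 60 firing rates (Hz). Returns (pitch, yaw) commands."""
    assert len(rates) == 60, "expected a 60-electrode grid"
    left, right = rates[:30], rates[30:]
    # Right-minus-left mean activity -> yaw command.
    yaw = (sum(right) - sum(left)) / 30.0
    # Overall mean activity relative to baseline -> pitch command.
    pitch = sum(rates) / 60.0 - baseline_hz
    return pitch, yaw

# Uniform baseline activity should yield neutral controls.
pitch, yaw = grid_to_controls([10.0] * 60)
```

In the actual experiment the mapping was adaptive: the simulator's state was fed back to the culture as stimulation, and the network's responses changed over time.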
 
==Collaborative BCIs==
The idea of combining brain signals from multiple individuals was introduced at Humanity+ @Caltech in December 2010 by Adrian Stoica, who referred to the concept as multi-brain aggregation.<ref>{{Cite web|date=2017-10-05|title=David Pearce – Humanity Plus|url=https://activistjourneys.wordpress.com/david-pearce-humanity-plus/|access-date=2021-12-30|language=en}}</ref><ref>{{Cite web|vauthors=Stoica A|date=2010|title=Speculations on Robots, Cyborgs & Telepresence|website=[[YouTube]]|url=https://www.youtube.com/watch?v=nqByb7VEnZk|url-status=live|archive-url=https://web.archive.org/web/20211228222826/https://www.youtube.com/watch?v=nqByb7VEnZk|archive-date=28 December 2021|access-date=28 December 2021}}</ref><ref>{{Cite web|title=Experts to 'redefine the future' at Humanity+ @ CalTech |website=Kurzweil|url=https://www.kurzweilai.net/experts-to-redefine-the-future-at-humanity-caltech|access-date=2021-12-30|language=en-US}}</ref> A patent was applied for in 2012.<ref>{{Cite patent|number=WO2012100081A2|title=Aggregation of bio-signals from multiple individuals to achieve a collective outcome|gdate=2012-07-26|invent1=Stoica|inventor1-first=Adrian|url=https://patents.google.com/patent/WO2012100081A2/en}}</ref><ref>{{cite journal | vauthors = Wang Y, Jung TP | title = A collaborative brain-computer interface for improving human performance | journal = PLOS ONE | volume = 6 | issue = 5 | pages = e20422 | date = 2011-05-31 | pmid = 21655253 | pmc = 3105048 | doi = 10.1371/journal.pone.0020422 | bibcode = 2011PLoSO...620422W | doi-access = free }}</ref><ref>{{cite journal | vauthors = Eckstein MP, Das K, Pham BT, Peterson MF, Abbey CK, Sy JL, Giesbrecht B | title = Neural decoding of collective wisdom with multi-brain computing | journal = NeuroImage | volume = 59 | issue = 1 | pages = 94–108 | date = January 2012 | pmid = 21782959 | doi = 10.1016/j.neuroimage.2011.07.009 | s2cid = 14930969 }}</ref> Stoica's first paper on the topic appeared in 2012, after the publication of his patent application.<ref>{{Cite book| vauthors = Stoica A |title= 2012 Third International Conference on Emerging Security Technologies |chapter= MultiMind: Multi-Brain Signal Fusion to Exceed the Power of a Single Brain |date= September 2012 |pages=94–98 |doi=10.1109/EST.2012.47|isbn= 978-0-7695-4791-6 |s2cid= 6783719 }}</ref>
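A minimal fusion rule for multi-brain aggregation is to average per-subject decision scores and threshold the result. The scores and threshold below are synthetic; real collaborative BCIs use more sophisticated classifier fusion than a simple mean.

```python
# Toy collaborative BCI: each subject's decoder emits a confidence score
# in [0, 1] for a binary decision; the group decision averages the scores.

def aggregate(scores_per_subject, threshold=0.5):
    """Simple fusion rule: mean score across subjects vs. a threshold."""
    if not scores_per_subject:
        raise ValueError("need at least one subject")
    mean_score = sum(scores_per_subject) / len(scores_per_subject)
    return mean_score >= threshold

# Two of three subjects lean toward "yes"; the collective agrees.
decision = aggregate([0.6, 0.4, 0.7])
```

Averaging tends to suppress uncorrelated single-subject noise, which is the intuition behind the "collective wisdom" results cited above.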
 
== Ethical considerations==
 
==Low-cost systems==
{{main|Consumer brain–computer interfaces}}
Various companies are developing inexpensive BCIs for research and entertainment. Toys based on [[NeuroSky]] technology, such as Mattel's MindFlex, have seen some commercial success.
* In 2006, [[Sony]] patented a neural interface system allowing radio waves to affect signals in the neural cortex.<ref name="Sony patent neural interface">{{cite news|url=http://www.wikipatents.com/US-Patent-6729337/method-and-system-for-generating-sensory-data-onto-the-human-neural |title=Sony patent neural interface |url-status=dead |archive-url=https://web.archive.org/web/20120407071853/http://www.wikipatents.com/US-Patent-6729337/method-and-system-for-generating-sensory-data-onto-the-human-neural |archive-date=7 April 2012 |df=dmy }}</ref>
{{Footer Neuropsychology}}
{{emerging technologies|topics=yes|neuro=yes|infocom=yes}}
{{Extended reality}}
{{Authority control}}{{DEFAULTSORT:Brain-computer interface}}
[[Category:Brain–computer interface| ]]
[[Category:Neural engineering|*]]
[[Category:User interface techniques]]
[[Category:Computing input devices]]