'''Graph neural networks''' ('''GNN''') are specialized [[artificial neural network]]s that are designed for tasks whose inputs are [[Graph (abstract data type)|graphs]].<ref name="wucuipeizhao2022" /><ref name="scarselli2009" /><ref name="micheli2009" /><ref name="sanchez2021" /><ref name="daigavane2021" />
One prominent application is molecular drug design.<ref>{{Cite journal |last1=Stokes |first1=Jonathan M. |last2=Yang |first2=Kevin |last3=Swanson |first3=Kyle |last4=Jin |first4=Wengong |last5=Cubillos-Ruiz |first5=Andres |last6=Donghia |first6=Nina M. |last7=MacNair |first7=Craig R. |last8=French |first8=Shawn |last9=Carfrae |first9=Lindsey A. |last10=Bloom-Ackermann |first10=Zohar |last11=Tran |first11=Victoria M. |last12=Chiappino-Pepe |first12=Anush |last13=Badran |first13=Ahmed H. |last14=Andrews |first14=Ian W. |last15=Chory |first15=Emma J. |date=2020-02-20 |title=A Deep Learning Approach to Antibiotic Discovery |journal=Cell |volume=180 |issue=4 |pages=688–702.e13 |doi=10.1016/j.cell.2020.01.021 |issn=1097-4172 |pmc=8349178 |pmid=32084340}}</ref>
The key design element of GNNs is the use of ''pairwise message passing'', in which graph nodes iteratively update their representations by exchanging information with their neighbors. Several GNN architectures have been proposed,<ref name="scarselli2009" /><ref name="micheli2009" /><ref name="kipf2016" /><ref name="hamilton2017" /><ref name="velickovic2018" /> implementing different flavors of message passing,<ref name="bronstein2021" /><ref name="hajij2022" /> beginning with recursive<ref name="scarselli2009" /> and convolutional constructive<ref name="micheli2009" /> approaches. {{As of|2022}}, it is an open question whether it is possible to define GNN architectures "going beyond" message passing, or whether every GNN can be built on message passing over suitably defined graphs.<ref name="velickovic2022" />
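The following is a minimal illustrative sketch of one round of pairwise message passing on a toy graph. The sum aggregation, the single linear update followed by a ReLU, and the hand-written adjacency list are arbitrary assumptions chosen for brevity, not a specific published architecture.

<syntaxhighlight lang="python">
# Minimal sketch of pairwise message passing on a toy graph (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph with 4 nodes, given as an adjacency list.
neighbors = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
features = {v: rng.normal(size=8) for v in neighbors}  # initial node features

W_self = rng.normal(size=(8, 8))  # weight applied to a node's own state
W_msg = rng.normal(size=(8, 8))   # weight applied to the aggregated messages

def message_passing_step(features):
    """One synchronous update: each node aggregates its neighbors' features."""
    updated = {}
    for v, nbrs in neighbors.items():
        aggregated = np.sum([features[u] for u in nbrs], axis=0)  # sum aggregation
        updated[v] = np.maximum(0.0, W_self @ features[v] + W_msg @ aggregated)  # ReLU update
    return updated

# Stacking several rounds lets information propagate along longer paths in the graph.
h = features
for _ in range(2):
    h = message_passing_step(h)
</syntaxhighlight>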
===Natural language processing===
{{See also|Natural language processing}}
Graph-based representations of text help to capture deeper semantic relationships between words. Many studies have used graph networks to improve performance in text processing tasks such as text classification, question answering, neural machine translation (NMT), event extraction, and fact verification.
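As a minimal sketch of how text can be cast as a graph for such models, the example below builds a word co-occurrence graph with a sliding window; the window size, the tokenization by whitespace, and the use of the networkx library are illustrative assumptions rather than a method prescribed by any particular study.

<syntaxhighlight lang="python">
# Sketch: turning a sentence into a word co-occurrence graph (illustrative only).
import networkx as nx

text = "graph neural networks capture relations between words in text"
tokens = text.split()

G = nx.Graph()
G.add_nodes_from(set(tokens))

window = 2  # connect words that co-occur within a small sliding window
for i, w in enumerate(tokens):
    for j in range(i + 1, min(i + 1 + window, len(tokens))):
        G.add_edge(w, tokens[j])

# Each node could then carry a word embedding as its feature vector,
# and a GNN would propagate information along the co-occurrence edges.
print(G.number_of_nodes(), G.number_of_edges())
</syntaxhighlight>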
==References==
{{Reflist}}