Graph neural network: Difference between revisions

Since their inception, several variants of the message passing neural network (MPNN) framework have been proposed.<ref>{{Cite journal|last1=Kipf|first1=Thomas N|last2=Welling|first2=Max|date=2016|title=Semi-supervised classification with graph convolutional networks|journal=International Conference on Learning Representations|volume=5|arxiv=1609.02907}}</ref><ref>{{Cite journal|last1=Defferrard|first1=Michaël|last2=Bresson|first2=Xavier|last3=Vandergheynst|first3=Pierre|date=2017-02-05|title=Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering|arxiv=1606.09375|journal=Neural Information Processing Systems|volume=30}}</ref><ref>{{Cite journal|last1=Hamilton|first1=William|last2=Ying|first2=Rex|last3=Leskovec|first3=Jure|date=2017|title=Inductive Representation Learning on Large Graphs|url=https://cs.stanford.edu/people/jure/pubs/graphsage-nips17.pdf|journal=Neural Information Processing Systems|volume=31|arxiv=1706.02216|via=Stanford}}</ref><ref>{{Cite journal|last1=Veličković|first1=Petar|last2=Cucurull|first2=Guillem|last3=Casanova|first3=Arantxa|last4=Romero|first4=Adriana|last5=Liò|first5=Pietro|last6=Bengio|first6=Yoshua|date=2018-02-04|title=Graph Attention Networks|arxiv=1710.10903|journal=International Conference on Learning Representations|volume=6}}</ref> These models optimize GNNs for use on larger graphs and apply them to domains such as [[social network]]s, [[Citation graph|citation networks]], and online communities.<ref>{{Cite web|title=Stanford Large Network Dataset Collection|url=https://snap.stanford.edu/data/|access-date=2021-07-05|website=snap.stanford.edu}}</ref> GNNs have also seen some success in [[NP-hard]] combinatorial problems, [[automated planning]], and [[path-planning]], owing to the inherent graph structure of the data.
<ref>{{cite journal |last1=Li |first1=Zhuwen |last2=Chen |first2=Qifeng |last3=Koltun |first3=Vladlen |title=Combinatorial optimization with graph convolutional networks and guided tree search |journal=Neural Information Processing Systems |date=2018 |volume=31 |pages=537–546 |url=https://arxiv.org/pdf/1810.10659}}</ref><ref>{{cite journal |last1=Ma |first1=Tengfei |last2=Ferber |first2=Patrick |last3=Huo |first3=Siyu |last4=Chen |first4=Jie |last5=Katz |first5=Michael |title=Online planner selection with graph neural networks and adaptive scheduling |journal=Proceedings of the AAAI Conference on Artificial Intelligence |date=2020 |volume=34 |pages=5077–5084 |url=https://ojs.aaai.org/index.php/AAAI/article/download/5949/5805}}</ref><ref>{{cite journal |last1=Osanlou |first1=Kevin |last2=Bursuc |first2=Andrei |last3=Guettier |first3=Christophe |last4=Cazenave |first4=Tristan |last5=Jacopin |first5=Eric |title=Optimal Solving of Constrained Path-Planning Problems with Graph Convolutional Networks and Optimized Tree Search |journal=2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) |date=2019 |pages=3519–3525 |doi=10.1109/IROS40897.2019.8968113 |url=https://arxiv.org/pdf/2108.01036.pdf}}</ref>
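At their core, MPNN-style architectures repeatedly let each node aggregate information from its neighbours and update its own state. The following is a minimal illustrative sketch of one such round in plain Python; the sum-then-average update rule and all names are assumptions chosen for exposition, not a specific published variant (real MPNNs use learned aggregation and update functions).

```python
# Minimal sketch of one message-passing round on a small graph.
# Illustrative only: the update rule here is a fixed element-wise
# average standing in for a learned neural update.

def message_passing_step(adjacency, features):
    """Return updated node features after one round of message passing.

    adjacency: dict mapping each node to a list of its neighbours.
    features:  dict mapping each node to a feature vector (list of floats).
    """
    updated = {}
    for node, neighbours in adjacency.items():
        # Aggregate: sum the feature vectors of all neighbours.
        agg = [0.0] * len(features[node])
        for n in neighbours:
            agg = [a + f for a, f in zip(agg, features[n])]
        # Update: combine the node's own features with the aggregate.
        updated[node] = [(s + a) / 2 for s, a in zip(features[node], agg)]
    return updated

# Example: a path graph 0 - 1 - 2; node 0 carries a "signal" of 1.0.
adjacency = {0: [1], 1: [0, 2], 2: [1]}
features = {0: [1.0], 1: [0.0], 2: [0.0]}
print(message_passing_step(adjacency, features))  # signal spreads to node 1
```

Stacking several such rounds lets information propagate over longer paths in the graph, which is the mechanism the variants above refine.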
 
The expressive power of message passing GNNs is bounded by the [[Boris Weisfeiler|Weisfeiler]]–Lehman graph isomorphism test,<ref>{{cite arXiv|last=Douglas|first=B. L.|date=2011-01-27|title=The Weisfeiler–Lehman Method and Graph Isomorphism Testing|class=math.CO|eprint=1101.5211}}</ref> so any such GNN model is at most as powerful as this test at distinguishing non-isomorphic graphs.<ref>{{Cite journal|last1=Xu|first1=Keyulu|last2=Hu|first2=Weihua|last3=Leskovec|first3=Jure|last4=Jegelka|first4=Stefanie|date=2019-02-22|title=How Powerful are Graph Neural Networks?|arxiv=1810.00826|journal=International Conference on Learning Representations|volume=7}}</ref> Researchers are attempting to unite GNNs with other "geometric deep learning models"<ref>{{Cite journal|last1=Bronstein|first1=Michael M.|last2=Bruna|first2=Joan|last3=LeCun|first3=Yann|last4=Szlam|first4=Arthur|last5=Vandergheynst|first5=Pierre|date=2017|title=Geometric Deep Learning: Going beyond Euclidean data|url=https://ieeexplore.ieee.org/document/7974879|journal=IEEE Signal Processing Magazine|volume=34|issue=4|pages=18–42|doi=10.1109/MSP.2017.2693418|issn=1053-5888|arxiv=1611.08097|bibcode=2017ISPM...34...18B|s2cid=15195762}}</ref> to better understand how and why these models work.
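The 1-dimensional Weisfeiler–Lehman test can be sketched as iterative colour refinement: every node repeatedly receives a new colour determined by its own colour and the multiset of its neighbours' colours. The simplified implementation below is illustrative (the function names and the fixed round count are assumptions; a full test iterates until the colouring stabilises).

```python
# Sketch of the 1-dimensional Weisfeiler-Lehman (colour refinement) test,
# which bounds the distinguishing power of message-passing GNNs.

def wl_colours(adjacency, rounds=3):
    """Iteratively refine node colours: each node's new colour is derived
    from its current colour and the sorted multiset of neighbour colours."""
    colours = {node: 0 for node in adjacency}  # start with a uniform colour
    for _ in range(rounds):
        signatures = {
            node: (colours[node], tuple(sorted(colours[n] for n in neighbours)))
            for node, neighbours in adjacency.items()
        }
        # Relabel: identical signatures receive identical new colours.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colours = {node: palette[signatures[node]] for node in adjacency}
    return colours

# Two graphs can be isomorphic only if their colour histograms match.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
hist = lambda g: sorted(wl_colours(g).values())
print(hist(triangle), hist(path))  # differing histograms: not isomorphic
```

The test distinguishes the triangle from the path here, but it (and hence any message passing GNN) fails on some non-isomorphic pairs, such as certain regular graphs.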
 
If no graph structure is known in advance, one can, for example, heuristically induce a k-[[nearest neighbor graph]] from the data.
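As a minimal sketch of this heuristic, the following connects each data point to its k nearest other points under Euclidean distance. The function name and tie-breaking order are illustrative assumptions; practical pipelines typically use spatial indexes rather than this brute-force search.

```python
# Sketch: heuristically inducing a k-nearest-neighbour graph from raw
# feature vectors when no graph structure is given (Euclidean distance).

def knn_graph(points, k):
    """Connect each point to the indices of its k nearest other points."""
    def dist2(a, b):
        # Squared Euclidean distance (monotone in distance, so safe to rank by).
        return sum((x - y) ** 2 for x, y in zip(a, b))
    edges = {}
    for i, p in enumerate(points):
        others = sorted(
            (j for j in range(len(points)) if j != i),
            key=lambda j: dist2(p, points[j]),
        )
        edges[i] = others[:k]
    return edges

# Three nearby points and one outlier; k=2 links the cluster tightly.
points = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]
print(knn_graph(points, k=2))
```

Note that the resulting edge relation is directed (nearest-neighbour relations need not be mutual); it is common to symmetrise it before running a GNN.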
 
GNNs can be understood as a generalization of [[convolutional neural network]]s (which are used on 2-dimensional black and white image data and 3-dimensional color image data) to graph-structured data.
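The analogy with convolution can be made concrete: where a CNN filter aggregates over a fixed pixel neighbourhood, a graph convolution aggregates over each node's graph neighbourhood. The sketch below uses plain mean aggregation with a self-loop; this normalisation and all names are simplifying assumptions, not a specific published layer.

```python
# Sketch of a graph convolution as a generalisation of a CNN filter:
# each node averages features over its graph neighbourhood, self included.

def graph_convolution(adjacency, features):
    """One mean-aggregation convolution over an arbitrary graph."""
    out = {}
    for node, neighbours in adjacency.items():
        hood = [node] + list(neighbours)  # neighbourhood including self
        out[node] = [
            sum(features[n][d] for n in hood) / len(hood)
            for d in range(len(features[node]))
        ]
    return out

# On a 1-D "grid graph" this reduces to an ordinary moving-average filter,
# recovering the regular-grid case that CNNs handle.
grid = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
signal = {i: [float(i)] for i in range(4)}
print(graph_convolution(grid, signal))
```

Because the aggregation is defined per-neighbourhood rather than per-grid-offset, the same operation applies unchanged to graphs of arbitrary shape.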
 
==References==