Graph neural network: Difference between revisions

 
==== Spatial approaches ====
Spatial approaches define the convolution operation directly on the graph, based on its topology (hence called '''spatial'''), which makes them more flexible than spectral approaches. Since the neighborhood size generally differs from node to node, the main challenge of such approaches is to design an efficient way of defining receptive fields and propagating features. Unlike spectral approaches, which depend heavily on the global graph structure, spatial approaches focus mostly on local relations between nodes and on edge properties; global properties can be recovered by applying pooling mechanisms between convolution layers.
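A minimal sketch of one spatial convolution step, assuming mean aggregation over each node's neighborhood followed by a shared linear transform (all function and variable names here are illustrative, not from any particular GNN library):

```python
import numpy as np

def spatial_conv_layer(features, adj, weight):
    """One spatial graph-convolution step: every node averages the
    features of its neighbors (plus itself, via a self-loop) and then
    applies a shared linear transform followed by a ReLU."""
    n = features.shape[0]
    adj_hat = adj + np.eye(n)               # add self-loops
    deg = adj_hat.sum(axis=1, keepdims=True)
    aggregated = adj_hat @ features / deg   # mean over the neighborhood
    return np.maximum(aggregated @ weight, 0.0)

# Toy graph: 3 nodes on a path 0-1-2, one-hot node features.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
x = np.eye(3)
w = np.ones((3, 2))          # shared weight matrix
out = spatial_conv_layer(x, adj, w)
print(out.shape)             # (3, 2)
```

Because each node only touches its own neighborhood, the layer handles nodes with different numbers of neighbors uniformly, which is exactly the flexibility spatial methods have over spectral ones.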
 
===== GAT<ref>{{Cite journal|last=Veličković|first=Petar|last2=Cucurull|first2=Guillem|last3=Casanova|first3=Arantxa|last4=Romero|first4=Adriana|last5=Liò|first5=Pietro|last6=Bengio|first6=Yoshua|date=2018-02-04|title=Graph Attention Networks|url=http://arxiv.org/abs/1710.10903|journal=International Conference on Learning Representations (ICLR), 2018}}</ref> and GaAN<ref>{{Cite journal|last=Zhang|first=Jiani|last2=Shi|first2=Xingjian|last3=Xie|first3=Junyuan|last4=Ma|first4=Hao|last5=King|first5=Irwin|last6=Yeung|first6=Dit-Yan|date=2018-03-20|title=GaAN: Gated Attention Networks for Learning on Large and Spatiotemporal Graphs|url=http://arxiv.org/abs/1803.07294|journal=arXiv:1803.07294 [cs]}}</ref> =====
[[File:Graph attention network.png|thumb|310x310px|An illustration of a multi-head-attention-based GNN (here, GAT) with k = 3 attention heads.]]
[[Attention (machine learning)|Attentional networks]] have already achieved great success in many areas of deep learning, especially in tasks involving sequential data. Since the node features of a graph can be represented as an unordered sequence, the graph attention network (GAT) and the gated attention network (GaAN) exploit the fact that a multi-head attention model can automatically learn the importance of each neighbor with respect to each head.
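The single-head attention coefficients used by GAT can be sketched as follows: the logit for edge (i, j) is LeakyReLU(a<sup>T</sup>[Wh<sub>i</sub> ‖ Wh<sub>j</sub>]), normalized by a softmax over the neighborhood. The code below is a hedged illustration for one node with hypothetical names, not an implementation from the cited papers:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_attention(h, neighbors, W, a):
    """Single-head GAT-style update for node 0: compute one attention
    logit per neighbor via LeakyReLU(a^T [W h_0 || W h_j]), normalize
    them with a softmax, and return the weighted sum of transformed
    neighbor features. Multi-head variants run this k times in
    parallel and concatenate or average the results."""
    Wh = h @ W                               # transform all node features
    logits = []
    for j in neighbors:
        z = np.concatenate([Wh[0], Wh[j]])   # [W h_0 || W h_j]
        s = z @ a
        logits.append(s if s > 0 else 0.2 * s)   # LeakyReLU
    alpha = softmax(np.array(logits))        # attention coefficients
    return sum(alpha[k] * Wh[j] for k, j in enumerate(neighbors))

rng = np.random.default_rng(0)
h = rng.normal(size=(4, 3))      # 4 nodes, 3 input features each
W = rng.normal(size=(3, 2))      # shared linear transform
a = rng.normal(size=(4,))        # attention vector over the concatenation
out = gat_attention(h, [0, 1, 2], W, a)
print(out.shape)                 # (2,)
```

The softmax normalization is what lets the model weight neighbors unequally while remaining well defined for neighborhoods of any size.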
 
== References ==