Graph kernel
==Applications==
 
The marginalized graph kernel has been shown to allow accurate predictions of the atomization energy of small organic molecules.<ref>{{cite journal
|title=Prediction of atomization energy using graph kernel and active learning
|author1=Yu-Hang Tang
|arxiv=1810.07310
|bibcode=2019JChPh.150d4107T
}}</ref>
 
==Example kernels==
 
An example of a kernel between graphs is the '''random walk kernel''',<ref name="Gaertner"/><ref name="Kashima"/> which conceptually performs [[random walk]]s on two graphs simultaneously, then counts the number of [[Path (graph theory)|path]]s that were produced by ''both'' walks. This is equivalent to performing random walks on the [[Tensor product of graphs|direct product]] of the pair of graphs, from which a kernel can be derived that can be computed efficiently.<ref name="Vishwanathan"/>
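The direct-product construction above can be sketched for unlabeled graphs as follows. This is a minimal illustration, not the implementation from the cited works: the function name and the decay parameter <code>lam</code> are assumptions, and <code>lam</code> must be smaller than the reciprocal of the spectral radius of the product graph's adjacency matrix for the geometric series to converge. Labeled graphs would additionally restrict the product to label-matching vertex pairs.

```python
import numpy as np

def random_walk_kernel(A1, A2, lam=0.1):
    """Geometric random-walk kernel of two unlabeled graphs.

    A1, A2 : adjacency matrices of the two graphs.
    lam    : decay weight for walk length (assumed parameter; needs
             lam < 1 / spectral_radius(W) for convergence).
    """
    # Adjacency matrix of the direct (tensor) product graph: a walk in W
    # corresponds to a pair of simultaneous walks, one in each graph.
    W = np.kron(A1, A2)
    n = W.shape[0]
    ones = np.ones(n)
    # Closed form of sum_k lam^k * 1^T W^k 1 = 1^T (I - lam*W)^{-1} 1,
    # i.e. the discounted count of common walks of every length.
    return ones @ np.linalg.solve(np.eye(n) - lam * W, ones)
```

Because the kernel value depends only on the product graph up to vertex reordering, swapping the two input graphs yields the same result.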
 
Another example is the '''Weisfeiler-Leman graph kernel''',<ref>Shervashidze, Nino, et al. "Weisfeiler-Lehman graph kernels." Journal of Machine Learning Research 12.9 (2011).</ref> which runs multiple rounds of the Weisfeiler-Leman algorithm and then computes the similarity of two graphs as the inner product of their histogram vectors. In these histogram vectors, the kernel collects the number of times each color occurs in the graph at every iteration. For two isomorphic graphs, the kernel returns a maximal similarity, since the two feature vectors are identical.
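The histogram construction can be sketched as follows. This is a simplified illustration rather than the reference implementation: function names and the default iteration count are assumptions, graphs are given as adjacency lists with initial node labels, and colors are represented directly as nested signatures instead of being compressed to integers as in the published algorithm (the resulting counts are the same).

```python
from collections import Counter

def wl_histograms(adj, labels, iterations=3):
    """Weisfeiler-Leman color refinement; returns the combined
    color histogram over the initial labels and all iterations."""
    colors = list(labels)
    hist = Counter(colors)
    for _ in range(iterations):
        # A node's new color is its current color together with the
        # sorted multiset of its neighbors' colors.
        colors = [(c, tuple(sorted(colors[v] for v in adj[u])))
                  for u, c in enumerate(colors)]
        hist.update(colors)
    return hist

def wl_kernel(adj1, labels1, adj2, labels2, iterations=3):
    """WL subtree kernel: inner product of the two color histograms."""
    h1 = wl_histograms(adj1, labels1, iterations)
    h2 = wl_histograms(adj2, labels2, iterations)
    return sum(h1[c] * h2[c] for c in h1.keys() & h2.keys())
```

For two isomorphic graphs the color histograms coincide, so the kernel value equals the self-similarity of either graph, matching the maximal-similarity property described above.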