Node2vec

'''node2vec''' is an algorithm to generate vector representations of nodes on a [[Graph theory|graph]]. The ''node2vec'' framework learns low-dimensional representations for nodes in a graph by simulating [[random walk]]s that start at a target node. These representations are useful for a variety of [[machine learning]] applications: besides reducing feature-engineering effort, the representations learned by the algorithm lead to greater predictive power.<ref>{{Cite web|url=https://snap.stanford.edu/node2vec/|title=node2vec: Scalable Feature Learning for Networks}}</ref> ''node2vec'' follows the intuition that random walks through a graph can be treated like sentences in a corpus: each node is treated as an individual word, and each random walk as a sentence. By feeding these "sentences" into a [[N-gram|skip-gram]] or [[Bag-of-words model|continuous bag of words]] model, techniques originally developed for learning word embeddings from text can be applied to graphs. The algorithm generalizes prior work based on rigid notions of network neighborhoods; its authors argue that the added flexibility in exploring neighborhoods is the key to learning richer representations of nodes in graphs.<ref>{{cite journal|last1=Grover|first1=Aditya|last2=Leskovec|first2=Jure|year=2016|title=node2vec: Scalable Feature Learning for Networks|journal=KDD: Proceedings. International Conference on Knowledge Discovery & Data Mining|volume=2016|pages=855–864|doi=10.1145/2939672.2939754|pmid=27853626|pmc=5108654|arxiv=1607.00653|bibcode=2016arXiv160700653G}}</ref>
Comparative evaluations of unsupervised network representation learning have ranked node2vec among the strongest-performing methods for node classification.<ref>{{cite journal|arxiv=1903.07902|doi=10.1109/tkde.2019.2951398|title=A Comparative Study for Unsupervised Network Representation Learning|year=2020|last1=Khosla|first1=Megha|last2=Setty|first2=Vinay|last3=Anand|first3=Avishek|journal=IEEE Transactions on Knowledge and Data Engineering|page=1|s2cid=207870054}}</ref>
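
The walk-then-embed idea above can be illustrated with a minimal sketch (not the reference implementation). It assumes the NetworkX and Gensim libraries, and the helper names <code>biased_walk</code> and <code>node2vec_embeddings</code> are hypothetical: each walk is biased by a return parameter ''p'' and an in-out parameter ''q'', and the resulting node sequences are passed as "sentences" to a skip-gram word2vec model.

<syntaxhighlight lang="python">
# Illustrative sketch of node2vec-style embedding: biased random walks on a
# NetworkX graph, with the walks fed to a gensim skip-gram word2vec model.
# Helper names and default parameter values are illustrative, not canonical.
import random
import networkx as nx
from gensim.models import Word2Vec

def biased_walk(G, start, walk_length, p=1.0, q=1.0):
    """Second-order random walk: p controls returning to the previous node,
    q controls moving away from it."""
    walk = [start]
    while len(walk) < walk_length:
        cur = walk[-1]
        neighbors = list(G.neighbors(cur))
        if not neighbors:
            break
        if len(walk) == 1:
            # First step has no "previous" node, so sample uniformly.
            walk.append(random.choice(neighbors))
            continue
        prev = walk[-2]
        weights = []
        for nxt in neighbors:
            if nxt == prev:                # return to the previous node
                weights.append(1.0 / p)
            elif G.has_edge(nxt, prev):    # stay close to the previous node
                weights.append(1.0)
            else:                          # explore further away
                weights.append(1.0 / q)
        walk.append(random.choices(neighbors, weights=weights, k=1)[0])
    return walk

def node2vec_embeddings(G, dimensions=128, walk_length=80,
                        num_walks=10, p=1.0, q=1.0, window=10):
    # Treat each walk as a "sentence" of node identifiers.
    walks = [[str(n) for n in biased_walk(G, node, walk_length, p, q)]
             for _ in range(num_walks) for node in G.nodes()]
    # sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
    model = Word2Vec(walks, vector_size=dimensions, window=window,
                     min_count=0, sg=1, workers=4)
    return {n: model.wv[str(n)] for n in G.nodes()}

# Example usage on a small benchmark graph.
embeddings = node2vec_embeddings(nx.karate_club_graph(), dimensions=32)
print(len(embeddings), len(embeddings[0]))
</syntaxhighlight>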
 
==See also==
* [[Struc2vec]]
* [[Graph neural network]]