Friday, November 27, 2020

graph neural networks

 

"Graph neural network based methods such as GraphSAGE (Hamilton et al., 2017a) typically define a unique computational graph for each node, allowing it to perform efficient information aggregation for nodes with different degrees."
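The per-node computational graph in GraphSAGE comes from sampling a fixed number of neighbours and aggregating their features, so the cost is the same for high- and low-degree nodes. A minimal sketch of one mean-aggregation step (illustrative only — `sage_mean_aggregate` and the toy triangle graph are my own, not the authors' reference implementation):

```python
import numpy as np

def sage_mean_aggregate(h, adj_list, num_samples=5, rng=None):
    """One GraphSAGE-style aggregation step (hedged sketch).

    h           : (num_nodes, dim) node feature matrix
    adj_list    : dict mapping node id -> list of neighbour ids
    num_samples : fixed neighbour sample size, so every node's
                  computational graph costs the same regardless of degree
    """
    rng = rng or np.random.default_rng(0)
    out = np.zeros((h.shape[0], 2 * h.shape[1]))
    for v, neigh in adj_list.items():
        # sample with replacement so degree does not change the cost
        sampled = rng.choice(neigh, size=num_samples, replace=True)
        neigh_mean = h[sampled].mean(axis=0)
        # concatenate self features with aggregated neighbour features
        out[v] = np.concatenate([h[v], neigh_mean])
    return out

# toy graph: 3 nodes in a triangle, one-hot features
h = np.eye(3)
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
z = sage_mean_aggregate(h, adj, num_samples=2)
print(z.shape)  # (3, 6)
```

In the full method this step is stacked K times (one hop per layer) and followed by a learned weight matrix and nonlinearity.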


"Graph Attention Network (GAT) (Veličković et al., 2017) utilizes a self-attention mechanism in the information aggregation process. Motivated by these properties, we propose our method Hyper-SAGNN based on the self-attention mechanism within each tuple to learn the function f."
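The idea of self-attention within a tuple can be sketched as scaled dot-product attention over the node embeddings of one hyperedge. This is a hedged illustration: the projections `Wq`, `Wk`, `Wv` are random placeholders here, and Hyper-SAGNN's exact parameterisation (and GAT's additive attention) differ in detail.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def tuple_self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over the k nodes of one tuple.

    X          : (k, d) embeddings of the nodes in the tuple
    Wq, Wk, Wv : (d, d_h) projection matrices (illustrative)
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])  # pairwise attention logits
    alpha = softmax(scores, axis=-1)        # each row sums to 1
    return alpha @ V                        # attention-weighted aggregation

rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))  # a 3-node tuple with 4-dim features
Wq, Wk, Wv = (rng.normal(size=(4, 4)) for _ in range(3))
out = tuple_self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (3, 4)
```

Each node in the tuple thus gets a context-dependent ("dynamic") embedding that reflects the other members of the same tuple.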



https://www.kdnuggets.com/2019/08/neighbours-machine-learning-graphs.html


Graph Convolutional Network (GCN) [Kipf2016]



"The normalised adjacency matrix encodes the graph structure and upon multiplication with the design matrix effectively smooths a node’s feature vector based on those of its immediate neighbours in the graph. A’ is normalised such that each neighbouring node’s contribution is proportional to how connected that node is in the graph."

"The layer definition is completed by the application of an element-wise non-linear function, e.g., ReLu, to A’FW+b. The output matrix of this layer can be used as input to another GCN layer or any other type of neural network layer, allowing the creation of deep neural architectures able to learn a complex hierarchy of node features needed for the downstream node classification task."
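The propagation rule described in the two excerpts above, ReLU(A'FW + b) with symmetrically normalised adjacency Â = D^{-1/2}(A + I)D^{-1/2}, can be written out directly. A dense-matrix sketch (real libraries like StellarGraph use sparse operations; the toy path graph is my own example):

```python
import numpy as np

def gcn_layer(A, F, W, b):
    """One GCN layer: ReLU(A' F W + b), Kipf & Welling style (sketch).

    A : (n, n) binary adjacency matrix
    F : (n, d) node feature (design) matrix
    W : (d, d_out) weights, b : (d_out,) bias
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # each neighbour contributes
                                              # in proportion to its degree
    return np.maximum(A_norm @ F @ W + b, 0.0)  # element-wise ReLU

# toy example: path graph 0-1-2 with one-hot features
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
F = np.eye(3)
W = np.ones((3, 2))
b = np.zeros(2)
H = gcn_layer(A, F, W, b)
print(H.shape)  # (3, 2)
```

Feeding `H` into a second `gcn_layer` call gives the 2-layer model discussed next; each layer smooths features over one more hop of neighbours.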

"Training a 2-layer GCN model (done in this script using our open-source Python library StellarGraph) with 32 output units per layer on the Cora dataset with just 140 training node labels seen by the model results in a considerable boost in classification accuracy when compared to the baseline 2-layer MLP. Accuracy on predicting the subject of a hold-out test set of papers increases to approximately 81% — an improvement of 21% over the MLP that only uses the BoW node features and ignores citation relationships between the papers. This clearly demonstrates that at least for some datasets utilising relationship information in the data can significantly boost performance in a predictive task."





Reference: 
Kipf, T. N., & Welling, M. (2016). “Semi-supervised classification with graph convolutional networks,” arXiv preprint arXiv:1609.02907.

https://tkipf.github.io/graph-convolutional-networks/

https://github.com/stellargraph/stellargraph


