NS-CUK Seminar: V.T.Hoang, Review on "Gophormer: Ego-Graph Transformer for Node Classification", CoRR 2021
1. Van Thuy Hoang
Dept. of Artificial Intelligence,
The Catholic University of Korea
hoangvanthuy90@gmail.com
2. 2
Contributions
Propose Node2Seq, which converts graph data into ego-graph-based
sequential input for transformers
Propose a novel model, Gophormer, which uses Node2Seq to generate
sequential input data and encodes it with a proximity-enhanced
transformer
Extensive experiments are conducted on six benchmark datasets.
3. 3
Ego network
An ego network is a special type of network consisting of one central
node and all the other nodes directly connected to it.
The central node is known as the ego, while the surrounding nodes
directly connected to it are known as alters.
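To make the definition concrete, a minimal sketch of extracting a 1-hop ego network from an adjacency list (the toy graph and the helper name `ego_network` are illustrative, not from the paper):

```python
# Extract the 1-hop ego network of a node from an adjacency-list graph.
# The returned subgraph contains the ego, its alters, and all edges
# among those nodes (each undirected edge listed once, as (u, v) with u < v).

def ego_network(adj, ego):
    """Return (ego, alters, edges) for the subgraph induced by
    the ego and its directly connected neighbors (alters)."""
    alters = set(adj[ego])
    nodes = alters | {ego}
    edges = [(u, v) for u in nodes for v in adj[u]
             if v in nodes and u < v]
    return ego, alters, edges

# Hypothetical toy graph: 0-1, 0-2, 1-2, 1-3, 3-4.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1, 4], 4: [3]}
ego, alters, edges = ego_network(adj, 1)
print(alters)  # the alters of node 1
```

Note that node 4 is excluded: it is two hops from the ego, so it is not part of the 1-hop ego network.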
5. 5
Node2Seq
A sampling strategy focusing on high-order neighbors can be a
good choice for heterophilic graphs
6. 6
Proximity-Enhanced Transformer
The proximity-enhanced attention score augments standard scaled
dot-product attention with a proximity-encoding bias Φ:
Attn(Q, K) = softmax(QK^T / √d + Φ)
The proximity encoding Φ is calculated by 𝑀 structural encoding
functions, each capturing a different order of structural proximity
between the ego and its context nodes.
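A hedged sketch of how such a proximity bias could be assembled: standard dot-product logits plus a structural bias built from powers of the adjacency matrix, one "structural encoding function" per power m < M. The exact bias form and all names are illustrative assumptions, not the paper's code:

```python
import math

# Sketch of proximity-enhanced attention scores (illustrative, assumed):
# bias[i][j] = sum_m weights[m] * (A^(m+1))[i][j], then
# score = QK^T / sqrt(d) + bias.

def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def proximity_bias(adj, M, weights):
    """Accumulate M adjacency-power encodings into one bias matrix."""
    n = len(adj)
    power = [row[:] for row in adj]          # A^1
    bias = [[0.0] * n for _ in range(n)]
    for m in range(M):
        for i in range(n):
            for j in range(n):
                bias[i][j] += weights[m] * power[i][j]
        power = matmul(power, adj)           # advance to the next power
    return bias

def attention_scores(q, k, bias):
    """Scaled dot-product logits plus the proximity bias (pre-softmax)."""
    d = len(q[0])
    logits = matmul(q, [list(col) for col in zip(*k)])  # Q @ K^T
    n = len(logits)
    return [[logits[i][j] / math.sqrt(d) + bias[i][j]
             for j in range(n)] for i in range(n)]
```

For a 2-node graph with one edge, `proximity_bias` with M = 2 mixes the 1-hop adjacency and the 2-hop (identity-like) structure, so directly connected pairs and self-pairs receive different bias values.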
10. 10
CONCLUSION
Gophormer effectively incorporates structural information through the
Node2Seq module and the proximity-enhanced attention mechanism.
A consistency regularization loss and multi-sample inference are
designed to alleviate the negative impact of sampling.
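The multi-sample inference idea can be sketched as averaging predictions over several independently sampled ego-graph sequences for the same node. `model` and `sample_fn` below are stand-ins for any classifier and any Node2Seq-style sampler, not the paper's implementation:

```python
# Hedged sketch of multi-sample inference: run the model on S sampled
# sequences for one node and average the predicted class distributions,
# reducing the variance introduced by random sampling.

def multi_sample_predict(model, sample_fn, node, S):
    total = None
    for _ in range(S):
        probs = model(sample_fn(node))  # one prediction per sampled sequence
        total = probs if total is None else [a + b for a, b in zip(total, probs)]
    return [p / S for p in total]
```

The consistency regularization loss in training serves the same goal from the other side: it pushes predictions from different samples of the same node toward each other.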