Graph neural networks have recently achieved remarkable success in representing graph-structured data, with rapid progress in both node embedding and graph pooling methods. Yet, they mostly focus on capturing information from the nodes considering their connectivity, and not much work has been done in representing the edges, which are essential components of a graph. However, for tasks such as graph reconstruction and generation, as well as graph classification tasks for which the edges are important for discrimination, accurately representing the edges of a given graph is crucial to the success of graph representation learning. To this end, we propose a novel edge representation learning framework based on Dual Hypergraph Transformation (DHT), which transforms the edges of a graph into the nodes of a hypergraph. This dual hypergraph construction allows us to apply message-passing techniques for node representations to edges. After obtaining edge representations from the hypergraphs, we then cluster or drop edges to obtain holistic graph-level edge representations. We validate our edge representation learning method with hypergraphs on diverse graph datasets for graph representation and generation performance, on which our method largely outperforms existing graph representation learning methods. Moreover, our edge representation learning and pooling method also largely outperforms state-of-the-art graph pooling methods on graph classification, not only because of its accurate edge representation learning, but also due to its lossless compression of the nodes and removal of irrelevant edges for effective message-passing. Code is available at https://github.com/harryjo97/EHGNN.
Edge Representation Learning with Hypergraphs
1. Edge Representation Learning with Hypergraphs
Jaehyeong Jo¹*, Jinheon Baek¹*, Seul Lee¹*,
Dongki Kim¹, Minki Kang¹, Sung Ju Hwang¹,²
(*: equal contribution)
¹KAIST, ²AITRICS, South Korea
2. Graphs: Nodes, Edges and Incidence
Graphs can be represented by the triplet G = (X, M, E):
node features X, incidence matrix M, and edge features E.
Figure: an example graph with nodes A-D and edges 1-4, together with its node feature matrix X, incidence matrix M, and edge feature matrix E.
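The triplet on this slide can be built concretely. Below is a minimal sketch for the toy 4-node, 4-edge example graph; the feature values are made-up placeholders, not data from the paper.

```python
import numpy as np

# Toy graph mirroring the slide: nodes A, B, C, D and 4 edges.
# The features here are arbitrary placeholders for illustration.
X = np.eye(4)                     # node features, one row per node
E = np.arange(8.0).reshape(4, 2)  # edge features, one row per edge

# Incidence matrix M: M[v, e] = 1 iff node v is an endpoint of edge e.
edges = [(0, 1), (0, 2), (1, 3), (2, 3)]  # (A-B, A-C, B-D, C-D)
M = np.zeros((4, 4), dtype=int)
for e, (u, v) in enumerate(edges):
    M[u, e] = 1
    M[v, e] = 1

print(M.sum(axis=0))  # [2 2 2 2]: every ordinary edge has 2 endpoints
```

Each column of M has exactly two ones, which is what makes the dual construction on slide 6 a hypergraph: after transposing, a "node" of the dual can belong to hyperedges of any size.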
3. Importance of Learning Edges
Figure: (a) Tylenol (beneficial), (b) NAPQI (toxic), (c) a Twitter friends network.
Previous works focus on accurately representing the nodes, largely
overlooking the edges, which are essential components of a graph.
http://allthingsgraphed.com/2014/11/02/twitter-friends-network/
4. Learning Edge Representation via Nodes
Figure: node-to-node message passing vs. edge-to-node message passing on an example graph.
Previous works have used edge features as auxiliary information to augment node
features, implicitly capturing the edge information in the node representations.
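To make this concrete, here is a minimal sketch of the prior approach: node-to-node message passing in which edge features merely augment the messages. The update rule (linear maps plus sum aggregation) is an illustrative assumption, not EHGNN itself.

```python
import numpy as np

def node_mp_with_edge_feats(X, M, E, W_n, W_e):
    """One round of node-to-node message passing where each message is
    augmented with the connecting edge's features (a common GNN pattern;
    the exact update rule here is an illustrative choice)."""
    n_nodes, n_edges = M.shape
    out = np.zeros((n_nodes, W_n.shape[1]))
    for e in range(n_edges):
        u, v = np.flatnonzero(M[:, e])        # the edge's two endpoints
        out[v] += X[u] @ W_n + E[e] @ W_e     # message u -> v
        out[u] += X[v] @ W_n + E[e] @ W_e     # message v -> u
    return out
```

Note that the edge information only survives as a term folded into the node states; the edges themselves get no representation of their own, which is the gap the next slides address.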
5. Edge HyperGraph Neural Network (EHGNN)
Figure: overview of EHGNN. The input graph is converted to its dual hypergraph
via DHT (edge-to-node, node-to-hyperedge), message passing is performed on the
dual hypergraph, and HyperCluster or HyperDrop then yields a global edge
representation or a pooled output graph.
We propose a novel edge representation learning scheme using
Dual Hypergraph Transformation, and two edge pooling methods, namely
HyperCluster and HyperDrop.
6. Dual Hypergraph Transformation
Given an input graph G = (X, M, E), its dual hypergraph is
G* = (X*, M*, E*), where
X* = E, M* = M^T, E* = X.
Figure: (a) edge-to-node and (b) node-to-hyperedge transformations construct the
dual hypergraph from the input graph; (c) hyperedge-to-node and (d) node-to-edge
transformations recover the original graph.
We represent edges as nodes in a hypergraph, which allows us to apply any
off-the-shelf message-passing schemes designed for node-level representation
learning.
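Since DHT only swaps the roles of the feature matrices and transposes the incidence matrix, it is a one-line, lossless operation; a sketch:

```python
import numpy as np

def dual_hypergraph_transform(X, M, E):
    """Dual Hypergraph Transformation (DHT): the edges of G become the
    nodes of the dual hypergraph G*, and the nodes of G become its
    hyperedges. Per the slide: X* = E, M* = M^T, E* = X."""
    return E, M.T, X
```

Because the transformation is an involution, applying it twice recovers the original graph exactly, so no structural or feature information is lost in either direction.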
7. Message-Passing on the Dual Hypergraph
The message-passing cost on the dual hypergraph equals the message-passing
cost on the original graph, O(|E|).
Figure: a graph and its dual hypergraph, related by DHT in both directions, with
message passing performed on the dual hypergraph.
We can perform message-passing between edges of a graph by performing
message-passing between nodes of its dual hypergraph.
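A hypergraph message-passing layer typically runs in two stages, node-to-hyperedge and hyperedge-to-node. The mean-aggregation scheme below is an illustrative assumption (any hypergraph GNN layer could be substituted), and it assumes the graph has no isolated nodes so the degree normalizations are well defined.

```python
import numpy as np

def hypergraph_mp(X_star, M_star, W):
    """One round of message passing on the dual hypergraph G*.
    X_star: (n_edges, d) dual-node features (original edge features).
    M_star: (n_edges, n_nodes) dual incidence matrix, i.e. M^T.
    Two mean-aggregation steps: dual node -> hyperedge -> dual node."""
    deg_e = M_star.sum(axis=0, keepdims=True)  # hyperedge degrees (1, n_nodes)
    H = (M_star.T @ X_star) / deg_e.T          # node -> hyperedge means
    deg_n = M_star.sum(axis=1, keepdims=True)  # dual-node degrees (n_edges, 1)
    out = (M_star @ H) / deg_n                 # hyperedge -> node means
    return out @ W
```

Each dual node (an original edge) has exactly two incident hyperedges, so on a sparse graph the per-layer cost is linear in |E|, matching the cost statement above.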
8. Edge Pooling: HyperCluster & HyperDrop
Figure: HyperCluster produces global edge representations; HyperDrop produces a pooled output graph.
We propose two novel graph pooling methods to obtain compact graph-level
edge representations, namely HyperCluster and HyperDrop.
Representing each edge well is, by itself, insufficient for obtaining an accurate
representation of the entire graph.
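The drop-style pooling can be sketched as follows: score each edge via its dual-node representation and keep only the top-scoring fraction. This is a simplified sketch of the HyperDrop idea; the linear scoring function `score_w` and top-k selection are illustrative assumptions about the exact formulation.

```python
import numpy as np

def hyperdrop(X_star, M_star, E_star, score_w, ratio=0.5):
    """Drop-style edge pooling sketch: score dual nodes (= edges) and
    keep the top `ratio` fraction. The hyperedges (= original nodes)
    and their features E_star are kept intact, so the nodes are
    compressed losslessly while irrelevant edges are removed."""
    scores = X_star @ score_w              # one score per edge
    k = max(1, int(ratio * len(scores)))
    keep = np.argsort(-scores)[:k]         # indices of top-k edges
    return X_star[keep], M_star[keep], E_star
```

Dropping low-scoring edges can split a large graph into connected components, which (as slide 14 shows) helps message passing focus on task-relevant substructures.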
9. Experiments
• Graph Reconstruction: reconstruct the node and edge features of a given graph
from their pooled representations.
• Graph Generation: generate a valid graph with desired properties.
• Graph Classification: predict the label of a given graph.
• Node Classification: predict the labels of the nodes of a given graph.
10. Graph Reconstruction
Figure: Graph reconstruction results on the ZINC molecule (left) and synthetic (right) datasets.
(a) Original, (b) MPNN + GMPool, (c) R-GCN + GMPool, (d) HyperCluster (Ours)
Accurately representing edges is crucial for graph reconstruction tasks. EHGNN with
HyperCluster yields far higher performance than the baselines.
11. Graph Reconstruction: Compression
Figure: Relative size of the representation after pooling, compared to the original graph.
We validate the effectiveness of HyperCluster in dense graph compression,
where our method obtains highly compact yet accurate representations.
12. Graph Generation
Figure: Graph generation results on MolGAN (left) and MARS (right).
The EHGNN framework obtains significantly improved generation performance
with both the MolGAN and MARS architectures.
13. Graph Classification
Figure: Graph classification results on test sets.
EHGNN with HyperDrop outperforms all the hierarchical pooling baselines, and
when paired with GMT, obtains the best performance on most of the datasets.
14. Graph Classification: Examples
Figure: HyperDrop results on the COLLAB dataset.
HyperDrop accurately identifies the task-relevant edges, dividing the large
graph into connected components for effective message passing.
15. Node Classification
Figure: Node classification results on the Cora (left) and Citeseer (right) datasets.
HyperDrop alleviates the over-smoothing problem of deep GNNs on
semi-supervised node classification tasks by identifying task-relevant edges.
16. Conclusion
• We introduce a novel edge representation learning scheme using Dual Hypergraph
Transformation, to which we can apply off-the-shelf message-passing schemes
designed for node-level representation learning.
• We propose novel edge pooling methods for graph-level representation learning,
to overcome the limitations of existing node-based pooling methods.
• We validate our methods on graph reconstruction, generation, and classification
tasks, on which we largely outperform existing graph representation learning
methods.