JSAI2022

Presentation information

Organized Session

[1G4-OS-22a] Simulation and AI (1/2)

Tue. Jun 14, 2022, 2:20 PM - 3:40 PM, Room G

Organizers: Takashi Washio (Osaka University) [on-site], Keisuke Yamazaki (National Institute of Advanced Industrial Science and Technology), Satoshi Yamada (BIRD INITIATIVE), Satoshi Morinaga (NEC), Hiromichi Nagao (The University of Tokyo), Ryo Yoshida (The Institute of Statistical Mathematics)

3:00 PM - 3:20 PM

[1G4-OS-22a-03] A Proposal of Multi-Layer Perceptron with Graph Gating Unit for Graph Representation Learning and its Application to Surrogate Model for FEM

〇Yu Nakai (1), Hiroshi Okuda (1) (1. The University of Tokyo)

Keywords:GNN, FEM, gMLP, Surrogate Model, Over-Smoothing

GNNs are neural networks for representation learning on graph-structured data, and most of them are built by stacking graph convolutional layers. Since stacking n such layers is equivalent to propagating information from n-hop neighbor nodes, GNNs require a sufficiently large number of layers to learn large graphs. However, deep stacking tends to degrade model performance due to a problem known as over-smoothing. In this paper, we present a novel GNN model, constructed by stacking feedforward neural networks with gating structures that use GCNs, to address the over-smoothing problem and thereby overcome the difficulty GNNs face in learning large graphs. Experimental results showed that the proposed method monotonically improved prediction accuracy up to 20 layers without over-smoothing, whereas the conventional method suffered from it at 4 to 8 layers. In two experiments on large graphs, the PPI dataset (a benchmark for inductive node classification) and an application as a surrogate model for the finite element method, the proposed method achieved the highest accuracy among the compared existing methods, including a state-of-the-art accuracy of 99.71% on the PPI dataset.
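For illustration only, the following is a minimal PyTorch sketch of the kind of block the abstract describes: a gMLP-style feedforward layer whose gating unit mixes node features with a one-hop graph convolution. The class names (GraphGatingUnit, GMLPGraphBlock), the dense symmetrically normalized adjacency, and the layer sizes are assumptions made for this sketch, not the authors' implementation.

```python
# Hypothetical sketch: gMLP-style block with a graph-convolutional gating unit.
# Not the authors' code; shapes, names, and the dense adjacency are assumptions.
import torch
import torch.nn as nn


def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize A + I, as in a standard GCN layer."""
    a_hat = adj + torch.eye(adj.size(0), device=adj.device)
    deg = a_hat.sum(dim=1)
    d_inv_sqrt = torch.diag(deg.pow(-0.5))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt


class GraphGatingUnit(nn.Module):
    """Split hidden features in two; gate one half with a GCN of the other."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        half = hidden_dim // 2
        self.norm = nn.LayerNorm(half)
        self.gcn_weight = nn.Linear(half, half)  # GCN feature transform

    def forward(self, x: torch.Tensor, a_norm: torch.Tensor) -> torch.Tensor:
        u, v = x.chunk(2, dim=-1)
        v = a_norm @ self.gcn_weight(self.norm(v))  # one-hop graph convolution
        return u * v  # element-wise gating


class GMLPGraphBlock(nn.Module):
    """Feedforward block with the gating unit; stacked to build a deep model."""
    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        self.norm = nn.LayerNorm(in_dim)
        self.proj_in = nn.Linear(in_dim, hidden_dim)
        self.act = nn.GELU()
        self.ggu = GraphGatingUnit(hidden_dim)
        self.proj_out = nn.Linear(hidden_dim // 2, in_dim)

    def forward(self, x: torch.Tensor, a_norm: torch.Tensor) -> torch.Tensor:
        shortcut = x
        x = self.act(self.proj_in(self.norm(x)))
        x = self.ggu(x, a_norm)
        return shortcut + self.proj_out(x)  # residual connection


# Usage: 5 nodes with 16 features, a toy symmetric adjacency, 4 stacked blocks.
if __name__ == "__main__":
    n, d = 5, 16
    adj = (torch.rand(n, n) > 0.5).float()
    adj = ((adj + adj.t()) > 0).float()
    a_norm = normalized_adjacency(adj)
    x = torch.randn(n, d)
    blocks = nn.ModuleList(GMLPGraphBlock(d, 64) for _ in range(4))
    for blk in blocks:
        x = blk(x, a_norm)
    print(x.shape)  # torch.Size([5, 16])
```

The residual connection around each block, rather than around each graph convolution alone, is one plausible reason such an architecture could be stacked deeper before over-smoothing sets in; the abstract does not specify these details, so treat this as a sketch.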
