Incorporating self-attentions into robust spatial-temporal graph representation learning against dynamic graph perturbations.
This paper proposes a Robust Spatial-Temporal Graph Neural Network (RSTGNN), which overcomes the limitations of graph-based models under dynamic graph perturbations by using robust spatial-temporal self-attention to learn dynamic graph embeddings. During training, a selective spatial self-attention mechanism aggregates neighboring information based on projected node similarity: it reduces the attention weights of low-similarity edges, improving information aggregation and preventing the model from discarding spatial-temporal information. The temporal self-attention layer reinforces temporal patterns through time-span-limited temporal attention weights. In addition, the model uses a spatial-temporal loss function that penalizes the nodes and edges most likely to be perturbed, alleviating the influence of dynamic graph perturbations: the spatial loss targets attention weights associated with high-degree, potentially attacked nodes, while the temporal loss targets attention weights of nodes with highly varying centrality, preventing nodes from experiencing excessive centrality changes. To verify the effectiveness of the approach, RSTGNN is evaluated against other graph-based models under varying node-based and edge-based perturbation rates. Results demonstrate that RSTGNN remains highly effective for dynamic node classification and link prediction on five real dynamic graph datasets.
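The abstract does not give the exact formulation of the selective spatial self-attention, but the idea of reducing attention weights on low-similarity edges can be sketched as follows. This is a minimal illustration, assuming scaled dot-product attention scores and a cosine-similarity gate; the function name, threshold, and penalty constant are illustrative, not taken from the paper.

```python
import numpy as np

def selective_spatial_attention(H, adj, W, sim_threshold=0.5):
    """Similarity-gated neighbor aggregation (illustrative sketch).

    H:   (N, d) node features
    adj: (N, N) adjacency matrix (include self-loops for stable rows)
    W:   (d, d) projection matrix
    Edges whose projected cosine similarity falls below sim_threshold
    receive a penalized attention score before softmax normalization.
    """
    Z = H @ W                                    # projected node features
    Zn = Z / (np.linalg.norm(Z, axis=1, keepdims=True) + 1e-8)
    S = Zn @ Zn.T                                # pairwise cosine similarity
    logits = (Z @ Z.T) / np.sqrt(Z.shape[1])     # scaled dot-product scores
    # down-weight edges with low projected similarity (penalty is illustrative)
    logits = np.where(S < sim_threshold, logits - 10.0, logits)
    logits = np.where(adj > 0, logits, -1e9)     # restrict attention to neighbors
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    A = exp / exp.sum(axis=1, keepdims=True)     # row-normalized attention weights
    return A, A @ Z                              # weights and aggregated features
```

With only self-loops in `adj`, each node attends solely to itself; as low-similarity edges are added, their attention weights stay near zero while high-similarity neighbors share the remaining mass.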
Copyright of Computing is the property of Springer Nature.