PyTorch Autograd Wrappers#
Simple Neighborhood Aggregator (SAGEConv)#
| Operator | Description |
| --- | --- |
| `agg_concat_n2n` | PyTorch autograd function for simple aggregation of node features in a node-to-node reduction (n2n), concatenating the original output-node features at the end (agg_concat). |
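
As a rough mental model of what this fused operator computes, the following is a minimal pure-PyTorch sketch on a CSC-style graph (offsets/indices per destination node): aggregate the in-neighbor features, then concatenate each node's own features. The function name, the CSC layout, and the mean/sum choice are illustrative assumptions, not the library's call signature.

```python
import torch

def agg_concat_reference(feat, offsets, indices, op="mean"):
    """Hypothetical reference: aggregate in-neighbor features per destination
    node (CSC layout), then concatenate the node's own features at the end."""
    num_dst = offsets.numel() - 1
    agg = torch.zeros(num_dst, feat.size(1), dtype=feat.dtype)
    for dst in range(num_dst):
        nbrs = indices[offsets[dst]:offsets[dst + 1]]  # in-neighbors of dst
        if nbrs.numel() > 0:
            agg[dst] = feat[nbrs].mean(dim=0) if op == "mean" else feat[nbrs].sum(dim=0)
    # concatenate the original destination-node features to the aggregation result
    return torch.cat([agg, feat[:num_dst]], dim=1)

# toy graph: 3 nodes, edges 1->0, 2->0, 0->1 stored per destination node
offsets = torch.tensor([0, 2, 3, 3])
indices = torch.tensor([1, 2, 0])
feat = torch.randn(3, 4)
out = agg_concat_reference(feat, offsets, indices)  # shape (3, 8)
```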
Graph Attention (GATConv/GATv2Conv)#
| Operator | Description |
| --- | --- |
| `mha_gat_n2n` | PyTorch autograd function for a multi-head attention layer (GAT-like) without using cudnn (mha_gat), in a node-to-node reduction (n2n). |
| `mha_gat_v2_n2n` | PyTorch autograd function for a multi-head attention layer (GATv2-like) without using cudnn (mha_gat_v2), with an activation prior to the dot product but none afterwards, in a node-to-node reduction (n2n). |
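
To illustrate the distinction between the two entries above, here is a single-head, pure-PyTorch sketch of the attention pattern on a CSC graph: the `v2` flag only moves the LeakyReLU before the dot product with the attention vector, mirroring the mha_gat vs. mha_gat_v2 difference. The single-head simplification, the function name, and the argument layout are assumptions for illustration, not the fused multi-head API.

```python
import torch
import torch.nn.functional as F

def gat_attention_reference(feat, attn, offsets, indices, v2=False, slope=0.2):
    """Single-head GAT-style n2n attention over a CSC graph (illustrative sketch).

    feat : (N, F)  already linearly transformed node features
    attn : (2F,)   attention vector applied to [h_src || h_dst]
    """
    num_nodes = offsets.numel() - 1
    out = torch.zeros_like(feat)
    for dst in range(num_nodes):
        nbrs = indices[offsets[dst]:offsets[dst + 1]]   # in-neighbors of dst
        if nbrs.numel() == 0:
            continue
        h_src = feat[nbrs]                              # (deg, F)
        h_dst = feat[dst].expand_as(h_src)              # (deg, F)
        pair = torch.cat([h_src, h_dst], dim=1)         # (deg, 2F)
        if v2:
            # GATv2-like: activation before the dot product, none afterwards
            scores = F.leaky_relu(pair, slope) @ attn
        else:
            # GAT-like: dot product first, activation afterwards
            scores = F.leaky_relu(pair @ attn, slope)
        alpha = torch.softmax(scores, dim=0)            # normalize over incoming edges
        out[dst] = alpha @ h_src                        # attention-weighted neighbor sum
    return out
```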
Heterogeneous Aggregator using Basis Decomposition (RGCNConv)#
| Operator | Description |
| --- | --- |
| `agg_hg_basis_n2n_post` | PyTorch autograd function for node-to-node (n2n), RGCN-like, basis-regularized aggregation, with features being transformed after (post) the aggregation. |
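
The pattern behind this operator can be sketched in plain PyTorch as: aggregate per relation type, then transform with a per-relation weight assembled as a linear combination of shared basis matrices (the "post" variant transforms after aggregation). The edge-list layout, the function name, and the omission of degree normalization are illustrative assumptions, not the fused implementation.

```python
import torch

def rgcn_basis_reference(feat, src, dst, etype, coeff, bases):
    """RGCN-style aggregation with basis decomposition, transform applied
    after the aggregation ("post"). Illustrative pure-PyTorch sketch.

    feat  : (N, F_in)          node features
    src, dst, etype : (E,)     edge list with a relation type per edge
    coeff : (R, B)             per-relation coefficients over the shared bases
    bases : (B, F_in, F_out)   shared basis weight matrices
    """
    num_nodes, num_rel = feat.size(0), coeff.size(0)
    out = feat.new_zeros(num_nodes, bases.size(2))
    for r in range(num_rel):
        mask = etype == r
        # sum incoming source features per destination node for this relation
        agg = feat.new_zeros(num_nodes, feat.size(1))
        agg.index_add_(0, dst[mask], feat[src[mask]])
        # per-relation weight from the shared bases: W_r = sum_b coeff[r, b] * V_b
        w_r = torch.einsum("b,bio->io", coeff[r], bases)
        out += agg @ w_r  # transform *after* the aggregation ("post")
    return out
```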
Update Edges: Concatenation or Sum of Edge and Node Features#
| Operator | Description |
| --- | --- |
| `update_efeat_bipartite_e2e` | PyTorch autograd function for creating new edge features (update_efeat) by concatenating or summing each edge's features with the features of its source and destination nodes, in an edge-to-edge fashion (e2e). |
| `update_efeat_static_e2e` | PyTorch autograd function for creating new edge features (update_efeat) by concatenating or summing each edge's features with the features of its source and destination nodes, in an edge-to-edge fashion (e2e). |
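
A minimal pure-PyTorch sketch of the edge update both entries describe, assuming an edge list with per-edge source and destination indices; the function name and the `mode` argument are illustrative, not the library signature.

```python
import torch

def update_efeat_reference(nfeat, efeat, src, dst, mode="concat"):
    """Build new edge features from each edge's current features plus the
    features of its source and destination nodes (e2e). Illustrative sketch.

    mode="concat": e'_ij = [h_src || h_dst || e_ij]
    mode="sum"   : e'_ij = h_src + h_dst + e_ij   (requires matching widths)
    """
    h_src, h_dst = nfeat[src], nfeat[dst]  # gather endpoint features per edge
    if mode == "concat":
        return torch.cat([h_src, h_dst, efeat], dim=1)
    return h_src + h_dst + efeat

# toy usage: 4 nodes with 8-dim features, 3 edges with 8-dim edge features
nfeat = torch.randn(4, 8)
efeat = torch.randn(3, 8)
src = torch.tensor([0, 1, 2])
dst = torch.tensor([1, 2, 3])
new_efeat = update_efeat_reference(nfeat, efeat, src, dst, mode="concat")  # (3, 24)
```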