[Doc Fix] fix the format of gt doc #6949

Merged
merged 1 commit on Jan 15, 2024
1 change: 1 addition & 0 deletions docs/source/graphtransformer/data.rst
@@ -5,6 +5,7 @@ In this section, we will prepare the data for the Graphormer model introduced be


.. code:: python

    def collate(graphs):
        # compute shortest path features, can be done in advance
        for g in graphs:
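
For reference, a minimal runnable version of this collate function might look as follows; the use of ``dgl.shortest_dist`` with ``return_paths=True`` and the ``"spd"``/``"path"`` field names are assumptions of this sketch rather than part of the diff.

.. code:: python

    import dgl

    def collate(graphs):
        # compute shortest path features, can be done in advance
        for g in graphs:
            # dgl.shortest_dist is assumed to return the pairwise distances
            # and the edge ids along each shortest path
            spd, path = dgl.shortest_dist(g, root=None, return_paths=True)
            g.ndata["spd"] = spd
            g.ndata["path"] = path
        # batch the per-graph results; padding to the largest graph in the
        # batch is handled by the full example and omitted here
        return dgl.batch(graphs)
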
6 changes: 3 additions & 3 deletions docs/source/graphtransformer/index.rst
@@ -1,8 +1,8 @@
-🆕 Tutorial: GraphTransformer
+🆕 Tutorial: Graph Transformer
==========

-This tutorial introduces the **graphtransformer** module, which is a set of
-utility modules for building and training graph transformer models.
+This tutorial introduces the **graph transformer** (:mod:`~dgl.nn.gt`) module,
+which is a set of utility modules for building and training graph transformer models.

.. toctree::
:maxdepth: 2
7 changes: 6 additions & 1 deletion docs/source/graphtransformer/model.rst
@@ -12,6 +12,7 @@ Degree Encoding
The degree encoder is a learnable embedding layer that encodes the degree of each node into a vector. It takes as input the batched input and output degrees of graph nodes, and outputs the degree embeddings of the nodes.

.. code:: python

    degree_encoder = dgl.nn.DegreeEncoder(
        max_degree=8, # the maximum degree to cut off
        embedding_dim=512 # the dimension of the degree embedding
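
As a self-contained illustration of the snippet above, the sketch below feeds toy batched degrees through the encoder; the stacked ``(in_degree, out_degree)`` input format and the output shape are assumptions based on the Graphormer example, not guarantees of the API.

.. code:: python

    import torch as th
    import dgl

    degree_encoder = dgl.nn.DegreeEncoder(
        max_degree=8,      # the maximum degree to cut off
        embedding_dim=512  # the dimension of the degree embedding
    )
    # toy batch: 2 graphs padded to 4 nodes each
    in_degree = th.randint(0, 8, (2, 4))
    out_degree = th.randint(0, 8, (2, 4))
    # stacking in- and out-degrees along dim 0 follows the Graphormer example
    deg_emb = degree_encoder(th.stack((in_degree, out_degree)))
    print(deg_emb.shape)  # expected to be (2, 4, 512)
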
@@ -22,6 +23,7 @@ Path Encoding
The path encoder encodes the edge features on the shortest path between two nodes to get attention bias for the self-attention module. It takes as input the batched edge features along the shortest paths and outputs the attention bias based on path encoding.

.. code:: python

    path_encoder = PathEncoder(
        max_len=5, # the maximum length of the shortest path
        feat_dim=512, # the dimension of the edge feature
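
Continuing the snippet above, here is a hedged usage sketch with toy tensors; the ``num_heads`` value, the ``(dist, path_data)`` call signature, and the tensor shapes are assumptions taken from the Graphormer example rather than from this page.

.. code:: python

    import torch as th
    from dgl.nn import PathEncoder

    path_encoder = PathEncoder(
        max_len=5,     # the maximum length of the shortest path
        feat_dim=512,  # the dimension of the edge feature
        num_heads=8,   # the number of attention heads (assumed value)
    )
    # toy batch: 2 graphs padded to 4 nodes each
    dist = th.randint(1, 5, (2, 4, 4))    # pairwise shortest distances
    path_data = th.rand(2, 4, 4, 5, 512)  # edge features along each path (assumed shape)
    attn_bias = path_encoder(dist, path_data)
    print(attn_bias.shape)  # expected to be (2, 4, 4, 8)
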
@@ -33,6 +35,7 @@ Spatial Encoding
The spatial encoder encodes the shortest distance between two nodes to get attention bias for the self-attention module. It takes as input the shortest distance between two nodes and outputs the attention bias based on spatial encoding.

.. code:: python

    spatial_encoder = SpatialEncoder(
        max_dist=5, # the maximum distance between two nodes
        num_heads=8, # the number of attention heads
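
A similar toy-input sketch for the spatial encoder; calling it on the distance tensor alone and the output shape are assumptions based on the Graphormer example.

.. code:: python

    import torch as th
    from dgl.nn import SpatialEncoder

    spatial_encoder = SpatialEncoder(
        max_dist=5,   # the maximum distance between two nodes
        num_heads=8,  # the number of attention heads
    )
    # toy batch: 2 graphs padded to 4 nodes each
    dist = th.randint(1, 5, (2, 4, 4))
    attn_bias = spatial_encoder(dist)
    print(attn_bias.shape)  # expected to be (2, 4, 4, 8)
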
@@ -46,6 +49,7 @@ The Graphormer layer is like a Transformer encoder layer with the Multi-head Att
We can stack multiple Graphormer layers as a list just like implementing a Transformer encoder in PyTorch.

.. code:: python

    layers = th.nn.ModuleList([
        GraphormerLayer(
            feat_size=512, # the dimension of the input node features
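
The truncated block above can be fleshed out into a small stack of layers as follows; ``hidden_size``, ``num_heads``, and the layer count are illustrative values assumed for this sketch.

.. code:: python

    import torch as th
    from dgl.nn import GraphormerLayer

    layers = th.nn.ModuleList([
        GraphormerLayer(
            feat_size=512,     # the dimension of the input node features
            hidden_size=1024,  # the dimension of the feed-forward layer (assumed)
            num_heads=8,       # the number of attention heads (assumed)
        )
        for _ in range(6)      # number of stacked layers (assumed)
    ])
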
@@ -63,6 +67,7 @@ Model Forward
Grouping the modules above defines the primary components of the Graphormer model. We can then define the forward process as follows:

.. code:: python

    node_feat, in_degree, out_degree, attn_mask, path_data, dist = \
        next(iter(dataloader)) # we will use the first batch as an example
    num_graphs, max_num_nodes, _ = node_feat.shape
@@ -84,6 +89,6 @@ Grouping the modules above defines the primary components of the Graphormer mode
attn_bias=attn_bias,
)

-For simplicity, we omit some details in the forward process. For the complete implementation, please refer to the `Graphormer example <https://github.com/dmlc/dgl/tree/master/examples/core/Graphormer`_.
+For simplicity, we omit some details in the forward process. For the complete implementation, please refer to the `Graphormer example <https://github.com/dmlc/dgl/tree/master/examples/core/Graphormer>`_.

You can also explore other `utility modules <https://docs.dgl.ai/api/python/nn-pytorch.html#utility-modules-for-graph-transformer>`_ to customize your own graph transformer model. In the next section, we will show how to prepare the data for training.
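
To tie the pieces together, here is a hedged end-to-end sketch that replaces the dataloader with toy tensors; the way the degree embedding is added to the node features and the attention bias is assembled follows the Graphormer example, and all shapes and hyperparameters below are illustrative assumptions.

.. code:: python

    import torch as th
    from dgl.nn import DegreeEncoder, GraphormerLayer, PathEncoder, SpatialEncoder

    num_graphs, max_num_nodes, feat_size, num_heads = 2, 16, 512, 8

    # toy stand-ins for one batch from the dataloader
    node_feat = th.rand(num_graphs, max_num_nodes, feat_size)
    in_degree = th.randint(0, 8, (num_graphs, max_num_nodes))
    out_degree = th.randint(0, 8, (num_graphs, max_num_nodes))
    dist = th.randint(1, 5, (num_graphs, max_num_nodes, max_num_nodes))
    path_data = th.rand(num_graphs, max_num_nodes, max_num_nodes, 5, feat_size)

    degree_encoder = DegreeEncoder(max_degree=8, embedding_dim=feat_size)
    path_encoder = PathEncoder(max_len=5, feat_dim=feat_size, num_heads=num_heads)
    spatial_encoder = SpatialEncoder(max_dist=5, num_heads=num_heads)
    layers = th.nn.ModuleList([
        GraphormerLayer(feat_size, 1024, num_heads) for _ in range(6)
    ])

    # add degree embeddings to the node features
    x = node_feat + degree_encoder(th.stack((in_degree, out_degree)))
    # combine path and spatial encodings into a single attention bias
    attn_bias = path_encoder(dist, path_data) + spatial_encoder(dist)
    for layer in layers:
        x = layer(x, attn_bias=attn_bias)
    print(x.shape)  # expected to be (num_graphs, max_num_nodes, feat_size)

The full example additionally uses the ``attn_mask`` loaded from the dataloader to mask padded nodes, which this sketch omits.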