[Distributed] Specify the graph format for distributed training #2948
Description
Previously, the graph structure in each partition was stored in CSC format. When users call APIs such as find_edges and out_degrees, DGL generates the corresponding graph format for these APIs on demand. In distributed training, the graph structure is shared among all trainers and servers on a machine, but if an API triggers the construction of another graph format, the new format is stored in local memory. If every trainer and server on a machine does the same thing, the graph is replicated many times on that machine.
This PR allows users to specify the graph formats at launch time. Once the graph formats are constructed, no graph API can construct a new format at runtime. If an API requires a format that was not created at launch time, an error is reported.
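The behavior described above can be illustrated with a minimal conceptual sketch. This is not DGL's actual implementation; the class name, the format names, and the format each method requires are assumptions chosen for illustration. The point is the runtime contract: the set of formats is fixed at launch, and an API that needs an uncreated format raises an error instead of silently materializing a new copy of the graph.

```python
class PartitionGraph:
    """Conceptual sketch of a partition whose graph formats are fixed at launch."""

    def __init__(self, allowed_formats):
        # Formats specified at launch time, e.g. ("csc",) or ("csc", "coo").
        self.allowed_formats = tuple(allowed_formats)

    def _require(self, fmt):
        # After launch, a missing format is an error rather than a trigger
        # for constructing (and replicating) a new copy of the structure.
        if fmt not in self.allowed_formats:
            raise RuntimeError(
                f"Graph format '{fmt}' was not created at launch time; "
                f"available formats: {self.allowed_formats}"
            )

    def out_degrees(self, node):
        # Assume this API needs CSR (illustrative choice, not DGL's mapping).
        self._require("csr")
        return 0  # placeholder

    def find_edges(self, eids):
        # Assume this API needs COO (illustrative choice, not DGL's mapping).
        self._require("coo")
        return eids  # placeholder

# A partition launched with only CSC cannot serve a COO-based query:
g = PartitionGraph(allowed_formats=("csc",))
try:
    g.find_edges([0, 1])
except RuntimeError as e:
    print(e)
```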
Checklist
Please feel free to remove inapplicable items for your PR.
Changes