
cannot request out_edges() for empty node sets on cuda #2128

Closed
chwan1016 opened this issue Aug 30, 2020 · 4 comments
Labels
bug:confirmed Something isn't working

Comments

@chwan1016
Contributor

chwan1016 commented Aug 30, 2020

🐛 Bug

I'm using the latest DGL version. The following code throws a DGLError: Check failed: dim != 0 (0 vs. 0) :

import dgl
import torch

# a toy graph with edges 0->1 and 1->2, moved to the GPU
g = dgl.DGLGraph(([0, 1], [1, 2])).to('cuda:0')
# query the out-edges of an empty node set
idx = torch.LongTensor([]).to('cuda:0')
g.out_edges(idx)  # raises DGLError on CUDA

However, the same code works when the graph is on CPU. I'm not sure whether this is a bug.
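Until the fix lands, a caller-side workaround is to short-circuit empty queries before they reach the CUDA kernel. The sketch below has no DGL dependency (all names are illustrative, plain Python over a CSR-style adjacency); it shows the behavior the CPU path already gives, where an empty node set simply yields an empty edge set.

```python
def build_csr(src, dst, num_nodes):
    """Build a CSR-style adjacency (indptr, indices) from edge lists."""
    indptr = [0] * (num_nodes + 1)
    for u in src:                      # count out-degree of each node
        indptr[u + 1] += 1
    for i in range(num_nodes):         # prefix-sum into row offsets
        indptr[i + 1] += indptr[i]
    indices = [0] * len(src)
    fill = indptr[:]                   # next write position per node
    for u, v in zip(src, dst):
        indices[fill[u]] = v
        fill[u] += 1
    return indptr, indices

def out_edges(indptr, indices, nodes):
    """Return (src, dst) lists for all out-edges of `nodes`.

    An empty `nodes` query naturally yields empty results, which is
    the behavior the issue asks for on CUDA as well.
    """
    src_out, dst_out = [], []
    for u in nodes:
        for j in range(indptr[u], indptr[u + 1]):
            src_out.append(u)
            dst_out.append(indices[j])
    return src_out, dst_out

# The graph from the report: edges 0->1 and 1->2, 3 nodes
indptr, indices = build_csr([0, 1], [1, 2], num_nodes=3)
print(out_edges(indptr, indices, [0, 1]))  # ([0, 1], [1, 2])
print(out_edges(indptr, indices, []))      # ([], []) -- no error
```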

@chwan1016 chwan1016 changed the title cannot add empty edge sets for graph on cuda cannot request out_edges() for empty node sets on cuda Aug 30, 2020
@VoVAllen
Collaborator

Empty tensors should not be allowed in the out_edges function. What behavior would you expect here?

@chwan1016
Contributor Author

Actually, I just find it strange that out_edges behaves differently on CPU and GPU. It would be better if it returned an empty edge set.

@jermainewang jermainewang added the bug:confirmed Something isn't working label Aug 31, 2020
@chwan1016
Contributor Author

A similar problem occurs with out_degrees.
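The out_degrees case has the same natural degenerate behavior: the degree query for an empty node set is just an empty result. A minimal sketch (plain Python over a CSR indptr array, illustrative names, not DGL's implementation):

```python
def out_degrees(indptr, nodes):
    """Out-degree of each node in `nodes`, given a CSR indptr array.

    The degree of node u is indptr[u + 1] - indptr[u]; an empty
    query returns an empty list instead of raising.
    """
    return [indptr[u + 1] - indptr[u] for u in nodes]

# indptr for the graph with edges 0->1 and 1->2 (3 nodes)
indptr = [0, 1, 2, 2]
print(out_degrees(indptr, [0, 1, 2]))  # [1, 1, 0]
print(out_degrees(indptr, []))         # []
```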

@BarclayII
Collaborator

Should be fixed in master; it will go into the 0.5.2 release.
