
[NNVM] Support argmax/argmin in tensorflow frontend #1514

Merged
merged 2 commits into from
Aug 2, 2018

Conversation

sergei-mironov
Contributor

This PR aims to improve the TensorFlow frontend by:

  • Adding the recently introduced argmax/argmin operations
  • Tweaking the TF node parser (use placeholders first, report all missing operations at once)
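For context (an illustration only, not the frontend code itself): TensorFlow's ArgMax/ArgMin reduce along a single dimension, and NumPy's argmax/argmin share the same semantics, which is what a frontend conversion can lean on:

```python
import numpy as np

x = np.array([[1.0, 5.0, 2.0],
              [7.0, 0.0, 3.0]])

# tf.argmax(x, axis=1) matches np.argmax(x, axis=1):
# index of the maximum along each row
row_argmax = np.argmax(x, axis=1)   # -> [1, 0]

# tf.argmin(x, axis=0) matches np.argmin(x, axis=0):
# index of the minimum in each column
col_argmin = np.argmin(x, axis=0)   # -> [0, 1, 0]
```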

@sergei-mironov sergei-mironov force-pushed the add-argmax-tf branch 2 times, most recently from f2ea7ee to 0f3d7c4 Compare July 31, 2018 13:37
Contributor

@srkreddy1238 srkreddy1238 left a comment


ArgMax & ArgMin are good to go.

A few suggestions on the main-function changes to look into.


try:
    from tensorflow.python.framework import tensor_util
except ImportError as e:
    raise ImportError(
        "Unable to import tensorflow which is required {}".format(e))

# Parse the nodes briefly to determine the preconditions:
Contributor


Brief parsing is a good addition, thanks. It helps to list all missing ops in one place.
Please make it a separate function instead of inlining it.
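A pre-scan factored out into its own function, as suggested, could look roughly like this (a sketch with hypothetical names; the real frontend keeps its own registry of convert maps):

```python
def collect_missing_ops(nodes, supported_ops):
    """One pass over the graph nodes, gathering every op that has no
    converter so they can all be reported in a single error message."""
    return {node["op"] for node in nodes if node["op"] not in supported_ops}

# Usage: raise one error listing every unsupported op at once,
# instead of failing on the first one encountered.
nodes = [{"op": "Conv2D"}, {"op": "ArgMax"}, {"op": "FancyOp"}, {"op": "OtherOp"}]
missing = collect_missing_ops(nodes, {"Conv2D", "ArgMax", "Const", "Placeholder"})
if missing:
    message = "The following operators are not supported: {}".format(
        ", ".join(sorted(missing)))
```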

Contributor Author


OK, I'll move it to a separate function.

@@ -940,7 +976,7 @@ def from_tensorflow(self, graph):
                 "Please freeze the graph with add_shapes=True")
             elif node.op == "Const":
                 # Assuming first Const node as Graph Input node
-                if self._input_node == '':
+                if self._input_node == '' and not has_placeholders:
Contributor


Technically nothing is done with _input_node here, so _input_node has no significance at this point. Since you are touching this area anyway, I suggest changing the logic below (or leaving this part out of this PR).

A graph can be executed as ConstA + ConstB = C, where there are no placeholders at all; in that case we don't need to convert a Const into a placeholder and expect input from the user (see the example test cases).

Hence I suggest simplifying to: all placeholders are inputs, and all constants are params.
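The simplification suggested above ("all placeholders are inputs, all constants are params") could look roughly like this (a sketch over a minimal dict-based node representation, not the actual GraphDef API):

```python
def classify_nodes(nodes):
    # Placeholders become graph inputs; Const nodes become params.
    # No Const is ever promoted to an input.
    inputs, params = [], {}
    for node in nodes:
        if node["op"] == "Placeholder":
            inputs.append(node["name"])
        elif node["op"] == "Const":
            params[node["name"]] = node.get("value")
    return inputs, params

nodes = [
    {"op": "Placeholder", "name": "data"},
    {"op": "Const", "name": "weight", "value": [1, 2, 3]},
    {"op": "Add", "name": "out"},
]
inputs, params = classify_nodes(nodes)
```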

Contributor Author

@sergei-mironov sergei-mironov Aug 1, 2018


Treating constants as inputs seemed strange to me too, so I am glad to see your suggestion.

I attempted this simplification, and it seems that at least some operations implicitly depend on Consts being inputs. For example, the _conv operation uses _input_shapes here. The input shape, in turn, is set only for input nodes (i.e. nodes which aren't listed in the params attribute; that is the key point). Thus, removing constants from the inputs unfortunately breaks some tests involving convolution, since _input_shapes is no longer available for input[0].

I think this is a bug in _conv, since AFAIK any operation should work with any type of input node. Let me suggest that I fix it in a separate commit. As for this commit, I'd rather remove the placeholder tweak completely so the other changes can go in.

Contributor


Yes, you could hold the placeholder change back from this PR and let the rest go in first.
We could open a discussion at discuss.tvm.ai and take it further there.

BTW, _input_shapes was the conclusion of a long discussion around the SAME padding implementation for TensorFlow convolution. It exists primarily because the NNVM compilation process expects to build against fixed input shapes (which differs from TensorFlow and other frameworks that accept variable shapes at runtime).

_input_shapes for a constant is not a problem, as it is a constant and its shape is available from the Tensor.
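The last point can be illustrated without TensorFlow: a Const node carries its value inside the graph, so a static shape is always recoverable from that value (a sketch; in the real frontend the value would come from the node's tensor attribute, e.g. via tensor_util):

```python
import numpy as np

def const_input_shape(const_value):
    # A Const node embeds its value in the graph, so its static shape
    # can be read straight from that value -- no user input is needed.
    return list(np.asarray(const_value).shape)
```

So populating _input_shapes for constants is always possible, unlike for placeholders, whose shapes must come from the frozen graph or the user.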

Contributor Author

@sergei-mironov sergei-mironov Aug 2, 2018


Done, the placeholders part is postponed. I've started the topic https://discuss.tvm.ai/t/importing-tensorflow-models-placeholder-vs-const-issues/561 for further discussion.

@sergei-mironov sergei-mironov force-pushed the add-argmax-tf branch 2 times, most recently from d50bf27 to fa69b93 Compare August 2, 2018 11:17
Contributor

@srkreddy1238 srkreddy1238 left a comment


Thanks @grwlf .

LGTM.

@tqchen tqchen merged commit 7ee9cca into apache:master Aug 2, 2018
@tqchen
Member

tqchen commented Aug 2, 2018

Thanks @grwlf @srkreddy1238, this is merged.

tqchen pushed a commit to tqchen/tvm that referenced this pull request Aug 4, 2018
sergei-mironov added a commit to sergei-mironov/tvm that referenced this pull request Aug 8, 2018