
Feature/remove shared ptr #3538

Merged
merged 10 commits into PaddlePaddle:develop from feature/remove_shared_ptr on Aug 17, 2017

Conversation

reyoung
Collaborator

@reyoung reyoung commented Aug 16, 2017

This PR is the second step of issue #3520.
This PR follows PR #3522. Please review #3522 first.

@@ -91,7 +89,7 @@ std::shared_ptr<OperatorBase> BackwardRecursive(
}

// Returned gradient network
-  auto net = std::make_shared<operators::NetOp>();
+  auto net = std::unique_ptr<operators::NetOp>(new operators::NetOp());
Contributor

Can we have a make_unique helper?

Collaborator Author

Well, it could be done in another PR. make_unique is not in C++11 but in C++14; we can implement it ourselves in C++11.

protected:
std::string type_;
// NOTE: in case of OpGrad, inputs_ contains:
// I (Inputs)
// I (Inputs)opear
Contributor

What does "opear" mean?

Collaborator Author

Typo here

}

-  void InsertOp(size_t pos, const std::shared_ptr<OperatorBase>& op) {
+  void InsertOp(size_t pos, framework::OperatorBase* op, bool own) {
Contributor

What's the case in which a net doesn't own an inserted op?

Collaborator Author

It means that the op is owned somewhere else (by another network, or by Python).

In that situation, the op will be Cloned first.

Contributor

In Caffe2's design, does every net own its ops? If the same op is added again, it is just cloned (a new one is created and copied).

If an op belongs to Python, Python holds a unique_ptr to help control the memory. Wouldn't it be simpler for the net to just clone (create new and copy) an op when it wants to add one it doesn't own?

Collaborator Author

Every NetOp owns its operators in this design. The own parameter actually means that the pointer can be moved into the NetOp, which will then take care of that pointer's life cycle.

Since that API was confusing, it has been removed now.
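For reference, the semantics of the removed `own` flag can be sketched like this, using simplified stand-in types rather than Paddle's real OperatorBase/NetOp:

```cpp
#include <cstddef>
#include <memory>
#include <utility>
#include <vector>

// Simplified stand-ins, only to illustrate the (now removed) `own` flag.
struct Op {
  virtual ~Op() {}
  virtual std::unique_ptr<Op> Clone() const {
    return std::unique_ptr<Op>(new Op(*this));
  }
};

struct Net {
  std::vector<std::unique_ptr<Op>> ops_;

  // own == true : the raw pointer is adopted; the net manages its life cycle.
  // own == false: the op is owned elsewhere (another net, or Python), so the
  //               net stores a private Clone instead.
  void InsertOp(std::size_t pos, Op* op, bool own) {
    std::unique_ptr<Op> holder = own ? std::unique_ptr<Op>(op) : op->Clone();
    ops_.insert(ops_.begin() + pos, std::move(holder));
  }
};
```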

// Macro for defining a clone method.
// If you are writing a kernel operator, `Clone` will be defined when you
// register it, i.e. you do not need to define the `Clone` method yourself.
#define DEFINE_OP_CLONE_METHOD(CLS) \
Member

The macro parameters in the registry use the xx_yy naming style. Should we keep it consistent here, e.g. call this one op_class?

Collaborator Author

That is because the registry has many classes in the same macro at once, e.g. op_class, op_type, grad_class, etc.

// You can also use
// using PARENT_CLASS::PARENT_CLASS;
// to use parent's constructor.
#define DEFINE_OP_CONSTRUCTOR(CLS, PARENT_CLS) \
Member

Same naming comment as above. I looked at the Caffe2 and TensorFlow code; both name macro parameters in the xx_yy style, so shall we follow that convention as well?

Collaborator Author

This is not described in the Google C++ style guide.

class NOP : public OperatorBase {
public:
using OperatorBase::OperatorBase;
void InferShape(const Scope& scope) const override {}
void Run(const Scope& scope,
const platform::DeviceContext& dev_ctx) const override {}
std::unique_ptr<OperatorBase> Clone() const override {
Member

Why isn't DEFINE_OP_CLONE_METHOD used consistently here?

Collaborator Author

DEFINE_OP_CLONE_METHOD is a macro meant only for the most-derived class, because the method it defines is marked final.
Here, some types inherit from NOP, so its Clone cannot be marked final.
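The point can be illustrated with a small sketch. The stand-ins below are simplified, and `EmptyOp` is a hypothetical subclass added only for illustration:

```cpp
#include <memory>

// Why NOP writes Clone() by hand instead of using DEFINE_OP_CLONE_METHOD:
// the macro marks Clone() final, but NOP must remain derivable.
struct OperatorBase {
  virtual ~OperatorBase() {}
  virtual std::unique_ptr<OperatorBase> Clone() const = 0;
};

struct NOP : public OperatorBase {
  // Deliberately `override` but NOT `final`, so subclasses can override it.
  std::unique_ptr<OperatorBase> Clone() const override {
    return std::unique_ptr<OperatorBase>(new NOP(*this));
  }
};

struct EmptyOp : public NOP {
  // This override is only legal because NOP::Clone is not final.
  std::unique_ptr<OperatorBase> Clone() const override {
    return std::unique_ptr<OperatorBase>(new EmptyOp(*this));
  }
};
```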

* add_op can take a unique_ptr or a const reference. If a unique_ptr is
  taken, the NetOp will take care of that operator's life cycle. If a
  const reference is taken, that op will be Cloned.
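The two entry points described above can be sketched as a pair of overloads. The types below are simplified stand-ins, and the method name `AddOp` is illustrative rather than the exact signature merged in this PR:

```cpp
#include <memory>
#include <utility>
#include <vector>

// Simplified stand-ins to illustrate the two add_op overloads.
struct Op {
  virtual ~Op() {}
  virtual std::unique_ptr<Op> Clone() const {
    return std::unique_ptr<Op>(new Op(*this));
  }
};

struct NetOp {
  std::vector<std::unique_ptr<Op>> ops_;

  // unique_ptr overload: ownership moves in; the NetOp manages the
  // operator's life cycle from here on.
  void AddOp(std::unique_ptr<Op> op) { ops_.push_back(std::move(op)); }

  // const-reference overload: the caller keeps its op; the NetOp stores
  // a private Clone.
  void AddOp(const Op& op) { ops_.push_back(op.Clone()); }
};
```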
Contributor

@Superjomn Superjomn left a comment

LGTM

@reyoung reyoung merged commit 1808f88 into PaddlePaddle:develop Aug 17, 2017
@reyoung reyoung deleted the feature/remove_shared_ptr branch October 2, 2017 18:19
3 participants