
out of memory if don't fix VGG16 param #87

Open

aRookieMan opened this issue Dec 7, 2018 · 1 comment
aRookieMan commented Dec 7, 2018

If I don't freeze the pretrained VGG16's parameters, memory usage keeps growing during training until it runs out of memory. But if I freeze these parameters, it works fine. Why?

Code in train.py:

    # Freeze the pretrained VGG16 feature extractor so its weights are not updated
    for param in net.rpn.features.parameters():
        param.requires_grad = False
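
For anyone hitting the same problem, here is a minimal PyTorch sketch of how this freezing pattern typically fits into a training setup. Only the requires_grad loop comes from the snippet above; the backbone/head names, the classifier head, and the optimizer choice are illustrative assumptions, not this repository's actual code:

    import torch
    import torchvision.models as models

    # Hypothetical stand-in for net.rpn.features: the convolutional part of a
    # pretrained VGG16, playing the role of the RPN's feature extractor.
    backbone = models.vgg16(pretrained=True).features

    # Freezing the backbone means autograd never materializes gradient tensors
    # for these weights, which reduces per-iteration memory use.
    for param in backbone.parameters():
        param.requires_grad = False

    # Assumed trainable head, just to make the sketch self-contained.
    head = torch.nn.Linear(512 * 7 * 7, 21)

    # Give the optimizer only the parameters that still require gradients.
    # PyTorch versions of this era (0.4.x) raise a ValueError if a parameter
    # with requires_grad=False is passed to an optimizer.
    trainable = [p for p in list(backbone.parameters()) + list(head.parameters())
                 if p.requires_grad]
    optimizer = torch.optim.SGD(trainable, lr=1e-3, momentum=0.9)

One plausible explanation, not confirmed in this thread: with the backbone unfrozen, autograd keeps gradient tensors and intermediate activations for every VGG16 layer, and the optimizer keeps per-parameter state for them, so peak memory is far higher and can exceed what the GPU has available.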
aRookieMan changed the title from "all the input array dimensions except for the concatenation axis must match exact" to "out of memory if don't fix VGG16 param" on Dec 7, 2018
@SmallSmallQiu

Thank you.
