
fix: trainer for non trainable model #239

Merged: 1 commit into DeNA:develop on Jan 25, 2022

Conversation

YuriCat (Contributor) commented on Jan 21, 2022:

This just fixes a bug: we don't need any Batcher if self.optimizer is None.

```diff
@@ -349,8 +349,8 @@ def shutdown(self):

     def train(self):
         if self.optimizer is None:  # non-parametric model
-            print()
+            time.sleep(0.1)
             return
```
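For context, a minimal sketch of the kind of trainer this guard sits in. Only the guard itself comes from the diff above; the `Batcher` stand-in, the constructor, and `next_batch()` are assumptions made for illustration, not the actual HandyRL source:

```python
import time


class Batcher:
    """Hypothetical stand-in for the real batching component."""

    def next_batch(self):
        return [0.0]  # dummy batch


class Trainer:
    def __init__(self, optimizer=None):
        self.optimizer = optimizer  # None for a non-trainable (non-parametric) model
        # Per the PR description: no Batcher is needed when there is no optimizer.
        self.batcher = Batcher() if optimizer is not None else None

    def train(self):
        if self.optimizer is None:  # non-parametric model: nothing to update
            time.sleep(0.1)         # yield the CPU instead of busy-looping
            return
        batch = self.batcher.next_batch()
        # ... a gradient step with self.optimizer on `batch` would go here ...
```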
A Member commented on the diff:

Why is this needed?
YuriCat (Contributor, Author) replied:

This is because, without time.sleep(), train() would be called incessantly. It may be better to prevent this function from being called at all.
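To make the concern concrete: if a background thread drives train() in an unconditional loop (a hypothetical caller, reusing the Trainer sketch above), the early return without time.sleep() would spin at full CPU:

```python
import threading
import time


def training_loop(trainer, stop):
    # The driver calls train() unconditionally. Without the time.sleep(0.1)
    # inside train(), a non-trainable model would race through the early
    # return and peg this thread at 100% CPU.
    while not stop.is_set():
        trainer.train()


stop = threading.Event()
thread = threading.Thread(target=training_loop, args=(Trainer(), stop), daemon=True)
thread.start()
time.sleep(1.0)  # let the loop run briefly
stop.set()
thread.join()
```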

YuriCat (Contributor, Author) commented on Jan 24, 2022:

The problem is that the training epoch will not proceed until train() is called.
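One plausible reading of this remark, with hypothetical names: the epoch bookkeeping that the rest of the system waits on lives in the loop around train(), so never calling train() for non-trainable models would stall it:

```python
class Learner:
    """Hypothetical driver: epochs advance only around train() calls."""

    def __init__(self, trainer, total_epochs=3):
        self.trainer = trainer
        self.total_epochs = total_epochs
        self.epoch = 0

    def run(self):
        while self.epoch < self.total_epochs:
            self.trainer.train()  # still called when optimizer is None;
            self.epoch += 1       # skipping it would freeze the epoch count
```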

ikki407 (Member) commented on Jan 25, 2022:

LGTM

ikki407 merged commit dc6410f into DeNA:develop on Jan 25, 2022.
YuriCat deleted the fix/trainer_for_non_trainable_model branch on February 11, 2022.