Fix/added unique ids to model_name #760
Conversation
current_time = time.strftime("%Y-%m-%d_%H:%M:%S", time.localtime())
param_combination_dict[
    "model_name"
] = f"{current_time}_{param_combination_dict['model_name']}_{os.getpid()}"
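The change above can be sketched as a standalone helper (the function name `make_unique_model_name` is illustrative, not from the PR):

```python
import os
import time

def make_unique_model_name(base_name: str) -> str:
    # Sketch of the naming scheme in the diff above: prefix the base
    # name with the current time and append the process PID.
    current_time = time.strftime("%Y-%m-%d_%H:%M:%S", time.localtime())
    return f"{current_time}_{base_name}_{os.getpid()}"

name = make_unique_model_name("my_model")
```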
How about using something that allows mapping back to the parameter combination being used? Maybe using the index of the currently evaluated parameters within params_cross_product, instead of the current time, would make this easier? Also, do we need the process PID?
I followed the current naming convention.
darts/darts/models/forecasting/torch_forecasting_model.py
Lines 180 to 182 in 96125f8
if model_name is None:
    current_time = datetime.datetime.now().strftime("%Y-%m-%d_%H.%M.%S.%f")
    model_name = current_time + "_torch_model_run_" + str(os.getpid())
The naming was discussed in #346.
params_cross_product is not exposed to the user, so it's unclear how useful an index would be. The name also needs to be unique across consecutive runs; otherwise, multiple events files would again end up in the same folder, so a simple index would not suffice. We could remove the PID and just use the current time.
The hyperparameters could be saved to TensorBoard to map them back quickly.
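One way to do this, as a sketch (assumes `torch` is installed; the directory, hyperparameter values, and metric name are illustrative):

```python
from torch.utils.tensorboard import SummaryWriter

# Log each hyperparameter combination alongside a result metric, so runs
# shown in TensorBoard's HPARAMS tab can be mapped back to their parameters.
writer = SummaryWriter(log_dir="runs/example_model")
writer.add_hparams(
    {"lr": 1e-3, "batch_size": 32},  # illustrative hyperparameters
    {"hparam/val_loss": 0.42},       # illustrative result metric
)
writer.close()
```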
OK, so let's leave it as you propose, but without the PID, OK?
Also, could you unit-test this? Thanks.
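A minimal sketch of such a test, using a stand-in helper `make_name` for the PR's inline naming logic:

```python
import os
import time

def make_name(base: str) -> str:
    # Stand-in for the PR's naming logic: timestamp + base name + PID.
    current_time = time.strftime("%Y-%m-%d_%H:%M:%S", time.localtime())
    return f"{current_time}_{base}_{os.getpid()}"

# Names generated in different seconds must differ, so consecutive
# gridsearch runs do not write events files into the same folder.
first = make_name("model")
time.sleep(1.1)
second = make_name("model")
assert first != second
```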
I updated the naming convention. I'm not sure it's worth the time to unit-test this, since hyperparameter tuning will completely change with PyTorch Lightning. I could try to do it, but work has really picked up this week.
OK, np :) Thanks for the fix!
Codecov Report
@@ Coverage Diff @@
## master #760 +/- ##
==========================================
- Coverage 90.80% 90.77% -0.03%
==========================================
Files 67 67
Lines 6752 6756 +4
==========================================
+ Hits 6131 6133 +2
- Misses 621 623 +2
Continue to review full report at Codecov.
Thanks!
Fixes #759
Summary
Added the date and PID to the model name.
Other Information