fix(pt): finetuning property/dipole/polar/dos fitting with multi-dimensional data causes error #4145
base: devel
Conversation
for more information, see https://pre-commit.ci
Codecov Report: all modified and coverable lines are covered by tests ✅

    @@            Coverage Diff             @@
    ##            devel    #4145      +/-   ##
    ==========================================
    + Coverage   83.42%   83.44%   +0.02%
    ==========================================
      Files         532      532
      Lines       52048    52049       +1
      Branches     3046     3046
    ==========================================
    + Hits        43419    43432      +13
    + Misses       7682     7672      -10
    + Partials      947      945       -2

☔ View full report in Codecov by Sentry.
deepmd/pt/train/training.py:

    (".fitting_net." in item_key)
    or (".out_bias" in item_key)
    or (".out_std" in item_key)
I think it should be `".descriptor" not in item_key`. When the model has other variables in the future, is it expected to keep this check or replace it?
In the current version, the two ways of writing it, `".descriptor" not in item_key` and `(".fitting_net." in item_key) or (".out_bias" in item_key) or (".out_std" in item_key)`, are fully equivalent. Please @iProzd take a look at this.
I think `".descriptor." not in item_key` is better. Yet @Chengqian-Zhang, please check and ensure that they are equivalent.
I am sure that in the current version, if "descriptor" and "fitting" are not in target_keys, only "out_bias" and "out_std" remain.
Fix issue #4108.

A pretrained model labeled with energy has a one-dimensional `out_bias`. If we want to finetune a dos/polar/dipole/property model using this pretrained model, the `out_bias` of the finetuning model is multi-dimensional (example: numb_dos = 250), and an error occurs:

    RuntimeError: Error(s) in loading state_dict for ModelWrapper:
    size mismatch for model.Default.atomic_model.out_bias: copying a param with shape torch.Size([1, 118, 1]) from checkpoint, the shape in current model is torch.Size([1, 118, 250]).
    size mismatch for model.Default.atomic_model.out_std: copying a param with shape torch.Size([1, 118, 1]) from checkpoint, the shape in current model is torch.Size([1, 118, 250]).

When finetuning with a new fitting net, the old `out_bias` is useless because the new bias is recomputed later in the code, so we do not need to load the old `out_bias` in that case.
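The idea of the fix can be sketched as a key filter applied before loading the checkpoint. This is a minimal illustration, not the actual `training.py` code: `collect_finetune_keys` and the parameter names/shapes below are hypothetical, and plain tuples stand in for real tensor shapes.

```python
def collect_finetune_keys(pretrained_shapes, new_fitting):
    """Decide which pretrained parameters to load when finetuning.

    When the fitting net is replaced (new_fitting=True), skip the old
    fitting_net, out_bias, and out_std: the bias/std are recomputed from
    the new training data later, so their old (possibly shape-mismatched)
    values never need to be loaded.
    """
    if not new_fitting:
        return dict(pretrained_shapes)
    return {
        key: shape
        for key, shape in pretrained_shapes.items()
        # Keep only descriptor parameters; everything else is reinitialized.
        if ".descriptor" in key
    }

# Hypothetical checkpoint: energy model with a one-dimensional out_bias.
pretrained = {
    "atomic_model.descriptor.weight": (64, 32),
    "atomic_model.fitting_net.weight": (32, 1),
    "atomic_model.out_bias": (1, 118, 1),
    "atomic_model.out_std": (1, 118, 1),
}

# Finetuning a dos model (out_bias would be (1, 118, 250)) with a new
# fitting net: only the descriptor is carried over, so no size mismatch.
kept = collect_finetune_keys(pretrained, new_fitting=True)
assert kept == {"atomic_model.descriptor.weight": (64, 32)}
```

With this filter in place, the multi-dimensional `out_bias` of the new model is never compared against the one-dimensional checkpoint value, which is what triggered the `RuntimeError` above.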