Sample contrastive #167

Merged 3 commits into main on Jul 24, 2024
Conversation

danielward27 (Owner)

Previously, ContrastiveLoss took the first n_contrastive samples in the batch (excluding the joint index) as the contrastive samples, relying on shuffling between batches to provide randomness. This still leads to correlated contrastive samples within a batch. This pull request instead draws the contrastive samples randomly from the remainder of the batch. For consistency with the updated ContrastiveLoss, MaximumLikelihoodLoss is also updated to accept a key.
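The sampling change can be sketched as follows. This is an illustrative implementation, not flowjax's actual code: the function name and signature are assumptions. For each joint index i, it draws n_contrastive indices without replacement from the rest of the batch, using the standard trick of sampling from a range one smaller and shifting draws past i.

```python
import jax
import jax.numpy as jnp
import jax.random as jr

def sample_contrastive_indices(key, batch_size, n_contrastive):
    """For each joint index i in the batch, draw n_contrastive indices
    uniformly without replacement from the remaining batch elements."""

    def per_index(key, i):
        # Draw from 0..batch_size-2, then shift draws >= i up by one,
        # so the joint index i itself can never be selected.
        draws = jr.choice(key, batch_size - 1, shape=(n_contrastive,), replace=False)
        return jnp.where(draws >= i, draws + 1, draws)

    keys = jr.split(key, batch_size)
    return jax.vmap(per_index)(keys, jnp.arange(batch_size))
```

Because each row is sampled independently from the whole remaining batch, the contrastive samples are no longer correlated with batch position the way the first-n scheme was.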

This could be a breaking change for users who pass a custom loss function to fit_to_data, which now expects the loss function to accept a key. The fix is trivial: add a key argument to the custom loss, even if it is ignored.
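One way to migrate an existing key-free loss is a small adapter. This is a hypothetical sketch, not flowjax's API: the assumption that fit_to_data passes the key as a keyword argument named key is illustrative only.

```python
import jax.numpy as jnp

def with_key(loss_fn):
    """Wrap a key-free loss so it matches a signature that receives a
    key. The key is accepted and ignored (hypothetical adapter)."""
    def wrapped(*args, key=None, **kwargs):
        del key  # accepted but unused
        return loss_fn(*args, **kwargs)
    return wrapped

# Example key-free loss (illustrative only).
def squared_error_loss(params, x):
    return jnp.mean((params - x) ** 2)

loss = with_key(squared_error_loss)
```

Alternatively, simply add `key` (or `*args`) to the custom loss's signature directly.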

@danielward27 danielward27 merged commit eeb3481 into main Jul 24, 2024
1 check passed
@danielward27 danielward27 deleted the sample_contrastive branch July 24, 2024 14:14