Increment version to 1.2.2. #243

Closed · wants to merge 1 commit

Conversation

@dblock (Member) commented Dec 15, 2021

Signed-off-by: dblock <dblock@amazon.com>

Description

Increment version to 1.2.2.

Check List

  • Commits are signed as per the DCO using --signoff

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and signing off your commits, please check here.

Signed-off-by: dblock <dblock@amazon.com>
jmazanec15 previously approved these changes Dec 15, 2021

@dblock (Member, Author) commented Dec 15, 2021

329 tests completed, 2 failed
Tests with failures:
 - org.opensearch.knn.plugin.transport.RemoveModelFromCacheTransportActionTests.testNodeOperation_modelInCache
 - org.opensearch.knn.plugin.transport.RemoveModelFromCacheTransportActionTests.testNodeOperation_modelNotInCache

@naveentatikonda dismissed their stale review December 15, 2021 16:07

Oops..Didn't check CI

@jmazanec15 (Member) commented:

Tried to repro locally but could not.

The test failure that is relevant is testNodeOperation_modelInCache. The other test carries state from this test (which needs to be fixed) and that causes the failure. I believe if testNodeOperation_modelInCache succeeds then the other one will as well.

I see someone kicked off another run. I am going to see what happens and continue looking into why testNodeOperation_modelInCache may have failed.
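
For readers following along, here is a minimal, self-contained sketch of the kind of cross-test state leak being described. It uses plain JUnit 4 and a stand-in map rather than the plugin's actual ModelCache, so the names here are illustrative only:

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

import org.junit.Before;
import org.junit.Test;

// Illustrative only: a stand-in for a singleton cache shared by two test methods.
// Without the @Before reset, the order in which the tests run decides whether the
// second assertion passes.
public class SharedCacheIsolationSketch {

    private static final ConcurrentMap<String, Object> MODEL_CACHE = new ConcurrentHashMap<>();

    @Before
    public void resetSharedCache() {
        // Start every test from a known-empty cache so no test inherits
        // entries left behind by an earlier one.
        MODEL_CACHE.clear();
    }

    @Test
    public void modelInCache() {
        MODEL_CACHE.put("test-model-id", new Object());
        assertTrue(MODEL_CACHE.containsKey("test-model-id"));
    }

    @Test
    public void modelNotInCache() {
        // Without the reset above, this fails whenever modelInCache ran first
        // and left "test-model-id" behind.
        assertFalse(MODEL_CACHE.containsKey("test-model-id"));
    }
}
```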

@vamshin (Member) commented Dec 15, 2021

> Tried to repro locally but could not.
>
> The test failure that is relevant is testNodeOperation_modelInCache. The other test carries state from this test (which needs to be fixed) and that causes the failure. I believe if testNodeOperation_modelInCache succeeds then the other one will as well.
>
> I see someone kicked off another run. I am going to see what happens and continue looking into why testNodeOperation_modelInCache may have failed.

@jmazanec15 I kicked another run to see if the issue is transient

@jmazanec15 (Member) commented:

These 2 tests appear to be flaky. From output:

  1> [2021-12-15T11:01:35,225][INFO ][o.o.k.p.t.RemoveModelFromCacheTransportActionTests] [testNodeOperation_modelInCache] before test
  1> [2021-12-15T11:01:35,231][INFO ][o.o.k.i.ModelCache       ] [testNodeOperation_modelInCache] [KNN] Model Cache evicted. Key test-model-id, Reason: EXPLICIT
  1> [2021-12-15T11:01:35,234][INFO ][o.o.k.p.t.RemoveModelFromCacheTransportActionTests] [testNodeOperation_modelInCache] after test

The model gets evicted, but the cache may not update in time for the assert.
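
One common way to make an assertion like this robust to eviction bookkeeping that lags slightly behind the explicit invalidation is to poll until the expected state shows up, instead of asserting once. A hedged sketch, assuming the test extends OpenSearchTestCase (which exposes assertBusy) and using a hypothetical modelCacheContains helper in place of the real cache lookup:

```java
import java.util.concurrent.TimeUnit;

import org.opensearch.test.OpenSearchTestCase;

public class EvictionAssertSketch extends OpenSearchTestCase {

    public void testModelRemovedFromCacheEventually() throws Exception {
        // ... trigger removal of "test-model-id" from the cache here ...

        // Retry the assertion for up to 10 seconds instead of checking once, so a
        // short delay between the eviction and the cache reflecting it no longer
        // fails the test.
        assertBusy(() -> assertFalse(modelCacheContains("test-model-id")), 10, TimeUnit.SECONDS);
    }

    // Hypothetical helper standing in for however the real test inspects the cache.
    private boolean modelCacheContains(String modelId) {
        return false;
    }
}
```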

Given that I am unable to reproduce the failure locally, I think these cases can be commented out and marked as flaky, @dblock.
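
If the tests end up being muted rather than deleted, a hedged sketch of how that is often done in Lucene/OpenSearch-style suites is the @AwaitsFix annotation pointing at a tracking issue, so the skipped tests stay visible until the flakiness is fixed (the import path and the bugUrl below are placeholders and may differ in this repo):

```java
import org.apache.lucene.util.LuceneTestCase.AwaitsFix;

import org.opensearch.test.OpenSearchTestCase;

public class FlakyTestMutingSketch extends OpenSearchTestCase {

    // Placeholder URL: point this at the real tracking issue for the flaky tests.
    @AwaitsFix(bugUrl = "https://github.com/opensearch-project/k-NN/issues/<tracking-issue>")
    public void testNodeOperation_modelInCache() {
        // Body unchanged; the randomized test runner skips the method until
        // the annotation is removed.
    }
}
```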
