Add job start and end time #30
Conversation
Force-pushed from 1df3848 to 0090c20.
Nice!
Great to see some cleanup too (though some comments).
I still need to get used to the idea of introducing typing here. Maybe discuss this first at a standup, and then perhaps introduce later?
test_api.py (outdated):

    assert jobinfo['project'] == RUN_PROJECT
    assert jobinfo['spider'] == RUN_SPIDER
    assert jobinfo['state'] == 'finished'
    assert datetime.strptime(jobinfo['start_time'], TIME_FORMAT)
I'd rather keep the previous form, so that if anything is added to the dict, you are aware that a test needs changing.
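A minimal sketch of that exact-dict form (constants and key names are assumed for illustration, not taken from the real suite): comparing the whole dict means any field added to the response later makes the test fail, prompting a conscious update.

```python
# Hypothetical constants, standing in for the real test fixtures.
RUN_PROJECT = 'example'
RUN_SPIDER = 'quotes'

def check_jobinfo(jobinfo):
    # Exact comparison: an unexpected extra or missing field fails the test.
    assert jobinfo == {
        'project': RUN_PROJECT,
        'spider': RUN_SPIDER,
        'state': 'finished',
    }

check_jobinfo({'project': 'example', 'spider': 'quotes', 'state': 'finished'})
```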
I will update this. I would need to look a bit more into mocking the start_time and end_time of the jobs during tests, since a dict comparison will require the exact start_time and end_time of the container/pods. Or maybe a different solution.
Ah, that makes sense.
What about removing the time fields from the dict under test, and testing these separately? That would both allow us to test the full response, and allow proper time comparison.
Or use a time-freezing testing library (there must exist one).
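A rough sketch of the first option (field names, values, and the time format are assumptions, not the actual scrapyd-k8s test code): pop the time fields, validate them separately, then compare the rest of the response exactly.

```python
from datetime import datetime

TIME_FORMAT = '%Y-%m-%dT%H:%M:%S'  # assumed; the real format may differ

def check_jobinfo(jobinfo):
    info = dict(jobinfo)  # copy so the caller's dict is untouched
    # Time fields are checked separately: they must parse and be ordered.
    start = datetime.strptime(info.pop('start_time'), TIME_FORMAT)
    end = datetime.strptime(info.pop('end_time'), TIME_FORMAT)
    assert start <= end
    # The remaining fields can now be compared exactly, so any newly
    # added field still triggers a test failure.
    assert info == {'project': 'example', 'spider': 'quotes', 'state': 'finished'}

check_jobinfo({
    'project': 'example', 'spider': 'quotes', 'state': 'finished',
    'start_time': '2024-01-01T10:00:00', 'end_time': '2024-01-01T10:05:00',
})
```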
I tried using freezegun to set a specific time during testing, but since we are talking to the Kubernetes/Docker APIs and not using the datetime module, this solution did not work. I will do a bit more digging. The first solution should also work, but it might be better to see if there are alternative solutions.
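Since the timestamps here come from the container runtime rather than from Python's datetime, freezing the clock in the test process cannot reach them. One illustrative alternative (a sketch only; the helper name and format are made up) is a tolerance check against the wall clock instead of an exact match:

```python
from datetime import datetime, timedelta

TIME_FORMAT = '%Y-%m-%dT%H:%M:%S'  # assumed format

def assert_recent(ts_str, max_age=timedelta(minutes=10)):
    # Accept any timestamp within max_age of "now", rather than trying
    # to pin the clock of an external Kubernetes/Docker API.
    ts = datetime.strptime(ts_str, TIME_FORMAT)
    age = datetime.now() - ts
    assert timedelta(0) <= age <= max_age, f'timestamp out of range: {ts_str}'

assert_recent(datetime.now().strftime(TIME_FORMAT))
```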
Ah, that totally makes sense! I'm not for mocking the result in the component under test, so let's fix this in the testing part.
Looking good, thanks!
I see you've added the first unit test. Great!
The API test is not a regular test, as it is meant to run against a running instance. I'm not really sure if it fits well into the general pytest infrastructure like this. What do you think?
I'd split the API tests (which can be run completely separate from scrapyd-k8s, and could even be run against a different implementation, iirc), and the unit tests, into separate CI jobs.
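One way such a split could look, sketched with stdlib unittest-style skipping so the snippet is self-contained (the suite itself uses pytest, and SCRAPYD_URL is a hypothetical variable name): the API tests only run when an environment variable points at a live instance, and the unit-test CI job simply leaves it unset.

```python
import os
import unittest

class ApiTest(unittest.TestCase):
    # Runs only against a live scrapyd-k8s instance; skipped otherwise,
    # so the unit-test job stays independent of any running deployment.
    @unittest.skipUnless(os.environ.get('SCRAPYD_URL'),
                         'needs a running scrapyd-k8s instance')
    def test_daemonstatus(self):
        pass  # would call the live API here
```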
Resolves #11