
Add performance tests to AWS SDK Extension #243

Conversation

@NathanielRN (Contributor) commented Dec 10, 2020

Description

Follow-up to the Core repo PR that added a way to run performance tests: open-telemetry/opentelemetry-python#1443

Type of change


  • New feature (non-breaking change which adds functionality)

How Has This Been Tested?

Running tox -e test-sdkextension-aws locally, I can produce the following results:

test-sdkextension-aws run-test: commands[0] | pytest
================================================================================================================================================= test session starts =================================================================================================================================================
platform darwin -- Python 3.8.2, pytest-6.1.2, py-1.9.0, pluggy-0.13.1 -- /Users/enowell/git/opentelemetry-python-contrib/.tox/test-sdkextension-aws/bin/python
cachedir: .tox/test-sdkextension-aws/.pytest_cache
benchmark: 3.2.3 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /Users/enowell/git/opentelemetry-python-contrib, configfile: pytest.ini
plugins: benchmark-3.2.3
collected 24 items

performance/benchmarks/trace/test_benchmark_aws_xray_ids_generator.py::test_generate_xray_trace_id PASSED                                                                                                                                                                                                       [  4%]
performance/benchmarks/trace/test_benchmark_aws_xray_ids_generator.py::test_generate_xray_span_id PASSED                                                                                                                                                                                                        [  8%]
performance/benchmarks/trace/propagation/test_benchmark_aws_xray_format.py::test_extract_single_header PASSED                                                                                                                                                                                                   [ 12%]
performance/benchmarks/trace/propagation/test_benchmark_aws_xray_format.py::test_inject_empty_context PASSED

...

-------------------------------------------------------------------------------------------------- benchmark: 4 tests -------------------------------------------------------------------------------------------------
Name (time in ns)                      Min                    Max                  Mean                StdDev                Median                 IQR            Outliers  OPS (Kops/s)            Rounds  Iterations
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_generate_xray_span_id        300.2500 (1.0)       4,729.1500 (1.0)        358.2244 (1.0)        117.4173 (1.0)        329.7750 (1.0)       66.6000 (1.42)    2505;2451    2,791.5467 (1.0)      105608          20
test_generate_xray_trace_id       576.0000 (1.92)     58,391.0000 (12.35)      765.5374 (2.14)       609.1449 (5.19)       739.0000 (2.24)      47.0000 (1.0)    1241;13007    1,306.2719 (0.47)     102533           1
test_inject_empty_context       1,379.0000 (4.59)     59,260.0000 (12.53)    1,899.8021 (5.30)     1,099.5063 (9.36)     1,872.0000 (5.68)     305.0000 (6.49)    1598;2308      526.3706 (0.19)      65433           1
test_extract_single_header      8,246.0000 (27.46)    29,191.0000 (6.17)     9,066.2906 (25.31)    2,448.8693 (20.86)    8,669.0000 (26.29)    301.5000 (6.41)          3;8      110.2987 (0.04)        117           1
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Legend:
  Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
  OPS: Operations Per Second, computed as 1 / Mean

Does This PR Require a Core Repo Change?

  • No.

Checklist:

See contributing.md for the style guide, changelog guidelines, and more.

  • Followed the style guidelines of this project
  • [ ] Changelogs have been updated
  • [ ] Unit tests have been added
  • Documentation has been updated

@NathanielRN NathanielRN requested review from a team, toumorokoshi and aabmass and removed request for a team December 10, 2020 01:50
@NathanielRN (Contributor, Author)

@codeboten @lzchen Would it be possible for us to get a gh-pages branch on the OpenTelemetry-Contrib repo? :)

@NathanielRN NathanielRN force-pushed the add-aws-sdk-extension-performance-tests branch from f427f9c to 49d9390 Compare December 10, 2020 01:55
@toumorokoshi (Member) left a comment

Cool! interested to see this in action.

@@ -49,6 +49,12 @@ See
[`tox.ini`](https://github.com/open-telemetry/opentelemetry-python-contrib/blob/master/tox.ini)
for more detail on available tox commands.

### Benchmarks

Performance progression of benchmarks for packages distributed by OpenTelemetry Python can be viewed as a [graph of throughput vs commit history](https://open-telemetry.github.io/opentelemetry-python-contrib/benchmarks/index.html). From this page, you can download a JSON file with the performance results.
Member:
Suggested change
Performance progression of benchmarks for packages distributed by OpenTelemetry Python can be viewed as a [graph of throughput vs commit history](https://open-telemetry.github.io/opentelemetry-python-contrib/benchmarks/index.html). From this page, you can download a JSON file with the performance results.
Performance benchmarks for packages distributed by OpenTelemetry Python can be viewed as a [graph of throughput vs commit history](https://open-telemetry.github.io/opentelemetry-python-contrib/benchmarks/index.html). From this page, you can download a JSON file with the performance results.

I'm a bit confused about "throughput". What does that mean? The number of iterations over some time period?

Member:

Suggesting against "progression" as I think that implies that we are trying to trend benchmarks one direction or another. I don't think we're consciously trying to improve performance, but we are trying to monitor it so it doesn't get out of control.

NathanielRN (Contributor, Author):

From what I understand, pytest-benchmark runs the test as many times as it can in a second (this is configurable). So the graph this action outputs shows how many times it was able to run this function (in the mean case) per second as a measure of performance and shows how this metric changes between commits.

[image: graph of benchmark results per commit]
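The mechanism described above can be sketched in plain Python. This is an illustrative approximation, not pytest-benchmark's internals: time a callable over many rounds, then derive the statistics the table reports, with OPS computed as 1 / mean exactly as in the output's legend.

```python
import statistics
import time


def run_benchmark(func, rounds=1000):
    """Time `func` over many rounds and derive benchmark-style stats:
    min/max/mean in seconds, plus OPS computed as 1 / mean."""
    timings = []
    for _ in range(rounds):
        start = time.perf_counter()
        func()
        timings.append(time.perf_counter() - start)
    mean = statistics.mean(timings)
    return {
        "min": min(timings),
        "max": max(timings),
        "mean": mean,
        "ops": 1.0 / mean,  # operations per second, as in the legend
    }


stats = run_benchmark(lambda: sum(range(100)))
```

The graph of "throughput vs commit history" then plots this OPS value for each test at each commit.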

NathanielRN (Contributor, Author):

@toumorokoshi That makes sense! What if we called it "history" though, to confirm that this is with respect to commits (and not a load test or something that would have "time" on the x-axis)?

Suggested change
Performance progression of benchmarks for packages distributed by OpenTelemetry Python can be viewed as a [graph of throughput vs commit history](https://open-telemetry.github.io/opentelemetry-python-contrib/benchmarks/index.html). From this page, you can download a JSON file with the performance results.
Performance history of benchmarks for packages distributed by OpenTelemetry Python can be viewed as a [graph of throughput vs commit history](https://open-telemetry.github.io/opentelemetry-python-contrib/benchmarks/index.html). From this page, you can download a JSON file with the performance results.

.gitignore Outdated
@@ -35,6 +35,9 @@ coverage.xml
.cache
htmlcov

# Benchmarks
*.json
Member:

Same comment as core PR. Otherwise LGTM! 🚢

@NathanielRN NathanielRN force-pushed the add-aws-sdk-extension-performance-tests branch 2 times, most recently from 1b512e9 to ced5b69 Compare December 11, 2020 02:46
@aabmass aabmass added the Skip Changelog PRs that do not require a CHANGELOG.md entry label Dec 11, 2020
@codeboten (Contributor) left a comment

Thanks for adding these, I've enabled GitHub Pages for this repo.

@NathanielRN (Contributor, Author)

@codeboten Thanks for enabling gh-pages! Unfortunately, I don't think it will show up, because we decided in open-telemetry/opentelemetry-python#1443 to just push to master, since that's where the Read the Docs build is... so I'll work on adding a Read the Docs build that uses root on Contrib today so we can see the graphs.

@NathanielRN NathanielRN force-pushed the add-aws-sdk-extension-performance-tests branch from 0e69d10 to 9ac09ef Compare December 11, 2020 17:21
Co-authored-by: Aaron Abbott <aaronabbott@google.com>
@NathanielRN NathanielRN force-pushed the add-aws-sdk-extension-performance-tests branch 2 times, most recently from 1ba5c64 to 43804d2 Compare December 11, 2020 17:23
@codeboten codeboten merged commit 91bfc9a into open-telemetry:master Dec 11, 2020
@NathanielRN NathanielRN deleted the add-aws-sdk-extension-performance-tests branch July 21, 2021 16:39
Labels
Skip Changelog PRs that do not require a CHANGELOG.md entry
5 participants