
Fix s3boto/s3boto3 memory leak introduced in #504 #506

Conversation

jnm
Contributor

@jnm jnm commented Jun 4, 2018

Fixes the leak by truncating the buffer after uploading it. This removes most of the code from #504; let me know if you'd prefer to see cb2e876 reverted.

by truncating the buffer after uploading it. Follows the approach of jschneier#169.
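The fix described above can be sketched as follows. This is a minimal illustration of the truncate-after-upload pattern, not django-storages' actual code; the class and the `upload_part` callable are hypothetical names invented for the example:

```python
import io


class BufferedPartWriter:
    """Sketch of a multipart-upload write buffer (hypothetical API).

    Before the fix, the in-memory buffer kept every flushed part's bytes,
    so memory grew with the total upload size. Truncating after each part
    upload caps memory at roughly one part.
    """

    def __init__(self, upload_part):
        self._buffer = io.BytesIO()
        self._upload_part = upload_part  # callable that receives the part bytes

    def write(self, data: bytes) -> None:
        self._buffer.write(data)

    def flush_part(self) -> None:
        if self._buffer.tell():
            self._buffer.seek(0)
            self._upload_part(self._buffer.read())
            # The fix: rewind and truncate so the buffer releases the
            # uploaded bytes instead of accumulating them indefinitely.
            self._buffer.seek(0)
            self._buffer.truncate()
```

After `flush_part()` returns, `self._buffer` is empty and ready for the next part, so long-running writes no longer grow memory without bound.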
@codecov-io

codecov-io commented Jun 4, 2018

Codecov Report

Merging #506 into master will decrease coverage by 0.11%.
The diff coverage is 100%.


@@            Coverage Diff             @@
##           master     #506      +/-   ##
==========================================
- Coverage   76.44%   76.32%   -0.12%     
==========================================
  Files          11       11              
  Lines        1592     1584       -8     
==========================================
- Hits         1217     1209       -8     
  Misses        375      375
Impacted Files                  Coverage Δ
storages/backends/s3boto3.py    86.96% <100%> (-0.16%) ⬇️
storages/backends/s3boto.py     87.87% <100%> (-0.17%) ⬇️


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update cb2e876...7aa9b73.

@sww314 sww314 added the s3boto label Jun 9, 2018
@jnm
Contributor Author

jnm commented Jun 19, 2018

@jschneier
Owner

Thanks for the poke. Opened the rebased version of this at #546.

@jschneier jschneier closed this Aug 12, 2018