
Chunkwise netcdf export #1092

Closed · wants to merge 5 commits

Conversation

nvogtvincent (Contributor)
Suggestion (WIP) for automatically 'chunking' the netcdf export step when the intermediate variables would exceed available system memory. Importantly, the code below currently assumes that the number of unique time steps in the first chunk equals the number of unique time steps in all chunks, which will often, but not always, be true. I am not currently sure how to compute this cheaply.
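For illustration only, a minimal sketch of what memory-bounded chunked export could look like. It is not the code from this PR: the per-timestep loader `load_step`, the variable names (`lon`, `lat`, `time`), and the use of `psutil` to query available memory are all assumptions made for the example.

```python
# Minimal sketch of the chunked-export idea discussed above, not the actual
# Parcels implementation.  The variable names ("lon", "lat", "time"), the
# per-timestep array layout, and the `load_step` callable are hypothetical
# stand-ins for the intermediate npy dumps that the real export step reads.
import numpy as np
import netCDF4
import psutil


def export_chunked(step_files, load_step, n_particles, outfile, safety=0.5):
    """Write particle output to NetCDF time-step by time-step, in memory-sized chunks.

    step_files : ordered list of per-timestep dump files
    load_step  : callable returning an (n_particles, 3) array of lon, lat, time
    safety     : fraction of available memory a single chunk may occupy
    """
    bytes_per_step = n_particles * 3 * np.dtype("f8").itemsize
    budget = psutil.virtual_memory().available * safety
    steps_per_chunk = max(1, int(budget // bytes_per_step))

    with netCDF4.Dataset(outfile, "w") as ds:
        ds.createDimension("traj", n_particles)
        ds.createDimension("obs", None)  # unlimited dimension, grown chunk by chunk
        ncvars = {name: ds.createVariable(name, "f8", ("traj", "obs"))
                  for name in ("lon", "lat", "time")}

        for start in range(0, len(step_files), steps_per_chunk):
            chunk = step_files[start:start + steps_per_chunk]
            # Stack this chunk's time steps into an (n_particles, 3, n_steps) block
            block = np.stack([load_step(f) for f in chunk], axis=-1)
            for i, name in enumerate(("lon", "lat", "time")):
                ncvars[name][:, start:start + len(chunk)] = block[:, i, :]
```

Note that this sketch sidesteps the caveat raised above by treating every dump file as exactly one observation; the real difficulty described in the comment is knowing how many unique time steps each chunk contains before reading it.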

nvogtvincent (Contributor, Author)
PR #1095 solves a problem remaining in this PR: not knowing the number of unique timesteps, which is now available through the new time_steps variable introduced in #1095.

erikvansebille (Member)
Closing this PR, as the conversion from the npy dumps to NetCDF will no longer be necessary once we save directly to zarr format (#1199).
