to_zarr silently loses data when using append_dim, if chunks are different to zarr store #8882

Closed · Fixed by #8459
harryC-space-intelligence opened this issue Mar 27, 2024 · 4 comments
Labels
bug topic-zarr Related to zarr storage library

Comments

@harryC-space-intelligence

What happened?

When writing a chunked DataArray to an existing zarr store, appending along an existing dimension of the store, I have found that some data are not written when multiple array chunks map to a single zarr chunk.

I appreciate it is probably bad practice to have different chunk sizes in my DataArray and zarr store, but I think it's a realistic scenario that needs to be caught.

This may be related to / the same underlying issue as #8371. Perhaps the checks mentioned in #8371 (comment) are somehow getting bypassed? Using zarr's ThreadSynchronizer is the only way I have found to ensure that all the data gets written.
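
For reference, the ThreadSynchronizer workaround looks roughly like this (a sketch against the example below; the synchronizer argument is passed by to_zarr through to zarr):

import zarr

# Sketch of the workaround: a ThreadSynchronizer makes zarr lock each chunk
# while it is read-modified-written, so concurrent dask tasks that target
# the same zarr chunk no longer overwrite each other's updates.
da2.to_zarr('foo.zarr', append_dim='time', mode='a',
            synchronizer=zarr.ThreadSynchronizer())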

What did you expect to happen?

I expected that either

  • to_zarr would recognise the different chunk sizes, and re-chunk or wait for all the chunks to be written (a manual version of the re-chunking workaround is sketched after this list)
  • or an error would be raised, given that the mismatch results in loss of data in an unpredictable way
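
As a manual workaround along the lines of the first option, re-chunking the appended array to match the existing store before writing seems to avoid the problem (a rough sketch using the names from the example below; it assumes the store variable is called 'foo'):

import zarr

# Look up the chunk shape of the existing zarr array and re-chunk the
# DataArray to match it before appending.
store_chunks = zarr.open('foo.zarr')['foo'].chunks          # e.g. (1, 5, 5)
da2.chunk(dict(zip(da2.dims, store_chunks))).to_zarr(
    'foo.zarr', append_dim='time', mode='a')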

Minimal Complete Verifiable Example

import xarray as xr
import numpy as np
from matplotlib import pyplot as plt

x_coords = np.arange(10)
y_coords = np.arange(10)
t_coords = np.array([np.datetime64('2020-01-01').astype('datetime64[ns]')])
data = np.ones((10, 10))

for i in range(4):
    plt.subplot(1, 4, i + 1)

    # Write the initial store with 5x5 spatial chunks.
    da = xr.DataArray(data.reshape((-1, 10, 10)),
                      dims=['time', 'x', 'y'],
                      coords={'x': x_coords, 'y': y_coords, 'time': t_coords},
                      ).chunk({'x': 5, 'y': 5, 'time': 1}).rename('foo')

    da.to_zarr('foo.zarr', mode='w')

    new_time = np.array([np.datetime64('2021-01-01').astype('datetime64[ns]')])

    # Append a second time step, but chunked 1x1 so that many dask chunks
    # map onto each 5x5 zarr chunk.
    da2 = xr.DataArray(data.reshape((-1, 10, 10)),
                       dims=['time', 'x', 'y'],
                       coords={'x': x_coords, 'y': y_coords, 'time': new_time},
                       ).chunk({'x': 1, 'y': 1, 'time': 1}).rename('foo')

    da2.to_zarr('foo.zarr', append_dim='time', mode='a')

    # The appended slice should be all ones, but parts of it come back missing.
    plt.imshow(xr.open_zarr('foo.zarr').isel(time=-1).foo.values)
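
A non-graphical check of the same thing (a sketch; since every written value is 1, anything that reads back as something else indicates a chunk that was never written):

# Count elements of the appended slice that did not make it into the store.
vals = xr.open_zarr('foo.zarr').foo.isel(time=-1).values
print('missing elements:', int((vals != 1).sum()))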

MVCE confirmation

  • Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • Complete example — the example is self-contained, including all data and the text of any traceback.
  • Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • New issue — a search of GitHub Issues suggests this is not a duplicate.
  • Recent environment — the issue occurs with the latest version of xarray and its dependencies.

Relevant log output

No response

Anything else we need to know?

Output from the plots above:

[image: output plots from the loop above]

Environment

INSTALLED VERSIONS

commit: None
python: 3.11.4 | packaged by conda-forge | (main, Jun 10 2023, 18:08:17) [GCC 12.2.0]
python-bits: 64
OS: Linux
OS-release: 5.15.0-1041-azure
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: C.UTF-8
LANG: C.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.14.3
libnetcdf: 4.9.2

xarray: 2024.2.0
pandas: 2.2.1
numpy: 1.26.4
scipy: 1.12.0
netCDF4: 1.6.5
pydap: installed
h5netcdf: 1.3.0
h5py: 3.10.0
Nio: None
zarr: 2.17.1
cftime: 1.6.3
nc_time_axis: 1.4.1
iris: None
bottleneck: 1.3.8
dask: 2024.3.1
distributed: 2024.3.1
matplotlib: 3.8.3
cartopy: 0.22.0
seaborn: 0.13.2
numbagg: None
fsspec: 2024.3.1
cupy: None
pint: 0.23
sparse: 0.15.1
flox: 0.9.5
numpy_groupies: 0.10.2
setuptools: 69.2.0
pip: 24.0
conda: 24.1.2
pytest: 8.1.1
mypy: None
IPython: 8.22.2
sphinx: None

@harryC-space-intelligence harryC-space-intelligence added bug needs triage Issue that has not been reviewed by xarray team member labels Mar 27, 2024

welcome bot commented Mar 27, 2024

Thanks for opening your first issue here at xarray! Be sure to follow the issue template!
If you have an idea for a solution, we would really welcome a Pull Request with proposed changes.
See the Contributing Guide for more.
It may take us a while to respond here, but we really value your contribution. Contributors like you help make xarray better.
Thank you!

@dcherian dcherian added topic-zarr Related to zarr storage library and removed needs triage Issue that has not been reviewed by xarray team member labels Mar 27, 2024
@rsemlal-murmuration

Oh, this seems to be the same problem as the one I raised two days ago in #8876?

@harryC-space-intelligence
Author

harryC-space-intelligence commented Mar 28, 2024

Oh, this seems to be the same problem as the one I raised two days ago in #8876?

Thanks. Sorry, I don't understand enough about what's happening under the hood to know if it's exactly the same problem, but the example is very similar. I hadn't realised it could occur when the chunked dimension is the same as the one given to append_dim.

@rsemlal-murmuration

Yeah, in my example, I didn't consider testing what happens with the other dimensions either. I'm not sure if this could also be a race condition in your scenario (an indication would be if the issue is systematic or occurs randomly).

However, it seems that both issues come from the same root cause: the improper handling of chunking misalignment by to_zarr(mode='a', append_dim=...).

The solution chosen to address this issue might also resolve the other one.
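
To illustrate, here is a very rough sketch of the kind of pre-append guard that could catch this (check_append_chunks is a hypothetical helper, not anything in xarray, and the alignment rule is simplified):

import zarr

def check_append_chunks(da, store, var, append_dim):
    # Hypothetical guard: compare the dask chunks of the array being appended
    # with the chunk shape of the existing zarr array and raise on mismatch
    # instead of silently dropping data.
    zarr_chunks = dict(zip(da.dims, zarr.open(store)[var].chunks))
    for dim, sizes in zip(da.dims, da.chunks):
        if dim == append_dim:
            continue
        aligned = (all(s == zarr_chunks[dim] for s in sizes[:-1])
                   and sizes[-1] <= zarr_chunks[dim])
        if not aligned:
            raise ValueError(
                f"chunks along {dim!r} ({sizes}) are not aligned with the "
                f"zarr store chunks ({zarr_chunks[dim]}); re-chunk before appending")

check_append_chunks(da2, 'foo.zarr', 'foo', 'time')  # would raise for the example above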
