Appending data to a dataset stored in Zarr format produces PermissionError or NaN values in the final result #5511
Thanks for the report! This does look broken, which I was able to verify by running your code. My guess is that something in Xarray's logic for appending datasets implicitly assumes that the existing dataset has been written in complete "chunks", which is not the case here.
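The partial-chunk overlap described in this hypothesis can be illustrated with a small sketch (pure arithmetic, using the sizes from the reproduction in this thread: 152 existing elements, chunks of 30, 156 appended). The helper function here is mine, not part of any library:

```python
# Hypothetical illustration of the partial-chunk overlap: the existing
# array has 152 elements in chunks of 30, so the last chunk (indices
# 150-151) is only partially filled; an append starting at 152 writes
# into that same chunk, so old and new writes can collide.

def chunks_touched(start, stop, chunk_size):
    """Return the chunk indices covered by the half-open range [start, stop)."""
    return list(range(start // chunk_size, (stop - 1) // chunk_size + 1))

existing = chunks_touched(0, 152, 30)      # chunks holding the original data
appended = chunks_touched(152, 308, 30)    # chunks the append writes into

shared = set(existing) & set(appended)
print(shared)  # chunk 5 (elements 150-179) is written by both sides
```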
Hi again, I checked the behavior of Zarr and Dask a little more and found that the problem only occurs when the `lock` option of the `da.store` method is set to `None` or `False`. Below you can find an example:

```python
import numpy as np
import zarr
import dask.array as da

# Write a small zarr array filled with 42.2
z1 = zarr.open('data/example.zarr', mode='w', shape=(152,), chunks=(30,), dtype='f4')
z1[:] = 42.2

# Resize the array to make room for the appended data
z2 = zarr.open('data/example.zarr', mode='a')
z2.resize((308,))

# New data to append
append_data = da.from_array(np.array([50.3] * 156), chunks=(30,))

# If you pass lock=None or lock=False you will get the PermissionError
# or some 0s in the final result, so I think this is the problem when
# Xarray writes to Zarr with Dask (I saw in the code that it uses
# lock=None by default). With lock=True all the problems disappear.
da.store(append_data, z2, regions=[(slice(152, 308),)], lock=None)

# The result can contain many 0s or an error can be raised
print(z2[:])
```

Hope this helps to fix the bug.
Hi @shoyer, sorry for bothering you with this issue again. I know it is old now, but I have been dealing with it again in the past few days and have also noticed the same problem when using the `region` parameter, so based on this issue I opened on Zarr (zarr-developers/zarr-python#1414) I think it would be good to implement one of these options to solve the problem:
If we can find cases where we know concurrent writes are unsafe, we can definitely start raising errors. Better to be safe than to suffer from silent data corruption!
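A hypothetical sketch of the kind of check suggested here (the function name and signature are mine, not an actual xarray or zarr API): reject a region write whose edges are not aligned to chunk boundaries, since a misaligned edge forces two writers to share a chunk.

```python
def check_region_write_safe(start, stop, chunk_size, array_size):
    """Raise if a concurrent region write [start, stop) could share a
    chunk with data outside the region. Hypothetical helper, not an
    actual library function."""
    if start % chunk_size != 0:
        raise NotImplementedError(
            f"region start {start} is not aligned to chunk size {chunk_size}: "
            "a concurrent write could corrupt the shared chunk"
        )
    if stop % chunk_size != 0 and stop != array_size:
        raise NotImplementedError(
            f"region stop {stop} is not aligned to chunk size {chunk_size}: "
            "a concurrent write could corrupt the shared chunk"
        )

check_region_write_safe(150, 308, 30, 308)      # aligned start, stop at array end: fine
try:
    check_region_write_safe(152, 308, 30, 308)  # the case from this issue
except NotImplementedError as e:
    print(e)
```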
What happened:
I was trying to append new data to an existing Zarr store holding a time-series dataset (a financial index) and started to notice that it sometimes produces a PermissionError or that some NaN values randomly appear. After checking, the problem looks like something related to multiple threads/processes trying to write to the same chunk (probably the last ones, which have a different size).
What you expected to happen:
I would like to be able to store the data correctly, or it should be sufficient to raise a NotImplementedError in case this kind of append is not supported.
Minimal Complete Verifiable Example:
You probably have to run this code many times to reproduce the errors; basically, you will see the PermissionError or an increase in the number of NaNs (it should always be 0).
Anything else we need to know?:
Environment:
Output of xr.show_versions()
INSTALLED VERSIONS
commit: None
python: 3.8.5 (default, Sep 3 2020, 21:29:08) [MSC v.1916 64 bit (AMD64)]
python-bits: 64
OS: Windows
OS-release: 10
machine: AMD64
processor: Intel64 Family 6 Model 165 Stepping 2, GenuineIntel
byteorder: little
LC_ALL: None
LANG: None
LOCALE: ('es_ES', 'cp1252')
libhdf5: 1.10.4
libnetcdf: None
xarray: 0.18.2
pandas: 1.2.4
numpy: 1.20.2
scipy: 1.6.2
netCDF4: None
pydap: None
h5netcdf: None
h5py: 2.10.0
Nio: None
zarr: 2.8.3
cftime: 1.5.0
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.2
dask: 2021.06.0
distributed: 2021.06.1
matplotlib: 3.3.4
cartopy: None
seaborn: 0.11.1
numbagg: None
pint: None
setuptools: 52.0.0.post20210125
pip: 21.1.2
conda: 4.10.1
pytest: 6.2.4
IPython: 7.22.0
sphinx: 4.0.1