Improve UX/documentation for loading data in cloud storage #6432

Open
gjoseph92 opened this issue Mar 31, 2022 · 0 comments

What is your issue?

I recently tried to use xarray to open some netCDF files stored in a bucket, and was surprised how hard it was to figure out the right incantation to make this work.

The fact that passing an fsspec URL (like "s3://bucket/path/data.zarr") to open_dataset "just works" for zarr is a little misleading, since it makes you think you could do something similar for other types of files. However, this doesn't work for netCDF, GRIB, and I assume most others.
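For context, whether a string path gets remote treatment comes down to a scheme check (xarray calls this is_remote_uri). A rough stdlib-only approximation of that idea — not xarray's actual implementation, and the scheme list here is my own guess — could look like:

```python
from urllib.parse import urlparse

# Hypothetical set of schemes to treat as remote; not xarray's real list.
REMOTE_SCHEMES = {"http", "https", "s3", "gs", "gcs", "az", "abfs"}

def looks_remote(path: str) -> bool:
    """Return True if the path has a URL scheme suggesting remote storage."""
    return urlparse(path).scheme in REMOTE_SCHEMES

looks_remote("s3://bucket/path/data.zarr")  # True
looks_remote("/local/path/data.nc")         # False
```

The point is that a string URL is only half the story: recognizing the path as remote says nothing about whether the chosen engine can actually read from it.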

h5netcdf does work, though, if you pass it an fsspec file-like object (I'm not sure whether other engines support this as well). But to add to the confusion, you can't pass the fsspec.OpenFile you get from fsspec.open; you have to pass a concrete file type like S3File, GCSFile, etc.:

>>> import xarray as xr
>>> import fsspec

>>> url = "s3://noaa-nwm-retrospective-2-1-pds/model_output/1979/197902010100.CHRTOUT_DOMAIN1.comp"  # a netCDF file in s3

You can't use the URL as a string directly:

>>> xr.open_dataset(url, engine='h5netcdf')
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
...
FileNotFoundError: [Errno 2] Unable to open file (unable to open file: name = 's3://noaa-nwm-retrospective-2-1-pds/model_output/1979/197902010100.CHRTOUT_DOMAIN1.comp', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)

Ok, what about fsspec.open?

>>> f = fsspec.open(url)
>>> f
<OpenFile 'noaa-nwm-retrospective-2-1-pds/model_output/1979/197902010100.CHRTOUT_DOMAIN1.comp'>
>>> xr.open_dataset(f, engine='h5netcdf')
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
...

File ~/miniconda3/envs/xarray-buckets/lib/python3.10/site-packages/xarray/backends/common.py:23, in _normalize_path(path)
     21 def _normalize_path(path):
     22     if isinstance(path, os.PathLike):
---> 23         path = os.fspath(path)
     25     if isinstance(path, str) and not is_remote_uri(path):
     26         path = os.path.abspath(os.path.expanduser(path))

File ~/miniconda3/envs/xarray-buckets/lib/python3.10/site-packages/fsspec/core.py:98, in OpenFile.__fspath__(self)
     96 def __fspath__(self):
     97     # may raise if cannot be resolved to local file
---> 98     return self.open().__fspath__()

AttributeError: 'S3File' object has no attribute '__fspath__'

But if you somehow know that an fsspec.OpenFile isn't actually a file-like object, and you double-open it, then it works! (xref #5879 (comment))

>>> s3f = f.open()
>>> s3f
<File-like object S3FileSystem, noaa-nwm-retrospective-2-1-pds/model_output/1979/197902010100.CHRTOUT_DOMAIN1.comp>
>>> xr.open_dataset(s3f, engine='h5netcdf')
<xarray.Dataset>
Dimensions:         (time: 1, reference_time: 1, feature_id: 2776738)
Coordinates:
  * time            (time) datetime64[ns] 1979-02-01T01:00:00
  * reference_time  (reference_time) datetime64[ns] 1979-02-01
  * feature_id      (feature_id) int32 101 179 181 ... 1180001803 1180001804
    latitude        (feature_id) float32 ...
    longitude       (feature_id) float32 ...
...

(And even then, you have to know to use the h5netcdf engine, and not netcdf4 or scipy.)
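Until this is smoothed over, one workaround is a small duck-typed shim that performs the double-open automatically. This is a hypothetical helper (as_file_like is my name, not an xarray or fsspec API), and it assumes that fsspec's OpenFile exposes .open() but not .read(), while the concrete S3File/GCSFile objects it produces are readable:

```python
def as_file_like(obj):
    """Coerce an fsspec.OpenFile into something file-like.

    Hypothetical helper: OpenFile has .open() but no .read(), so call
    .open() to get the concrete S3File/GCSFile. Anything that already
    has .read() passes through unchanged.
    """
    if hasattr(obj, "open") and not hasattr(obj, "read"):
        return obj.open()
    return obj
```

With something like this, `xr.open_dataset(as_file_like(fsspec.open(url)), engine="h5netcdf")` would work whether the caller passes an OpenFile or an already-opened file.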


Some things that might be nice:

  1. Explicit documentation on working with data in cloud storage, perhaps broken down by file type/engine (xref improve docs on zarr + cloud storage #2712). It might be nice to have a table/quick reference of which engines support reading from cloud storage, and how to pass in the URL (string? fsspec file object?)
  2. An informative error linking to these docs when opening fails and is_remote_uri(filename_or_obj) is true
  3. Either make fsspec.OpenFile objects work, so you don't have to do the double-open, or raise an informative error when one is passed in telling you what to do instead.
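For the error-message half of suggestion 3, a guard near the path normalization might suffice. This is a hypothetical sketch (not actual xarray code); it detects an OpenFile by duck typing (.open() and .fs present, .read() absent) so that fsspec stays an optional dependency:

```python
def check_filename_or_obj(filename_or_obj):
    """Hypothetical guard for xarray's backends (not actual xarray code).

    Detect an fsspec.OpenFile by duck typing, without importing fsspec,
    and raise a TypeError telling the user what to do instead.
    """
    if (
        hasattr(filename_or_obj, "open")
        and hasattr(filename_or_obj, "fs")
        and not hasattr(filename_or_obj, "read")
    ):
        raise TypeError(
            "an fsspec.OpenFile was passed, but it is not itself "
            "file-like; open it first, e.g. "
            "xr.open_dataset(fsspec.open(url).open(), engine='h5netcdf')"
        )
    return filename_or_obj
```

Strings and genuinely file-like objects pass through unchanged, so the check is backwards-compatible.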

As more and more data becomes available in cloud storage, newcomers to xarray will increasingly want to use it with remote data. It's great that xarray already supports this in some cases! With a few tweaks to docs and error messages, I think we could turn an experience that took me multiple hours of debugging and reading the source into an easy 30-second one for new users.

cc @martindurant @phobson

gjoseph92 added the needs triage label Mar 31, 2022
mathause added the topic-backends and topic-documentation labels and removed the needs triage label Apr 4, 2022