
Commit 21c4687

Merge branch 'main' into to_dataset-silently-drops
2 parents f14ad1a + 0c1ad54

28 files changed: +144 / -128 lines

.binder/environment.yml

Lines changed: 2 additions & 2 deletions

@@ -2,7 +2,7 @@ name: xarray-examples
 channels:
   - conda-forge
 dependencies:
-  - python=3.9
+  - python=3.10
   - boto3
   - bottleneck
   - cartopy
@@ -25,7 +25,7 @@ dependencies:
   - numpy
   - packaging
   - pandas
-  - pint
+  - pint>=0.22
   - pip
   - pooch
   - pydap

ci/requirements/all-but-dask.yml

Lines changed: 1 addition & 1 deletion

@@ -26,7 +26,7 @@ dependencies:
   - numpy
   - packaging
   - pandas
-  - pint<0.21
+  - pint>=0.22
   - pip
   - pseudonetcdf
   - pydap

ci/requirements/environment-py311.yml

Lines changed: 1 addition & 1 deletion

@@ -28,7 +28,7 @@ dependencies:
   - numpy
   - packaging
   - pandas
-  - pint<0.21
+  - pint>=0.22
   - pip
   - pooch
   - pre-commit

ci/requirements/environment-windows-py311.yml

Lines changed: 1 addition & 1 deletion

@@ -25,7 +25,7 @@ dependencies:
   - numpy
   - packaging
   - pandas
-  - pint<0.21
+  - pint>=0.22
   - pip
   - pre-commit
   - pseudonetcdf

ci/requirements/environment-windows.yml

Lines changed: 1 addition & 1 deletion

@@ -25,7 +25,7 @@ dependencies:
   - numpy
   - packaging
   - pandas
-  - pint<0.21
+  - pint>=0.22
   - pip
   - pre-commit
   - pseudonetcdf

ci/requirements/environment.yml

Lines changed: 1 addition & 1 deletion

@@ -29,7 +29,7 @@ dependencies:
   - opt_einsum
   - packaging
   - pandas
-  - pint<0.21
+  - pint>=0.22
   - pip
   - pooch
   - pre-commit

ci/requirements/min-all-deps.yml

Lines changed: 1 addition & 1 deletion

@@ -35,7 +35,7 @@ dependencies:
   - numpy=1.22
   - packaging=21.3
   - pandas=1.4
-  - pint=0.19
+  - pint=0.22
   - pip
   - pseudonetcdf=3.2
   - pydap=3.3
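
The CI environments above all raise the pint floor, and min-all-deps pins it to exactly 0.22. A minimal sanity-check sketch, not part of this commit, for confirming a local environment satisfies the new minimum (``packaging`` is already a dependency in these files):

# Hypothetical check that the installed pint meets the new CI minimum.
import pint
from packaging.version import Version

assert Version(pint.__version__) >= Version("0.22"), pint.__version__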

doc/api.rst

Lines changed: 2 additions & 1 deletion

@@ -557,6 +557,7 @@ Datetimelike properties
    DataArray.dt.seconds
    DataArray.dt.microseconds
    DataArray.dt.nanoseconds
+   DataArray.dt.total_seconds

 **Timedelta methods**:

@@ -602,7 +603,7 @@ Dataset methods
    Dataset.as_numpy
    Dataset.from_dataframe
    Dataset.from_dict
-   Dataset.to_array
+   Dataset.to_dataarray
    Dataset.to_dataframe
    Dataset.to_dask_dataframe
    Dataset.to_dict

doc/howdoi.rst

Lines changed: 1 addition & 1 deletion

@@ -36,7 +36,7 @@ How do I ...
    * - rename a variable, dimension or coordinate
      - :py:meth:`Dataset.rename`, :py:meth:`DataArray.rename`, :py:meth:`Dataset.rename_vars`, :py:meth:`Dataset.rename_dims`,
    * - convert a DataArray to Dataset or vice versa
-     - :py:meth:`DataArray.to_dataset`, :py:meth:`Dataset.to_array`, :py:meth:`Dataset.to_stacked_array`, :py:meth:`DataArray.to_unstacked_dataset`
+     - :py:meth:`DataArray.to_dataset`, :py:meth:`Dataset.to_dataarray`, :py:meth:`Dataset.to_stacked_array`, :py:meth:`DataArray.to_unstacked_dataset`
    * - extract variables that have certain attributes
      - :py:meth:`Dataset.filter_by_attrs`
    * - extract the underlying array (e.g. NumPy or Dask arrays)

doc/user-guide/reshaping.rst

Lines changed: 6 additions & 6 deletions

@@ -59,11 +59,11 @@ use :py:meth:`~xarray.DataArray.squeeze`
 Converting between datasets and arrays
 --------------------------------------

-To convert from a Dataset to a DataArray, use :py:meth:`~xarray.Dataset.to_array`:
+To convert from a Dataset to a DataArray, use :py:meth:`~xarray.Dataset.to_dataarray`:

 .. ipython:: python

-    arr = ds.to_array()
+    arr = ds.to_dataarray()
     arr

 This method broadcasts all data variables in the dataset against each other,
@@ -77,7 +77,7 @@ To convert back from a DataArray to a Dataset, use

     arr.to_dataset(dim="variable")

-The broadcasting behavior of ``to_array`` means that the resulting array
+The broadcasting behavior of ``to_dataarray`` means that the resulting array
 includes the union of data variable dimensions:

 .. ipython:: python
@@ -88,7 +88,7 @@ includes the union of data variable dimensions:
     ds2

     # the resulting array has 6 elements
-    ds2.to_array()
+    ds2.to_dataarray()

 Otherwise, the result could not be represented as an orthogonal array.

@@ -161,8 +161,8 @@ arrays as inputs. For datasets with only one variable, we only need ``stack``
 and ``unstack``, but combining multiple variables in a
 :py:class:`xarray.Dataset` is more complicated. If the variables in the dataset
 have matching numbers of dimensions, we can call
-:py:meth:`~xarray.Dataset.to_array` and then stack along the the new coordinate.
-But :py:meth:`~xarray.Dataset.to_array` will broadcast the dataarrays together,
+:py:meth:`~xarray.Dataset.to_dataarray` and then stack along the the new coordinate.
+But :py:meth:`~xarray.Dataset.to_dataarray` will broadcast the dataarrays together,
 which will effectively tile the lower dimensional variable along the missing
 dimensions. The method :py:meth:`xarray.Dataset.to_stacked_array` allows
 combining variables of differing dimensions without this wasteful copying while
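
The guide's point about wasteful broadcasting versus ``to_stacked_array`` can be made concrete. A short sketch, loosely following the ``to_stacked_array`` docstring (the variable names are illustrative):

# Sketch: to_dataarray() tiles "b" along "y"; to_stacked_array() does not.
import xarray as xr

data = xr.Dataset(
    data_vars={"a": (("x", "y"), [[0, 1, 2], [3, 4, 5]]), "b": ("x", [6, 7])},
    coords={"y": ["u", "v", "w"]},
)

broadcast = data.to_dataarray()                           # shape (2, 2, 3)
stacked = data.to_stacked_array("z", sample_dims=["x"])   # shape (2, 4), no tiling
print(broadcast.shape, stacked.shape)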

doc/whats-new.rst

Lines changed: 15 additions & 3 deletions

@@ -24,10 +24,13 @@ New Features

 - Use `opt_einsum <https://optimized-einsum.readthedocs.io/en/stable/>`_ for :py:func:`xarray.dot` by default if installed.
   By `Deepak Cherian <https://github.com/dcherian>`_. (:issue:`7764`, :pull:`8373`).
+- Add ``DataArray.dt.total_seconds()`` method to match the Pandas API. (:pull:`8435`).
+  By `Ben Mares <https://github.com/maresb>`_.

 Breaking changes
 ~~~~~~~~~~~~~~~~

+- Bump minimum tested pint version to ``>=0.22``. By `Deepak Cherian <https://github.com/dcherian>`_.

 Deprecations
 ~~~~~~~~~~~~
@@ -39,6 +42,15 @@ Deprecations
   this was one place in the API where dimension positions were used.
   (:pull:`8341`)
   By `Maximilian Roos <https://github.com/max-sixty>`_.
+- Rename :py:meth:`Dataset.to_array` to :py:meth:`Dataset.to_dataarray` for
+  consistency with :py:meth:`DataArray.to_dataset` &
+  :py:func:`open_dataarray` functions. This is a "soft" deprecation — the
+  existing methods work and don't raise any warnings, given the relatively small
+  benefits of the change.
+  By `Maximilian Roos <https://github.com/max-sixty>`_.
+- Finally remove ``keep_attrs`` kwarg from :py:meth:`DataArray.resample` and
+  :py:meth:`Dataset.resample`. These were deprecated a long time ago.
+  By `Deepak Cherian <https://github.com/dcherian>`_.

 Bug fixes
 ~~~~~~~~~
@@ -6710,7 +6722,7 @@ Backwards incompatible changes
 Enhancements
 ~~~~~~~~~~~~

-- New ``xray.Dataset.to_array`` and enhanced
+- New ``xray.Dataset.to_dataarray`` and enhanced
   ``xray.DataArray.to_dataset`` methods make it easy to switch back
   and forth between arrays and datasets:

@@ -6721,8 +6733,8 @@ Enhancements
             coords={"c": 42},
             attrs={"Conventions": "None"},
         )
-        ds.to_array()
-        ds.to_array().to_dataset(dim="variable")
+        ds.to_dataarray()
+        ds.to_dataarray().to_dataset(dim="variable")

 - New ``xray.Dataset.fillna`` method to fill missing values, modeled
   off the pandas method of the same name:
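
Because the whats-new entry describes a "soft" deprecation, both spellings are expected to work on this branch without warnings; a tiny illustrative check (not part of the commit):

# Per the deprecation note above, to_array() still works and emits no warning.
import warnings
import xarray as xr

ds = xr.Dataset({"a": ("x", [1, 2, 3])})

with warnings.catch_warnings():
    warnings.simplefilter("error")   # would raise if to_array() warned
    old = ds.to_array()
new = ds.to_dataarray()
assert old.identical(new)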

xarray/core/_typed_ops.py

Lines changed: 20 additions & 0 deletions

@@ -83,6 +83,10 @@ def __eq__(self, other: DsCompatible) -> Self:  # type:ignore[override]
     def __ne__(self, other: DsCompatible) -> Self:  # type:ignore[override]
         return self._binary_op(other, nputils.array_ne)

+    # When __eq__ is defined but __hash__ is not, then an object is unhashable,
+    # and it should be declared as follows:
+    __hash__: None  # type:ignore[assignment]
+
     def __radd__(self, other: DsCompatible) -> Self:
         return self._binary_op(other, operator.add, reflexive=True)

@@ -291,6 +295,10 @@ def __eq__(self, other: DaCompatible) -> Self:  # type:ignore[override]
     def __ne__(self, other: DaCompatible) -> Self:  # type:ignore[override]
         return self._binary_op(other, nputils.array_ne)

+    # When __eq__ is defined but __hash__ is not, then an object is unhashable,
+    # and it should be declared as follows:
+    __hash__: None  # type:ignore[assignment]
+
     def __radd__(self, other: DaCompatible) -> Self:
         return self._binary_op(other, operator.add, reflexive=True)

@@ -643,6 +651,10 @@ def __ne__(self, other: VarCompatible) -> Self:
     def __ne__(self, other: VarCompatible) -> Self | T_DataArray:
         return self._binary_op(other, nputils.array_ne)

+    # When __eq__ is defined but __hash__ is not, then an object is unhashable,
+    # and it should be declared as follows:
+    __hash__: None  # type:ignore[assignment]
+
     def __radd__(self, other: VarCompatible) -> Self:
         return self._binary_op(other, operator.add, reflexive=True)

@@ -851,6 +863,10 @@ def __eq__(self, other: GroupByCompatible) -> Dataset:  # type:ignore[override]
     def __ne__(self, other: GroupByCompatible) -> Dataset:  # type:ignore[override]
         return self._binary_op(other, nputils.array_ne)

+    # When __eq__ is defined but __hash__ is not, then an object is unhashable,
+    # and it should be declared as follows:
+    __hash__: None  # type:ignore[assignment]
+
     def __radd__(self, other: GroupByCompatible) -> Dataset:
         return self._binary_op(other, operator.add, reflexive=True)

@@ -973,6 +989,10 @@ def __eq__(self, other: T_Xarray) -> T_Xarray:  # type:ignore[override]
     def __ne__(self, other: T_Xarray) -> T_Xarray:  # type:ignore[override]
         return self._binary_op(other, nputils.array_ne)

+    # When __eq__ is defined but __hash__ is not, then an object is unhashable,
+    # and it should be declared as follows:
+    __hash__: None  # type:ignore[assignment]
+
     def __radd__(self, other: T_Xarray) -> T_Xarray:
         return self._binary_op(other, operator.add, reflexive=True)
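
The comment added in each of these classes documents standard Python behavior: defining ``__eq__`` without ``__hash__`` implicitly sets ``__hash__`` to ``None``, so instances become unhashable; the annotation makes that visible to type checkers. A plain-Python illustration (not xarray code):

# Defining __eq__ alone makes instances unhashable.
class Elementwise:
    def __eq__(self, other):
        return [True, False]   # element-wise-style comparison, like xarray objects

try:
    hash(Elementwise())
except TypeError as err:
    print(err)                 # "unhashable type: 'Elementwise'"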

xarray/core/accessor_dt.py

Lines changed: 14 additions & 0 deletions

@@ -74,6 +74,8 @@ def _access_through_series(values, name):
     if name == "season":
         months = values_as_series.dt.month.values
         field_values = _season_from_months(months)
+    elif name == "total_seconds":
+        field_values = values_as_series.dt.total_seconds().values
     elif name == "isocalendar":
         # special NaT-handling can be removed when
         # https://github.com/pandas-dev/pandas/issues/54657 is resolved
@@ -574,6 +576,13 @@ class TimedeltaAccessor(TimeAccessor[T_DataArray]):
            43200, 64800])
     Coordinates:
       * time     (time) timedelta64[ns] 1 days 00:00:00 ... 5 days 18:00:00
+    >>> ts.dt.total_seconds()
+    <xarray.DataArray 'total_seconds' (time: 20)>
+    array([ 86400., 108000., 129600., 151200., 172800., 194400., 216000.,
+           237600., 259200., 280800., 302400., 324000., 345600., 367200.,
+           388800., 410400., 432000., 453600., 475200., 496800.])
+    Coordinates:
+      * time     (time) timedelta64[ns] 1 days 00:00:00 ... 5 days 18:00:00
     """

     @property
@@ -596,6 +605,11 @@ def nanoseconds(self) -> T_DataArray:
         """Number of nanoseconds (>= 0 and less than 1 microsecond) for each element"""
         return self._date_field("nanoseconds", np.int64)

+    # Not defined as a property in order to match the Pandas API
+    def total_seconds(self) -> T_DataArray:
+        """Total duration of each element expressed in seconds."""
+        return self._date_field("total_seconds", np.float64)
+

 class CombinedDatetimelikeAccessor(
     DatetimeAccessor[T_DataArray], TimedeltaAccessor[T_DataArray]
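
A brief usage sketch for the accessor method added above; the input values are chosen so the expected seconds are easy to verify by hand:

# Hypothetical usage of the new DataArray.dt.total_seconds().
import numpy as np
import xarray as xr

delta = xr.DataArray(np.array([1, 90], dtype="timedelta64[m]"), dims="t")
print(delta.dt.total_seconds().values)   # floats: 60.0 and 5400.0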

xarray/core/common.py

Lines changed: 1 addition & 9 deletions

@@ -860,7 +860,6 @@ def _resample(
         base: int | None,
         offset: pd.Timedelta | datetime.timedelta | str | None,
         origin: str | DatetimeLike,
-        keep_attrs: bool | None,
         loffset: datetime.timedelta | str | None,
         restore_coord_dims: bool | None,
         **indexer_kwargs: str,
@@ -989,13 +988,6 @@ def _resample(
         from xarray.core.pdcompat import _convert_base_to_offset
         from xarray.core.resample import RESAMPLE_DIM

-        if keep_attrs is not None:
-            warnings.warn(
-                "Passing ``keep_attrs`` to ``resample`` has no effect and will raise an"
-                " error in xarray 0.20. Pass ``keep_attrs`` directly to the applied"
-                " function, e.g. ``resample(...).mean(keep_attrs=True)``."
-            )
-
         # note: the second argument (now 'skipna') use to be 'dim'
         if (
             (skipna is not None and not isinstance(skipna, bool))
@@ -1173,7 +1165,7 @@ def _dataset_indexer(dim: Hashable) -> DataArray:
                 var for var in cond if dim not in cond[var].dims
             )
             keepany = cond_wdim.any(dim=(d for d in cond.dims.keys() if d != dim))
-            return keepany.to_array().any("variable")
+            return keepany.to_dataarray().any("variable")

         _get_indexer = (
             _dataarray_indexer if isinstance(cond, DataArray) else _dataset_indexer
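
The removed warning already named the supported alternative: pass ``keep_attrs`` to the applied reduction rather than to ``resample`` itself. A small sketch of that pattern (the data and attrs are illustrative):

# keep_attrs now goes on the reduction, not on resample().
import numpy as np
import pandas as pd
import xarray as xr

da = xr.DataArray(
    np.arange(4.0),
    dims="time",
    coords={"time": pd.date_range("2000-01-01", periods=4, freq="6H")},
    attrs={"units": "K"},
)

daily = da.resample(time="1D").mean(keep_attrs=True)
assert daily.attrs == {"units": "K"}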

xarray/core/computation.py

Lines changed: 3 additions & 1 deletion

@@ -1603,7 +1603,9 @@ def cross(
     >>> ds_a = xr.Dataset(dict(x=("dim_0", [1]), y=("dim_0", [2]), z=("dim_0", [3])))
     >>> ds_b = xr.Dataset(dict(x=("dim_0", [4]), y=("dim_0", [5]), z=("dim_0", [6])))
     >>> c = xr.cross(
-    ...     ds_a.to_array("cartesian"), ds_b.to_array("cartesian"), dim="cartesian"
+    ...     ds_a.to_dataarray("cartesian"),
+    ...     ds_b.to_dataarray("cartesian"),
+    ...     dim="cartesian",
     ... )
     >>> c.to_dataset(dim="cartesian")
     <xarray.Dataset>
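
The reformatted doctest above should be runnable as-is once the renamed method is available; a self-contained version for convenience:

# Mirrors the xr.cross docstring; (1, 2, 3) x (4, 5, 6) = (-3, 6, -3).
import xarray as xr

ds_a = xr.Dataset(dict(x=("dim_0", [1]), y=("dim_0", [2]), z=("dim_0", [3])))
ds_b = xr.Dataset(dict(x=("dim_0", [4]), y=("dim_0", [5]), z=("dim_0", [6])))

c = xr.cross(
    ds_a.to_dataarray("cartesian"),
    ds_b.to_dataarray("cartesian"),
    dim="cartesian",
)
print(c.to_dataset(dim="cartesian"))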

xarray/core/dataarray.py

Lines changed: 0 additions & 2 deletions

@@ -7034,7 +7034,6 @@ def resample(
         base: int | None = None,
         offset: pd.Timedelta | datetime.timedelta | str | None = None,
         origin: str | DatetimeLike = "start_day",
-        keep_attrs: bool | None = None,
         loffset: datetime.timedelta | str | None = None,
         restore_coord_dims: bool | None = None,
         **indexer_kwargs: str,
@@ -7156,7 +7155,6 @@ def resample(
             base=base,
             offset=offset,
             origin=origin,
-            keep_attrs=keep_attrs,
             loffset=loffset,
             restore_coord_dims=restore_coord_dims,
             **indexer_kwargs,
