xarray.DataArray

class xarray.DataArray(data=<NA>, coords=None, dims=None, name=None, attrs=None, indexes=None, fastpath=False)

N-dimensional array with labeled coordinates and dimensions.

DataArray provides a wrapper around numpy ndarrays that uses labeled dimensions and coordinates to support metadata-aware operations. The API is similar to that for the pandas Series or DataFrame, but DataArray objects can have any number of dimensions, and their contents have fixed data types.

Additional features over raw numpy arrays:

  • Apply operations over dimensions by name: x.sum('time').

  • Select or assign values by integer location (like numpy): x[:10] or by label (like pandas): x.loc['2014-01-01'] or x.sel(time='2014-01-01').

  • Mathematical operations (e.g., x - y) vectorize across multiple dimensions (known in numpy as “broadcasting”) based on dimension names, regardless of their original order.

  • Keep track of arbitrary metadata in the form of a Python dictionary: x.attrs

  • Convert to a pandas Series: x.to_series().

Getting items from or doing mathematical operations with a DataArray always returns another DataArray.
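
For illustration, here is a minimal sketch of these features; the dimension names, coordinate values, and attributes below are made up for the example:

>>> import numpy as np
>>> import pandas as pd
>>> import xarray as xr
>>> x = xr.DataArray(
...     np.arange(6).reshape(2, 3),
...     dims=["space", "time"],
...     coords={"time": pd.date_range("2014-01-01", periods=3)},
...     attrs={"units": "K"},
... )
>>> total = x.sum("time")  # apply an operation over a dimension by name
>>> day_one = x.sel(time="2014-01-01")  # select by label, like pandas
>>> first_two = x[:, :2]  # select by integer location, like numpy
>>> x.attrs["units"]  # arbitrary metadata travels with the array
'K'
>>> s = x.to_series()  # convert to a pandas Series

Each of total, day_one, and first_two is itself a DataArray, as noted above.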

Parameters:
  • data (array_like) – Values for this array. Must be a numpy.ndarray, ndarray-like, or castable to an ndarray. If a self-described xarray or pandas object, attempts are made to use this array’s metadata to fill in other unspecified arguments. A view of the array’s data is used instead of a copy if possible.

  • coords (sequence or dict of array_like or Coordinates, optional) – Coordinates (tick labels) to use for indexing along each dimension. The following notations are accepted:

    • mapping {dimension name: array-like}

    • sequence of tuples that are valid arguments for xarray.Variable(): (dims, data), (dims, data, attrs), or (dims, data, attrs, encoding)

    Additionally, it is possible to define a coord whose name does not match the dimension name, or a coord based on multiple dimensions, with one of the following notations:

    • mapping {coord name: DataArray}

    • mapping {coord name: Variable}

    • mapping {coord name: (dimension name, array-like)}

    • mapping {coord name: (tuple of dimension names, array-like)}

    Alternatively, a Coordinates object may be used to explicitly pass indexes (e.g., a multi-index or any custom Xarray index) or to bypass the creation of a default index for any dimension coordinate included in that object. A brief sketch of these coordinate notations follows the parameter list below.

  • dims (Hashable or sequence of Hashable, optional) – Name(s) of the data dimension(s). Must be either a Hashable (only for 1D data) or a sequence of Hashables with length equal to the number of dimensions. If this argument is omitted, dimension names are taken from coords (if possible) and otherwise default to ['dim_0', ... 'dim_n'].

  • name (str or None, optional) – Name of this array.

  • attrs (dict-like or None, optional) – Attributes to assign to the new instance. By default, an empty attribute dictionary is initialized.

  • indexes (Indexes or dict-like, optional) – For internal use only. For passing indexes objects to the new DataArray, use the coords argument instead with a Coordinates object (both coordinate variables and indexes will be extracted from the latter).
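
For illustration, a minimal sketch of a few of the coords notations described above; the coordinate names "labels" and "station" and the dimension name "x" are invented for this example:

>>> import numpy as np
>>> import xarray as xr
>>> arr = xr.DataArray(
...     np.zeros(3),
...     dims="x",
...     coords={
...         "x": [10, 20, 30],  # {dimension name: array-like}
...         "labels": ("x", ["a", "b", "c"]),  # coord name differs from the dimension name
...         "station": ("x", [1, 2, 3], {"long_name": "station id"}),  # (dims, data, attrs) tuple
...     },
... )
>>> picked = arr.sel(x=20)  # label-based selection along the indexed "x" coordinate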

Examples

Create data:

>>> np.random.seed(0)
>>> temperature = 15 + 8 * np.random.randn(2, 2, 3)
>>> lon = [[-99.83, -99.32], [-99.79, -99.23]]
>>> lat = [[42.25, 42.21], [42.63, 42.59]]
>>> time = pd.date_range("2014-09-06", periods=3)
>>> reference_time = pd.Timestamp("2014-09-05")

Initialize a DataArray with multiple dimensions:

>>> da = xr.DataArray(
...     data=temperature,
...     dims=["x", "y", "time"],
...     coords=dict(
...         lon=(["x", "y"], lon),
...         lat=(["x", "y"], lat),
...         time=time,
...         reference_time=reference_time,
...     ),
...     attrs=dict(
...         description="Ambient temperature.",
...         units="degC",
...     ),
... )
>>> da
<xarray.DataArray (x: 2, y: 2, time: 3)> Size: 96B
array([[[29.11241877, 18.20125767, 22.82990387],
        [32.92714559, 29.94046392,  7.18177696]],

       [[22.60070734, 13.78914233, 14.17424919],
        [18.28478802, 16.15234857, 26.63418806]]])
Coordinates:
    lon             (x, y) float64 32B -99.83 -99.32 -99.79 -99.23
    lat             (x, y) float64 32B 42.25 42.21 42.63 42.59
  * time            (time) datetime64[ns] 24B 2014-09-06 2014-09-07 2014-09-08
    reference_time  datetime64[ns] 8B 2014-09-05
Dimensions without coordinates: x, y
Attributes:
    description:  Ambient temperature.
    units:        degC

Find out where the coldest temperature was:

>>> da.isel(da.argmin(...))
<xarray.DataArray ()> Size: 8B
array(7.18177696)
Coordinates:
    lon             float64 8B -99.32
    lat             float64 8B 42.21
    time            datetime64[ns] 8B 2014-09-08
    reference_time  datetime64[ns] 8B 2014-09-05
Attributes:
    description:  Ambient temperature.
    units:        degC
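
A few further operations on da, as a minimal sketch with the resulting reprs omitted; each result is again a DataArray:

>>> first_day = da.sel(time="2014-09-06")  # label-based selection along "time"
>>> time_mean = da.mean(dim="time")  # reduce over a named dimension
>>> anomaly = da - time_mean  # broadcasts by dimension name, regardless of order
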
__init__(data=<NA>, coords=None, dims=None, name=None, attrs=None, indexes=None, fastpath=False)

Methods

__init__([data, coords, dims, name, attrs, ...])

all([dim, keep_attrs])

Reduce this DataArray's data by applying all along some dimension(s).

any([dim, keep_attrs])

Reduce this DataArray's data by applying any along some dimension(s).

argmax([dim, axis, keep_attrs, skipna])

Index or indices of the maximum of the DataArray over one or more dimensions.

argmin([dim, axis, keep_attrs, skipna])

Index or indices of the minimum of the DataArray over one or more dimensions.

argsort([axis, kind, order])

Returns the indices that would sort this array.

as_numpy()

Coerces wrapped data and coordinates into numpy arrays, returning a DataArray.

assign_attrs(*args, **kwargs)

Assign new attrs to this object.

assign_coords([coords])

Assign new coordinates to this object.

astype(dtype, *[, order, casting, subok, ...])

Copy of the xarray object, with data cast to a specified type.

bfill(dim[, limit])

Fill NaN values by propagating values backward.

broadcast_equals(other)

Two DataArrays are broadcast equal if they are equal after broadcasting them against each other such that they have the same dimensions.

broadcast_like(other, *[, exclude])

Broadcast this DataArray against another Dataset or DataArray.

chunk([chunks, name_prefix, token, lock, ...])

Coerce this array's data into a dask array with the given chunks.

clip([min, max, keep_attrs])

Return an array whose values are limited to [min, max].

close()

Release any resources linked to this object.

coarsen([dim, boundary, side, coord_func])

Coarsen object for DataArrays.

combine_first(other)

Combine two DataArray objects, with union of coordinates.

compute(**kwargs)

Manually trigger loading of this array's data from disk or a remote source into memory and return a new array.

conj()

Complex-conjugate all elements.

conjugate()

Return the complex conjugate, element-wise.

convert_calendar(calendar[, dim, align_on, ...])

Convert the DataArray to another calendar.

copy([deep, data])

Returns a copy of this array.

count([dim, keep_attrs])

Reduce this DataArray's data by applying count along some dimension(s).

cumprod([dim, skipna, keep_attrs])

Reduce this DataArray's data by applying cumprod along some dimension(s).

cumsum([dim, skipna, keep_attrs])

Reduce this DataArray's data by applying cumsum along some dimension(s).

cumulative(dim[, min_periods])

Accumulating object for DataArrays.

cumulative_integrate([coord, datetime_unit])

Integrate cumulatively along the given coordinate using the trapezoidal rule.

curvefit(coords, func[, reduce_dims, ...])

Curve fitting optimization for arbitrary functions.

diff(dim[, n, label])

Calculate the n-th order discrete difference along given axis.

differentiate(coord[, edge_order, datetime_unit])

Differentiate the array with the second order accurate central differences.

dot(other[, dim])

Perform dot product of two DataArrays along their shared dims.

drop([labels, dim, errors])

Backward compatible method based on drop_vars and drop_sel.

drop_duplicates(dim, *[, keep])

Returns a new DataArray with duplicate dimension values removed.

drop_encoding()

Return a new DataArray without encoding on the array or any attached coords.

drop_indexes(coord_names, *[, errors])

Drop the indexes assigned to the given coordinates.

drop_isel([indexers])

Drop index positions from this DataArray.

drop_sel([labels, errors])

Drop index labels from this DataArray.

drop_vars(names, *[, errors])

Returns an array with dropped variables.

dropna(dim, *[, how, thresh])

Returns a new array with dropped labels for missing values along the provided dimension.

equals(other)

True if two DataArrays have the same dimensions, coordinates and values; otherwise False.

expand_dims([dim, axis])

Return a new object with an additional axis (or axes) inserted at the corresponding position in the array shape.

ffill(dim[, limit])

Fill NaN values by propagating values forward.

fillna(value)

Fill missing values in this object.

from_dict(d)

Convert a dictionary into an xarray.DataArray.

from_iris(cube)

Convert an iris.cube.Cube into an xarray.DataArray.

from_series(series[, sparse])

Convert a pandas.Series into an xarray.DataArray.

get_axis_num(dim)

Return axis number(s) corresponding to dimension(s) in this array.

get_index(key)

Get an index for a dimension, with fall-back to a default RangeIndex.

groupby(group[, squeeze, restore_coord_dims])

Returns a DataArrayGroupBy object for performing grouped operations.

groupby_bins(group, bins[, right, labels, ...])

Returns a DataArrayGroupBy object for performing grouped operations.

head([indexers])

Return a new DataArray whose data is given by the first n values along the specified dimension(s).

identical(other)

Like equals, but also checks the array name and attributes, and attributes on all coordinates.

idxmax([dim, skipna, fill_value, keep_attrs])

Return the coordinate label of the maximum value along a dimension.

idxmin([dim, skipna, fill_value, keep_attrs])

Return the coordinate label of the minimum value along a dimension.

integrate([coord, datetime_unit])

Integrate along the given coordinate using the trapezoidal rule.

interp([coords, method, assume_sorted, kwargs])

Interpolate a DataArray onto new coordinates.

interp_calendar(target[, dim])

Interpolates the DataArray to another calendar based on decimal year measure.

interp_like(other[, method, assume_sorted, ...])

Interpolate this object onto the coordinates of another object, filling out of range values with NaN.

interpolate_na([dim, method, limit, ...])

Fill in NaNs by interpolating according to different methods.

isel([indexers, drop, missing_dims])

Return a new DataArray whose data is given by selecting indexes along the specified dimension(s).

isin(test_elements)

Tests each value in the array for whether it is in test elements.

isnull([keep_attrs])

Test each value in the array for whether it is a missing value.

item(*args)

Copy an element of an array to a standard Python scalar and return it.

load(**kwargs)

Manually trigger loading of this array's data from disk or a remote source into memory and return this array.

map_blocks(func[, args, kwargs, template])

Apply a function to each block of this DataArray.

max([dim, skipna, keep_attrs])

Reduce this DataArray's data by applying max along some dimension(s).

mean([dim, skipna, keep_attrs])

Reduce this DataArray's data by applying mean along some dimension(s).

median([dim, skipna, keep_attrs])

Reduce this DataArray's data by applying median along some dimension(s).

min([dim, skipna, keep_attrs])

Reduce this DataArray's data by applying min along some dimension(s).

notnull([keep_attrs])

Test each value in the array for whether it is not a missing value.

pad([pad_width, mode, stat_length, ...])

Pad this array along one or more dimensions.

persist(**kwargs)

Trigger computation in constituent dask arrays.

pipe(func, *args, **kwargs)

Apply func(self, *args, **kwargs).

polyfit(dim, deg[, skipna, rcond, w, full, cov])

Least squares polynomial fit.

prod([dim, skipna, min_count, keep_attrs])

Reduce this DataArray's data by applying prod along some dimension(s).

quantile(q[, dim, method, keep_attrs, ...])

Compute the qth quantile of the data along the specified dimension.

query([queries, parser, engine, missing_dims])

Return a new data array indexed along the specified dimension(s), where the indexers are given as strings containing Python expressions to be evaluated against the values in the array.

rank(dim, *[, pct, keep_attrs])

Ranks the data.

reduce(func[, dim, axis, keep_attrs, keepdims])

Reduce this array by applying func along some dimension(s).

reindex([indexers, method, tolerance, copy, ...])

Conform this object onto the indexes of another object, filling in missing values with fill_value.

reindex_like(other, *[, method, tolerance, ...])

Conform this object onto the indexes of another object, for indexes which the objects share.

rename([new_name_or_name_dict])

Returns a new DataArray with renamed coordinates, dimensions or a new name.

reorder_levels([dim_order])

Rearrange index levels using input order.

resample([indexer, skipna, closed, label, ...])

Returns a Resample object for performing resampling operations.

reset_coords([names, drop])

Given names of coordinates, reset them to become variables.

reset_encoding()

reset_index(dims_or_levels[, drop])

Reset the specified index(es) or multi-index level(s).

roll([shifts, roll_coords])

Roll this array by an offset along one or more dimensions.

rolling([dim, min_periods, center])

Rolling window object for DataArrays.

rolling_exp([window, window_type])

Exponentially-weighted moving window.

round(*args, **kwargs)

Round an array to the given number of decimals.

searchsorted(v[, side, sorter])

Find indices where elements of v should be inserted into this array to maintain order.

sel([indexers, method, tolerance, drop])

Return a new DataArray whose data is given by selecting index labels along the specified dimension(s).

set_close(close)

Register the function that releases any resources linked to this object.

set_index([indexes, append])

Set DataArray (multi-)indexes using one or more existing coordinates.

set_xindex(coord_names[, index_cls])

Set a new, Xarray-compatible index from one or more existing coordinate(s).

shift([shifts, fill_value])

Shift this DataArray by an offset along one or more dimensions.

sortby(variables[, ascending])

Sort object by labels or values (along an axis).

squeeze([dim, drop, axis])

Return a new object with squeezed data.

stack([dimensions, create_index, index_cls])

Stack any number of existing dimensions into a single new dimension.

std([dim, skipna, ddof, keep_attrs])

Reduce this DataArray's data by applying std along some dimension(s).

sum([dim, skipna, min_count, keep_attrs])

Reduce this DataArray's data by applying sum along some dimension(s).

swap_dims([dims_dict])

Returns a new DataArray with swapped dimensions.

tail([indexers])

Return a new DataArray whose data is given by the last n values along the specified dimension(s).

thin([indexers])

Return a new DataArray whose data is given by every nth value along the specified dimension(s).

to_dask_dataframe([dim_order, set_index])

Convert this array into a dask.dataframe.DataFrame.

to_dataframe([name, dim_order])

Convert this array and its coordinates into a tidy pandas.DataFrame.

to_dataset([dim, name, promote_attrs])

Convert a DataArray to a Dataset.

to_dict([data, encoding])

Convert this xarray.DataArray into a dictionary following xarray naming conventions.

to_index()

Convert this variable to a pandas.Index.

to_iris()

Convert this array into an iris.cube.Cube.

to_masked_array([copy])

Convert this array into a numpy.ma.MaskedArray.

to_netcdf([path, mode, format, group, ...])

Write DataArray contents to a netCDF file.

to_numpy()

Coerces wrapped data to numpy and returns a numpy.ndarray.

to_pandas()

Convert this array into a pandas object with the same shape.

to_series()

Convert this array into a pandas.Series.

to_unstacked_dataset(dim[, level])

Unstack DataArray expanding to Dataset along a given level of a stacked coordinate.

to_zarr([store, chunk_store, mode, ...])

Write DataArray contents to a Zarr store.

transpose(*dims[, transpose_coords, ...])

Return a new DataArray object with transposed dimensions.

unify_chunks()

Unify chunk size along all chunked dimensions of this DataArray.

unstack([dim, fill_value, sparse])

Unstack existing dimensions corresponding to MultiIndexes into multiple new dimensions.

var([dim, skipna, ddof, keep_attrs])

Reduce this DataArray's data by applying var along some dimension(s).

weighted(weights)

Weighted DataArray operations.

where(cond[, other, drop])

Filter elements from this object according to a condition.

Attributes

T

attrs

Dictionary storing arbitrary metadata with this array.

chunks

Tuple of block lengths for this DataArray's data, in order of dimensions, or None if the underlying data is not a dask array.

chunksizes

Mapping from dimension names to block lengths for this DataArray's data, or None if the underlying data is not a dask array.

coords

Mapping of DataArray objects corresponding to coordinate variables.

data

The DataArray's data as an array.

dims

Tuple of dimension names associated with this array.

dt

alias of CombinedDatetimelikeAccessor[DataArray]

dtype

Data-type of the array’s elements.

encoding

Dictionary of format-specific settings for how this array should be serialized.

imag

The imaginary part of the array.

indexes

Mapping of pandas.Index objects used for label based indexing.

loc

Attribute for location based indexing like pandas.

name

The name of this array.

nbytes

Total bytes consumed by the elements of this DataArray's data.

ndim

Number of array dimensions.

real

The real part of the array.

shape

Tuple of array dimensions.

size

Number of elements in the array.

sizes

Ordered mapping from dimension names to lengths.

str

alias of StringAccessor[DataArray]

values

The array's data as a numpy.ndarray.

variable

Low level interface to the Variable object for this DataArray.

xindexes

Mapping of Index objects used for label based indexing.