Compatibility with dask.dataframe's is_scalar (#18030)
Makes the import of `dask.dataframe`'s `is_scalar` dependent on the installed dask version.

Closes #18028

Authors:
  - Tom Augspurger (https://github.com/TomAugspurger)

Approvers:
  - Benjamin Zaitlen (https://github.com/quasiben)
  - Matthew Murray (https://github.com/Matt711)

URL: #18030
TomAugspurger authored Feb 18, 2025
1 parent 81bb6f1 commit 0556701
Showing 1 changed file with 16 additions and 1 deletion.
python/dask_cudf/dask_cudf/_expr/__init__.py
@@ -1,5 +1,9 @@
 # Copyright (c) 2024-2025, NVIDIA CORPORATION.
 
+import importlib.metadata
+
+from packaging.version import Version
+
 import dask
 import dask.dataframe.dask_expr._shuffle as _shuffle_module
 from dask.dataframe import get_collection_type
@@ -34,7 +38,6 @@
 from dask.dataframe.dask_expr._util import (
     _convert_to_list,
     _raise_if_object_series,
-    is_scalar,
 )
 from dask.dataframe.dask_expr.io.io import (
     FusedIO,
@@ -46,6 +49,18 @@
     ReadParquetPyarrowFS,
 )
 
+_dask_version = importlib.metadata.version("dask")
+
+# TODO: change ">2025.2.0" to ">={next-version}" when released.
+DASK_2025_3_0 = Version(_dask_version) > Version("2025.2.0")
+
+
+if DASK_2025_3_0:
+    from dask.dataframe.utils import is_scalar
+else:
+    from dask.dataframe.dask_expr._util import is_scalar
+
+
 __all__ = [
     "CumulativeBlockwise",
     "DXDataFrame",
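The gate in this diff follows a common pattern: read the installed version with `importlib.metadata.version`, compare it with `packaging.version.Version`, and pick the import location accordingly. A minimal sketch of that pattern, assuming `packaging` is available; the `pick_import_path` helper and the exact version strings are illustrative, not part of the commit:

```python
import importlib.metadata

from packaging.version import Version  # third-party: packaging


def pick_import_path(dask_version: str) -> str:
    """Return the module path expected to provide ``is_scalar``.

    Hypothetical helper: releases after 2025.2.0 are assumed to expose
    ``is_scalar`` in ``dask.dataframe.utils``, while older releases keep
    it in ``dask.dataframe.dask_expr._util``.
    """
    if Version(dask_version) > Version("2025.2.0"):
        return "dask.dataframe.utils"
    return "dask.dataframe.dask_expr._util"


if __name__ == "__main__":
    # Resolve against whatever dask is installed, if any.
    try:
        installed = importlib.metadata.version("dask")
        print(installed, "->", pick_import_path(installed))
    except importlib.metadata.PackageNotFoundError:
        print("dask is not installed")
```

Comparing `Version` objects (rather than raw strings) handles multi-digit components correctly, e.g. `"2024.12.1"` sorts before `"2025.2.0"` even though a plain string comparison would not be reliable.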