diff --git a/CHANGELOG.md b/CHANGELOG.md
index f29fd5988f..93025041e6 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,4 +1,47 @@
+
+# 0.19.3 (2023-08-10)
+
+## 🐛 Bug Fixes
+
+- Type annotate `get_status_dict` and note that it can be passed an Exception or a CapturedException, which is not an Exception subclass. [PR #7403](https://github.com/datalad/datalad/pull/7403) (by [@yarikoptic](https://github.com/yarikoptic))
+
+- BF: create-sibling-gitlab used to raise a TypeError when attempting a recursive operation in a dataset with uninstalled subdatasets. It now yields an impossible result instead. [PR #7430](https://github.com/datalad/datalad/pull/7430) (by [@adswa](https://github.com/adswa))
+
+- Pass the branch option into the recursive call within Install, for the case when install is invoked with URL(s). Fixes [#7461](https://github.com/datalad/datalad/issues/7461) via [PR #7463](https://github.com/datalad/datalad/pull/7463) (by [@yarikoptic](https://github.com/yarikoptic))
+
+- Allow for reckless=ephemeral clones when a relative path is used for the original location. Fixes [#7469](https://github.com/datalad/datalad/issues/7469) via [PR #7472](https://github.com/datalad/datalad/pull/7472) (by [@yarikoptic](https://github.com/yarikoptic))
+
+## 📝 Documentation
+
+- Fix a property name and default costs described in the "getting subdatasets" section of the `get` documentation.
+ Fixes [#7458](https://github.com/datalad/datalad/issues/7458) via
+ [PR #7460](https://github.com/datalad/datalad/pull/7460)
+ (by [@mslw](https://github.com/mslw))
+
+## 🏠 Internal
+
+- Copy an adjusted environment only if requested to do so.
+ [PR #7399](https://github.com/datalad/datalad/pull/7399)
+ (by [@christian-monch](https://github.com/christian-monch))
+
+- Eliminate uses of `pkg_resources`. Fixes [#7435](https://github.com/datalad/datalad/issues/7435) via [PR #7439](https://github.com/datalad/datalad/pull/7439) (by [@jwodder](https://github.com/jwodder))
+
+## 🧪 Tests
+
+- Disable VCR taping for some S3 tests where they fail due to known issues. [PR #7467](https://github.com/datalad/datalad/pull/7467) (by [@yarikoptic](https://github.com/yarikoptic))
+
+
+# 0.19.2 (2023-07-03)
+
+## 🐛 Bug Fixes
+
+- Remove surrounding quotes in output filenames even for newer versions of annex. Fixes [#7440](https://github.com/datalad/datalad/issues/7440) via [PR #7443](https://github.com/datalad/datalad/pull/7443) (by [@yarikoptic](https://github.com/yarikoptic))
+
+## 📝 Documentation
+
+- DOC: clarify description of the "install" interface to reflect its convoluted behavior. [PR #7445](https://github.com/datalad/datalad/pull/7445) (by [@yarikoptic](https://github.com/yarikoptic))
+
# 0.19.1 (2023-06-26)
diff --git a/README.md b/README.md
index 9edef769e6..da4530be30 100644
--- a/README.md
+++ b/README.md
@@ -19,6 +19,7 @@
[![https://www.singularity-hub.org/static/img/hosted-singularity--hub-%23e32929.svg](https://www.singularity-hub.org/static/img/hosted-singularity--hub-%23e32929.svg)](https://singularity-hub.org/collections/667)
[![Contributor Covenant](https://img.shields.io/badge/Contributor%20Covenant-2.1-4baaaa.svg)](https://github.com/datalad/datalad/blob/master/CODE_OF_CONDUCT.md)
[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.808846.svg)](https://doi.org/10.5281/zenodo.808846)
+[![RRID](https://img.shields.io/badge/RRID-SCR__003931-blue)](https://identifiers.org/RRID:SCR_003931)
[![All Contributors](https://img.shields.io/badge/all_contributors-49-orange.svg?style=flat-square)](#contributors-)
@@ -139,16 +140,39 @@ contributing to the project.
## Acknowledgements
-DataLad development is supported by a US-German collaboration in computational
-neuroscience (CRCNS) project "DataGit: converging catalogues, warehouses, and
-deployment logistics into a federated 'data distribution'" (Halchenko/Hanke),
-co-funded by the US National Science Foundation (NSF 1429999) and the German
-Federal Ministry of Education and Research (BMBF 01GQ1411). Additional support
-is provided by the German federal state of Saxony-Anhalt and the European
-Regional Development Fund (ERDF), Project: Center for Behavioral Brain
-Sciences, Imaging Platform. This work is further facilitated by the ReproNim
-project (NIH 1P41EB019936-01A1). Mac mini instance for development is provided
-by [MacStadium](https://www.macstadium.com/).
+The DataLad project received support through the following grants:
+
+- US-German collaboration in computational neuroscience (CRCNS) project
+ "DataGit: converging catalogues, warehouses, and deployment logistics into a
+ federated 'data distribution'" (Halchenko/Hanke), co-funded by the US National
+ Science Foundation (NSF 1429999) and the German Federal Ministry of
+ Education and Research (BMBF 01GQ1411).
+
+- CRCNS US-German Data Sharing "DataLad - a decentralized system for integrated
+ discovery, management, and publication of digital objects of science"
+ (Halchenko/Pestilli/Hanke), co-funded by the US National Science Foundation
+ (NSF 1912266) and the German Federal Ministry of Education and Research
+ (BMBF 01GQ1905).
+
+- Helmholtz Research Center Jülich, FDM challenge 2022
+
+- German federal state of Saxony-Anhalt and the European Regional Development
+ Fund (ERDF), Project: Center for Behavioral Brain Sciences, Imaging Platform
+
+- ReproNim project (NIH 1P41EB019936-01A1).
+
+- Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under grant
+ SFB 1451 ([431549029](https://gepris.dfg.de/gepris/projekt/431549029),
+ INF project)
+
+- European Union’s Horizon 2020 research and innovation programme under grant
+ agreements:
+ - [Human Brain Project SGA3 (H2020-EU.3.1.5.3, grant no. 945539)](https://cordis.europa.eu/project/id/945539)
+ - [VirtualBrainCloud (H2020-EU.3.1.5.3, grant no. 826421)](https://cordis.europa.eu/project/id/826421)
+
+Mac mini instance for development is provided by
+[MacStadium](https://www.macstadium.com/).
+
### Contributors ✨
diff --git a/changelog.d/pr-7443.md b/changelog.d/pr-7443.md
deleted file mode 100644
index a930c77224..0000000000
--- a/changelog.d/pr-7443.md
+++ /dev/null
@@ -1,3 +0,0 @@
-### 🐛 Bug Fixes
-
-- Remove surrounding quotes in output filenames even for newer version of annex. Fixes [#7440](https://github.com/datalad/datalad/issues/7440) via [PR #7443](https://github.com/datalad/datalad/pull/7443) (by [@yarikoptic](https://github.com/yarikoptic))
diff --git a/datalad/cli/parser.py b/datalad/cli/parser.py
index 4d78bbb02b..b3c44026ca 100644
--- a/datalad/cli/parser.py
+++ b/datalad/cli/parser.py
@@ -11,29 +11,29 @@
# like error handling, must be done conditionally in-line.
import argparse
+import logging
+import sys
from collections import defaultdict
from functools import partial
-import sys
-
from datalad import __version__
-
-from .common_args import common_args
from datalad.interface.base import (
- is_api_arg,
get_cmd_doc,
get_interface_groups,
+ is_api_arg,
load_interface,
)
+from datalad.support.constraints import EnsureChoice
from datalad.utils import getargspec
+
+from .common_args import common_args
+from .exec import call_from_parser
+from .helpers import get_commands_from_groups
from .interface import (
alter_interface_docs_for_cmdline,
get_cmd_ex,
get_cmdline_command_name,
)
-from datalad.support.constraints import EnsureChoice
-from .helpers import get_commands_from_groups
-from .exec import call_from_parser
# special case imports
# .helpers import add_entrypoints_to_interface_groups
@@ -43,7 +43,6 @@
# .interface._known_extension_commands
# .interface._deprecated_commands
-import logging
lgr = logging.getLogger('datalad.cli.parser')
@@ -138,6 +137,7 @@ def setup_parser(
# we need the full help, or we have a potential command that
# lives in an extension, must load all extension, expensive
from .helpers import add_entrypoints_to_interface_groups
+
# need to load all the extensions and try again
# TODO load extensions one-by-one and stop when a command was found
add_entrypoints_to_interface_groups(interface_groups)
@@ -404,8 +404,8 @@ def try_suggest_extension_with_command(parser, cmd, completing, known_cmds):
"""If completing=False, this function will trigger sys.exit()"""
# check if might be coming from known extensions
from .interface import (
- _known_extension_commands,
_deprecated_commands,
+ _known_extension_commands,
)
extension_commands = {
c: e
@@ -541,8 +541,13 @@ def print_version():
# Let's use the standard Python mechanism if underlying module
# did not provide __version__
try:
- import pkg_resources
- version = pkg_resources.get_distribution(mod_name).version
+ if sys.version_info < (3, 10):
+ import importlib_metadata as im
+ else:
+ import importlib.metadata as im
+
+ pkg = im.packages_distributions()[mod_name][0]
+ version = im.version(pkg)
except Exception:
version = "unknown"
if include_name:
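
A minimal sketch of the `importlib.metadata` lookup pattern adopted in the hunk above; the module name `yaml` is only an illustration:

```python
import sys

if sys.version_info < (3, 10):
    import importlib_metadata as im  # backport, declared in setup.py
else:
    import importlib.metadata as im

def module_version(mod_name: str) -> str:
    """Best-effort version lookup for an importable module."""
    try:
        # map import name -> providing distribution(s), e.g. 'yaml' -> ['PyYAML']
        dist = im.packages_distributions()[mod_name][0]
        return im.version(dist)
    except Exception:
        return "unknown"

print(module_version("yaml"))  # e.g. '6.0.1', or 'unknown' if not installed
```
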
diff --git a/datalad/core/distributed/clone_ephemeral.py b/datalad/core/distributed/clone_ephemeral.py
index 02082e9274..1c13382754 100644
--- a/datalad/core/distributed/clone_ephemeral.py
+++ b/datalad/core/distributed/clone_ephemeral.py
@@ -92,6 +92,10 @@ def _setup_ephemeral_annex(ds: Dataset, remote: str):
# If origin isn't local, we have nothing to do.
origin_git_path = Path(RI(origin_annex_url).localpath)
+ if not origin_git_path.is_absolute():
+ # relative path would be relative to the ds, not pwd!
+ origin_git_path = ds.pathobj / origin_git_path
+
# we are local; check for a bare repo first to not mess w/
# the path
if GitRepo(origin_git_path, create=False).bare:
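
A small sketch of the resolution rule added above: a relative origin path is anchored at the dataset, not at the process working directory (the paths are hypothetical):

```python
from pathlib import Path

def resolve_origin(ds_path: Path, origin_git_path: Path) -> Path:
    # a relative path recorded for the origin is relative to the dataset,
    # not to wherever the current process happens to run
    if not origin_git_path.is_absolute():
        origin_git_path = ds_path / origin_git_path
    return origin_git_path

print(resolve_origin(Path('/tmp/work/clone'), Path('../origin')))
# -> /tmp/work/clone/../origin, i.e. /tmp/work/origin once normalized
```
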
diff --git a/datalad/core/distributed/tests/test_clone.py b/datalad/core/distributed/tests/test_clone.py
index 4980332b3b..d5fd1b7efe 100644
--- a/datalad/core/distributed/tests/test_clone.py
+++ b/datalad/core/distributed/tests/test_clone.py
@@ -1365,42 +1365,38 @@ def test_ria_http_storedataladorg(path=None):
@with_tempfile
@with_tempfile
@with_tempfile
+@with_tempfile
def test_ephemeral(origin_path=None, bare_path=None,
- clone1_path=None, clone2_path=None, clone3_path=None):
+ clone1_path=None, clone2_path=None,
+ clone3_path=None, clone4_path=None):
+ can_symlink = has_symlink_capability()
file_test = Path('ds') / 'test.txt'
file_testsub = Path('ds') / 'subdir' / 'testsub.txt'
origin = Dataset(origin_path).create(force=True)
origin.save()
- # 1. clone via path
- clone1 = clone(origin_path, clone1_path, reckless='ephemeral')
- eq_(clone1.config.get("annex.private"), "true")
- can_symlink = has_symlink_capability()
+ def check_clone(clone_):
+ # common checks to do on a clone
+ eq_(clone_.config.get("annex.private"), "true")
+ if can_symlink:
+ clone_annex = (clone_.repo.dot_git / 'annex')
+ ok_(clone_annex.is_symlink())
+ ok_(clone_annex.resolve().samefile(origin.repo.dot_git / 'annex'))
+ if not clone_.repo.is_managed_branch():
+ # TODO: We can't properly handle adjusted branch yet
+ eq_((clone_.pathobj / file_test).read_text(), 'some')
+ eq_((clone_.pathobj / file_testsub).read_text(), 'somemore')
- if can_symlink:
- clone1_annex = (clone1.repo.dot_git / 'annex')
- ok_(clone1_annex.is_symlink())
- ok_(clone1_annex.resolve().samefile(origin.repo.dot_git / 'annex'))
- if not clone1.repo.is_managed_branch():
- # TODO: We can't properly handle adjusted branch yet
- eq_((clone1.pathobj / file_test).read_text(), 'some')
- eq_((clone1.pathobj / file_testsub).read_text(), 'somemore')
+ # 1. clone via path
+ clone1 = clone(origin_path, clone1_path, reckless='ephemeral')
+ check_clone(clone1)
# 2. clone via file-scheme URL
clone2 = clone('file://' + Path(origin_path).as_posix(), clone2_path,
reckless='ephemeral')
- eq_(clone2.config.get("annex.private"), "true")
-
- if can_symlink:
- clone2_annex = (clone2.repo.dot_git / 'annex')
- ok_(clone2_annex.is_symlink())
- ok_(clone2_annex.resolve().samefile(origin.repo.dot_git / 'annex'))
- if not clone2.repo.is_managed_branch():
- # TODO: We can't properly handle adjusted branch yet
- eq_((clone2.pathobj / file_test).read_text(), 'some')
- eq_((clone2.pathobj / file_testsub).read_text(), 'somemore')
+ check_clone(clone2)
# 3. add something to clone1 and push back to origin availability from
# clone1 should not be propagated (we declared 'here' dead to that end)
@@ -1456,6 +1452,12 @@ def test_ephemeral(origin_path=None, bare_path=None,
ok_(eph_annex.is_symlink())
ok_(eph_annex.resolve().samefile(Path(bare_path) / 'annex'))
+ # 5. ephemeral clone using relative path
+ # https://github.com/datalad/datalad/issues/7469
+ with chpwd(op.dirname(origin_path)):
+ clone4 = clone(op.basename(origin_path), op.basename(clone4_path), reckless='ephemeral')
+ check_clone(clone4)
+
@with_tempfile(mkdir=True)
def test_clone_unborn_head(path=None):
diff --git a/datalad/core/local/save.py b/datalad/core/local/save.py
index 903a0ce5b3..6a193f2aee 100644
--- a/datalad/core/local/save.py
+++ b/datalad/core/local/save.py
@@ -13,9 +13,16 @@
__docformat__ = 'restructuredtext'
import logging
-
from functools import partial
+from pathlib import Path
+import datalad.utils as ut
+from datalad.distribution.dataset import (
+ Dataset,
+ EnsureDataset,
+ datasetmethod,
+ require_dataset,
+)
from datalad.interface.base import (
Interface,
build_doc,
@@ -23,38 +30,27 @@
)
from datalad.interface.common_opts import (
jobs_opt,
- recursion_limit,
recursion_flag,
+ recursion_limit,
save_message_opt,
)
from datalad.interface.utils import (
- get_tree_roots,
discover_dataset_trace_to_targets,
+ get_tree_roots,
)
-from datalad.support.param import Parameter
from datalad.support.constraints import (
- EnsureStr,
EnsureNone,
+ EnsureStr,
)
from datalad.support.exceptions import CommandError
from datalad.support.parallel import (
- no_subds_in_futures,
ProducerConsumerProgressLog,
+ no_subds_in_futures,
)
-from datalad.utils import (
- ensure_list,
-)
-import datalad.utils as ut
+from datalad.support.param import Parameter
+from datalad.utils import ensure_list
-from datalad.distribution.dataset import (
- Dataset,
- EnsureDataset,
- datasetmethod,
- require_dataset,
-)
-from .status import (
- Status,
-)
+from .status import Status
lgr = logging.getLogger('datalad.core.local.save')
@@ -328,7 +324,7 @@ def save_ds(args, version_tag=None):
if k in res:
res[k] = str(
# recode path back to dataset path anchor
- pds.pathobj / res[k].relative_to(
+ pds.pathobj / Path(res[k]).relative_to(
pds_repo.pathobj)
)
yield res
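
The `Path(res[k])` wrapping above guards against the result value arriving as a plain string; the repo-to-dataset recoding then works either way. A sketch with hypothetical paths (differing anchors occur e.g. with symlinked temp directories):

```python
from pathlib import Path

def recode(ds_path: Path, repo_path: Path, path) -> str:
    # recode a repo-anchored result path back to the dataset path anchor;
    # `path` may be a str or a Path, hence the explicit Path() wrapping
    return str(ds_path / Path(path).relative_to(repo_path))

print(recode(Path('/data/ds'), Path('/private/data/ds'),
             '/private/data/ds/sub/file.txt'))
# -> /data/ds/sub/file.txt
```
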
diff --git a/datalad/distributed/create_sibling_gitlab.py b/datalad/distributed/create_sibling_gitlab.py
index 98ac2f4fa3..e4c31c8816 100644
--- a/datalad/distributed/create_sibling_gitlab.py
+++ b/datalad/distributed/create_sibling_gitlab.py
@@ -298,16 +298,21 @@ def __call__(
return_type='list')
if not subds:
# we didn't find anything to operate on, let the user know
- for p in path:
+ res_kwargs = {'status': 'impossible', 'refds': ds.path,
+                          'type': 'dataset', 'logger': lgr,
+ 'action': 'create_sibling_gitlab'}
+ if path is not None:
+ for p in path:
+ yield dict(
+ path=p,
+ message=('No installed dataset found under %s, forgot to "get" it?' % p),
+ **res_kwargs
+ )
+ else:
yield dict(
- action='create_sibling_gitlab',
- status='impossible',
- refds=ds.path,
- path=p,
- message=('No dataset found under %s' % p),
- type='dataset',
- logger=lgr,
- )
+ path=ds.path,
+ message=('No installed subdatasets found underneath %s, forgot to "get" any?' % ds.path),
+ **res_kwargs)
else:
for sub in subds:
for r in _proc_dataset(
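
A sketch of the result-yielding pattern introduced above: shared properties live in one `res_kwargs` dict, and an impossible record is produced per requested path, or once for the reference dataset when no paths were given (the paths are hypothetical):

```python
def impossible_results(ds_path, paths=None):
    res_kwargs = {'status': 'impossible', 'refds': ds_path,
                  'type': 'dataset', 'action': 'create_sibling_gitlab'}
    if paths is not None:
        for p in paths:
            yield dict(
                path=p,
                message='No installed dataset found under %s, '
                        'forgot to "get" it?' % p,
                **res_kwargs)
    else:
        yield dict(
            path=ds_path,
            message='No installed subdatasets found underneath %s, '
                    'forgot to "get" any?' % ds_path,
            **res_kwargs)

for r in impossible_results('/data/super', ['/data/super/sub1']):
    print(r['status'], r['path'])  # impossible /data/super/sub1
```
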
diff --git a/datalad/distributed/tests/test_create_sibling_gitlab.py b/datalad/distributed/tests/test_create_sibling_gitlab.py
index 497a33abab..a2c9a5d130 100644
--- a/datalad/distributed/tests/test_create_sibling_gitlab.py
+++ b/datalad/distributed/tests/test_create_sibling_gitlab.py
@@ -8,7 +8,7 @@
"""Test create publication target on gitlab"""
import os
-
+import pytest
# this must import ok with and without gitlab
from datalad.api import (
Dataset,
@@ -291,6 +291,16 @@ def test_dryrun(path=None):
'secret/subdir-collection1-sub2',
],
)
+    # test for #7429: when a subdataset is uninstalled, recursion must
+    # not crash with a TypeError
+ ctlg['root'].drop(['subdir/collection1', 'collection2'],
+ what='datasets', recursive=True, reckless='kill')
+ try:
+ res = ctlg['root'].create_sibling_gitlab(
+ recursive=True, layout='collection', dry_run=True,
+ on_failure='ignore')
+ except TypeError:
+ pytest.fail("Crashed with TypeError on uninstalled datasets")
class _FakeGitLab(object):
diff --git a/datalad/distribution/get.py b/datalad/distribution/get.py
index 72e6b3ae2a..a5a927548a 100644
--- a/datalad/distribution/get.py
+++ b/datalad/distribution/get.py
@@ -146,7 +146,7 @@ def _get_flexible_source_candidates_for_submodule(ds, sm):
under the shortened name `id`.
Additionally, the URL of any configured remote that contains the respective
- submodule commit is available as `remote-` properties, where `name`
+   submodule commit is available as `remoteurl-{name}` properties, where `name`
is the configured remote name.
Lastly, all candidates are sorted according to their cost (lower values
@@ -752,16 +752,20 @@ class Get(Interface):
cost is given in parenthesis, higher values indicate higher cost, and thus
lower priority:
+ - A datalad URL recorded in `.gitmodules` (cost 590). This allows for
+ datalad URLs that require additional handling/resolution by datalad, like
+ ria-schemes (ria+http, ria+ssh, etc.)
+
+ - A URL or absolute path recorded for git in `.gitmodules` (cost 600).
+
- URL of any configured superdataset remote that is known to have the
desired submodule commit, with the submodule path appended to it.
- There can be more than one candidate (cost 500).
+ There can be more than one candidate (cost 650).
- In case `.gitmodules` contains a relative path instead of a URL,
the URL of any configured superdataset remote that is known to have the
desired submodule commit, with this relative path appended to it.
- There can be more than one candidate (cost 500).
-
- - A URL or absolute path recorded in `.gitmodules` (cost 600).
+ There can be more than one candidate (cost 650).
- In case `.gitmodules` contains a relative path as a URL, the absolute
path of the superdataset, appended with this relative path (cost 900).
@@ -784,7 +788,7 @@ class Get(Interface):
under the shortened name `id`.
Additionally, the URL of any configured remote that contains the respective
- submodule commit is available as `remote-` properties, where `name`
+   submodule commit is available as `remoteurl-{name}` properties, where `name`
is the configured remote name.
Hence, such a template could be `http://example.org/datasets/{id}` or
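
A toy illustration of the cost ordering the corrected documentation above describes; the candidate URLs are hypothetical, and lower cost is tried first:

```python
candidates = [
    {'cost': 650, 'url': 'https://remote.example/super/sub'},  # remote URL + path
    {'cost': 590, 'url': 'ria+http://store.example#~sub'},     # datalad URL in .gitmodules
    {'cost': 600, 'url': 'https://example.org/sub.git'},       # git URL in .gitmodules
    {'cost': 900, 'url': '/abs/path/of/super/sub'},            # superdataset path fallback
]
for c in sorted(candidates, key=lambda c: c['cost']):
    print(c['cost'], c['url'])
# tried in order: 590, 600, 650, 900
```
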
diff --git a/datalad/distribution/install.py b/datalad/distribution/install.py
index 37ed6abff9..cf5ce37b7d 100644
--- a/datalad/distribution/install.py
+++ b/datalad/distribution/install.py
@@ -61,16 +61,27 @@
@build_doc
class Install(Interface):
- """Install a dataset from a (remote) source.
+ """Install one or many datasets from remote URL(s) or local PATH source(s).
- This command creates a local :term:`sibling` of an existing dataset from a
- (remote) location identified via a URL or path. Optional recursion into
+ This command creates local :term:`sibling`\(s) of existing dataset(s) from
+ (remote) locations specified as URL(s) or path(s). Optional recursion into
potential subdatasets, and download of all referenced data is supported.
- The new dataset can be optionally registered in an existing
+ The new dataset(s) can be optionally registered in an existing
:term:`superdataset` by identifying it via the `dataset` argument (the new
dataset's path needs to be located within the superdataset for that).
- It is recommended to provide a brief description to label the dataset's
+ || REFLOW >>
+    If no explicit [CMD: -s|--source CMD][PY: `source` PY] option is specified,
+    then each positional URL-OR-PATH
+    argument is treated as a source if it is a URL, and as a target location
+    if it is a path.
+    If a target location path corresponds to a submodule, its source location
+    is determined from its record in `.gitmodules`.
+    If [CMD: -s|--source CMD][PY: `source` PY] is specified, then a single
+    optional positional PATH is taken as the destination path for that dataset.
+ << REFLOW ||
+
+ It is possible to provide a brief description to label the dataset's
nature *and* location, e.g. "Michael's music on black laptop". This helps
humans to identify data locations in distributed scenarios. By default an
identifier comprised of user and machine name, plus path will be generated.
@@ -113,13 +124,15 @@ class Install(Interface):
code_cmd="""\
datalad install -d . \\
--source='https://github.com/datalad-datasets/longnow-podcasts.git'"""),
- dict(text="Install a dataset, and get all content right away",
+ dict(text="Install a dataset into 'podcasts' (not 'longnow-podcasts') directory,"
+ " and get all content right away",
code_py="""\
- install(source='https://github.com/datalad-datasets/longnow-podcasts.git',
+ install(path='podcasts',
+ source='https://github.com/datalad-datasets/longnow-podcasts.git',
get_data=True)""",
code_cmd="""\
datalad install --get-data \\
- -s https://github.com/datalad-datasets/longnow-podcasts.git"""),
+ -s https://github.com/datalad-datasets/longnow-podcasts.git podcasts"""),
dict(text="Install a dataset with all its subdatasets",
code_py="""\
install(source='https://github.com/datalad-datasets/longnow-podcasts.git',
@@ -143,7 +156,7 @@ class Install(Interface):
constraints=EnsureDataset() | EnsureNone()),
path=Parameter(
args=("path",),
- metavar='PATH',
+ metavar='URL-OR-PATH',
nargs="*",
# doc: TODO
doc="""path/name of the installation target. If no `path` is
@@ -151,7 +164,7 @@ class Install(Interface):
similar to :command:`git clone`"""),
source=Parameter(
args=("-s", "--source"),
- metavar='SOURCE',
+ metavar='URL-OR-PATH',
doc="URL or local path of the installation source",
constraints=EnsureStr() | EnsureNone()),
branch=Parameter(
@@ -256,6 +269,7 @@ def __call__(
result_renderer='disabled',
result_xfm=None,
result_filter=None,
+ branch=branch,
**common_kwargs):
# no post-processing of the installed content on disk
# should be necessary here, all done by code further
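
The `branch=branch` forwarding above is what makes a URL-based install honor the requested branch; a sketch of the Python API call it fixes (the URL is hypothetical):

```python
from datalad.api import install

# before this fix, `branch` was dropped when the positional argument is a
# URL and Install recursed into the clone machinery
ds = install(source='https://example.com/ds/.git', branch='devel')
```
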
diff --git a/datalad/distribution/tests/test_install.py b/datalad/distribution/tests/test_install.py
index afd68e1e0e..2741ff962e 100644
--- a/datalad/distribution/tests/test_install.py
+++ b/datalad/distribution/tests/test_install.py
@@ -986,8 +986,9 @@ def test_relpath_semantics(path=None):
eq_(sub.path, op.join(super.path, 'sub'))
-@with_tempfile
-def test_install_branch(path=None):
+@with_tempfile(mkdir=True)
+@serve_path_via_http
+def test_install_branch(path=None, url=None):
path = Path(path)
ds_a = create(path / "ds_a")
ds_a.create("sub")
@@ -998,6 +999,16 @@ def test_install_branch(path=None):
repo_a.commit(msg="c2", options=["--allow-empty"])
repo_a.checkout(DEFAULT_BRANCH)
+ # Clone from URL with custom branch specified should work
+ assert ds_a.repo.call_git_success(['update-server-info'])
+ tmp_path = path / "tmp"
+ os.mkdir(tmp_path)
+ with chpwd(tmp_path):
+ ds_b = install(url + "ds_a/.git", branch=DEFAULT_BRANCH + "-other")
+ repo_b = ds_b.repo
+ eq_(repo_b.get_corresponding_branch() or repo_b.get_active_branch(),
+ DEFAULT_BRANCH + "-other")
+
ds_b = install(source=ds_a.path, path=str(path / "ds_b"),
branch=DEFAULT_BRANCH + "-other", recursive=True)
diff --git a/datalad/downloaders/tests/test_s3.py b/datalad/downloaders/tests/test_s3.py
index f25faa42f5..bdc8958548 100644
--- a/datalad/downloaders/tests/test_s3.py
+++ b/datalad/downloaders/tests/test_s3.py
@@ -49,7 +49,8 @@
url_dandi1 = 's3://dandiarchive/dandiarchive/dandiarchive/data/d8dd3e2b-8f74-494b-9370-9e3a6c69e2b0.csv.gz?versionId=9P7aMTvTT5wynPBOtiQqkV.wvV8zcpLf'
-@use_cassette('test_s3_download_basic')
+# disabled due to https://github.com/datalad/datalad/issues/7465
+# @use_cassette('test_s3_download_basic')
@pytest.mark.parametrize("url,success_str,failed_str", [
(url_2versions_nonversioned1, 'version2', 'version1'),
(url_2versions_nonversioned1_ver2, 'version2', 'version1'),
@@ -156,7 +157,8 @@ def test_boto_host_specification(tempfile=None):
assert_equal(md5sum(tempfile), '97f4290b2d369816c052607923e372d4')
-def test_restricted_bucket_on_NDA():
+# disabled due to https://github.com/datalad/datalad/issues/7464
+def disabled_test_restricted_bucket_on_NDA():
get_test_providers('s3://NDAR_Central_4/', reload=True) # to verify having credentials to access
for url, success_str, failed_str in [
("s3://NDAR_Central_4/submission_23075/README", 'BIDS', 'error'),
@@ -165,9 +167,10 @@ def test_restricted_bucket_on_NDA():
check_download_external_url(url, failed_str, success_str)
+# disabled due to https://github.com/datalad/datalad/issues/7464
@use_cassette('test_download_multiple_NDA')
@with_tempfile(mkdir=True)
-def test_download_multiple_NDA(outdir=None):
+def disabled_test_download_multiple_NDA(outdir=None):
# This would smoke/integration test logic for composite credential testing expiration
# of the token while reusing session from first url on the 2nd one
urls = [
@@ -180,9 +183,11 @@ def test_download_multiple_NDA(outdir=None):
ret = providers.download(url, outdir)
-@use_cassette('test_get_key')
+# disabled due to https://github.com/datalad/datalad/issues/7465
+# @use_cassette('test_get_key')
@pytest.mark.parametrize("b,key,version_id", [
- ('NDAR_Central_4', 'submission_23075/README', None),
+ # disabled due to https://github.com/datalad/datalad/issues/7464
+ # ('NDAR_Central_4', 'submission_23075/README', None),
('datalad-test0-versioned', '1version-nonversioned1.txt', None),
('datalad-test0-versioned', '3versions-allversioned.txt', None),
('datalad-test0-versioned', '3versions-allversioned.txt', 'pNsV5jJrnGATkmNrP8.i_xNH6CY4Mo5s'),
diff --git a/datalad/interface/results.py b/datalad/interface/results.py
index f40ac6aff6..c5c5567e24 100644
--- a/datalad/interface/results.py
+++ b/datalad/interface/results.py
@@ -10,31 +10,41 @@
"""
+from __future__ import annotations
+
__docformat__ = 'restructuredtext'
import logging
-
+from collections.abc import (
+ Iterable,
+ Iterator,
+)
from os.path import (
isabs,
isdir,
- join as opj,
+)
+from os.path import join as opj
+from os.path import (
normpath,
relpath,
)
+from typing import (
+ Any,
+ Optional,
+)
+from datalad.distribution.dataset import Dataset
+from datalad.support.exceptions import (
+ CapturedException,
+ CommandError,
+ format_oneline_tb,
+)
+from datalad.support.path import robust_abspath
from datalad.utils import (
+ PurePosixPath,
ensure_list,
path_is_subpath,
- PurePosixPath,
-)
-from datalad.support.exceptions import CommandError
-from datalad.support.path import robust_abspath
-from datalad.support.exceptions import (
- format_oneline_tb,
- CapturedException,
)
-from datalad.distribution.dataset import Dataset
-
lgr = logging.getLogger('datalad.interface.results')
lgr.log(5, "Importing datalad.interface.results")
@@ -48,9 +58,19 @@
}
-def get_status_dict(action=None, ds=None, path=None, type=None, logger=None,
- refds=None, status=None, message=None, exception=None,
- error_message=None, **kwargs):
+def get_status_dict(
+ action: Optional[str] = None,
+ ds: Optional[Dataset] = None,
+ path: Optional[str] = None,
+ type: Optional[str] = None,
+ logger: Optional[logging.Logger] = None,
+ refds: Optional[str] = None,
+ status: Optional[str] = None,
+ message: str | tuple | None = None,
+ exception: Exception | CapturedException | None = None,
+ error_message: str | tuple | None = None,
+ **kwargs: Any,
+) -> dict[str, Any]:
# `type` is intentionally not `type_` or something else, as a mismatch
# with the dict key 'type' causes too much pain all over the place
# just for not shadowing the builtin `type` in this function
@@ -62,11 +82,11 @@ def get_status_dict(action=None, ds=None, path=None, type=None, logger=None,
Parameters
----------
- ds : Dataset instance
+ ds
If given, the `path` and `type` values are populated with the path of the
datasets and 'dataset' as the type. Giving additional values for both
keys will overwrite these pre-populated values.
- exception : Exception
+ exception
Exceptions that occurred while generating a result should be captured
by immediately instantiating a CapturedException. This instance can
be passed here to yield more comprehensive error reporting, including
@@ -78,7 +98,7 @@ def get_status_dict(action=None, ds=None, path=None, type=None, logger=None,
dict
"""
- d = {}
+ d: dict[str, Any] = {}
if action is not None:
d['action'] = action
if ds:
@@ -116,8 +136,16 @@ def get_status_dict(action=None, ds=None, path=None, type=None, logger=None,
return d
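
A short usage sketch of the freshly annotated helper: an exception caught at the error site is wrapped in a `CapturedException` (which, as the annotation records, is not an `Exception` subclass) and attached to the result (the path is hypothetical):

```python
from datalad.interface.results import get_status_dict
from datalad.support.exceptions import CapturedException

try:
    raise ValueError("boom")
except ValueError as e:
    res = get_status_dict(
        action='example',
        path='/data/ds',
        status='error',
        exception=CapturedException(e),
    )
print(res['status'], res['exception'])
```
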
-def results_from_paths(paths, action=None, type=None, logger=None, refds=None,
- status=None, message=None, exception=None):
+def results_from_paths(
+ paths: str | list[str],
+ action: Optional[str] = None,
+ type: Optional[str] = None,
+ logger: Optional[logging.Logger] = None,
+    refds: Optional[str] = None,
+ status: Optional[str] = None,
+ message: Optional[str] = None,
+ exception: Exception | CapturedException | None = None,
+) -> Iterator[dict[str, Any]]:
"""
Helper to yield analog result dicts for each path in a sequence.
@@ -135,19 +163,20 @@ def results_from_paths(paths, action=None, type=None, logger=None, refds=None,
for p in ensure_list(paths):
yield get_status_dict(
action, path=p, type=type, logger=logger, refds=refds,
- status=status, message=(message, p) if '%s' in message else message,
+ status=status, message=(message, p) if message is not None and '%s' in message else message,
exception=exception
)
-def is_ok_dataset(r):
+def is_ok_dataset(r: dict) -> bool:
"""Convenience test for a non-failure dataset-related result dict"""
return r.get('status', None) == 'ok' and r.get('type', None) == 'dataset'
-class ResultXFM(object):
+class ResultXFM:
"""Abstract definition of the result transformer API"""
- def __call__(self, res):
+
+ def __call__(self, res: dict[str, Any]) -> Any:
"""This is called with one result dict at a time"""
raise NotImplementedError
@@ -160,16 +189,19 @@ class YieldDatasets(ResultXFM):
`None` is returned for any other result.
"""
- def __init__(self, success_only=False):
+ def __init__(self, success_only: bool = False) -> None:
self.success_only = success_only
- def __call__(self, res):
+ def __call__(self, res: dict[str, Any]) -> Optional[Dataset]:
if res.get('type', None) == 'dataset':
if not self.success_only or \
res.get('status', None) in ('ok', 'notneeded'):
return Dataset(res['path'])
+ else:
+ return None
else:
lgr.debug('rejected by return value configuration: %s', res)
+ return None
class YieldRelativePaths(ResultXFM):
@@ -178,15 +210,17 @@ class YieldRelativePaths(ResultXFM):
Relative paths are determined from the 'refds' value in the result. If
no such value is found, `None` is returned.
"""
- def __call__(self, res):
+ def __call__(self, res: dict[str, Any]) -> Optional[str]:
refpath = res.get('refds', None)
if refpath:
return relpath(res['path'], start=refpath)
+ else:
+ return None
class YieldField(ResultXFM):
"""Result transformer to return an arbitrary value from a result dict"""
- def __init__(self, field):
+ def __init__(self, field: str) -> None:
"""
Parameters
----------
@@ -195,11 +229,12 @@ def __init__(self, field):
"""
self.field = field
- def __call__(self, res):
+ def __call__(self, res: dict[str, Any]) -> Any:
if self.field in res:
return res[self.field]
else:
lgr.debug('rejected by return value configuration: %s', res)
+ return None
# a bunch of convenience labels for common result transformers
@@ -219,7 +254,7 @@ def __call__(self, res):
}
-def annexjson2result(d, ds, **kwargs):
+def annexjson2result(d: dict[str, Any], ds: Dataset, **kwargs: Any) -> dict[str, Any]:
"""Helper to convert an annex JSON result to a datalad result dict
Info from annex is rather heterogeneous, partly because some of it
@@ -271,13 +306,13 @@ def annexjson2result(d, ds, **kwargs):
return res
-def count_results(res, **kwargs):
- """Return number if results that match all property values in kwargs"""
+def count_results(res: Iterable[dict[str, Any]], **kwargs: Any) -> int:
+ """Return number of results that match all property values in kwargs"""
return sum(
all(k in r and r[k] == v for k, v in kwargs.items()) for r in res)
-def only_matching_paths(res, **kwargs):
+def only_matching_paths(res: dict[str, Any], **kwargs: Any) -> bool:
# TODO handle relative paths by using a contained 'refds' value
paths = ensure_list(kwargs.get('path', []))
respath = res.get('path', None)
@@ -285,8 +320,8 @@ def only_matching_paths(res, **kwargs):
# needs decorator, as it will otherwise bind to the command classes that use it
-@staticmethod
-def is_result_matching_pathsource_argument(res, **kwargs):
+@staticmethod # type: ignore[misc]
+def is_result_matching_pathsource_argument(res: dict[str, Any], **kwargs: Any) -> bool:
# we either have any non-zero number of "paths" (that could be anything), or
# we have one path and one source
# we don't do any error checking here, done by the command itself
@@ -331,9 +366,16 @@ def is_result_matching_pathsource_argument(res, **kwargs):
return False
-def results_from_annex_noinfo(ds, requested_paths, respath_by_status, dir_fail_msg,
- noinfo_dir_msg, noinfo_file_msg, noinfo_status='notneeded',
- **kwargs):
+def results_from_annex_noinfo(
+ ds: Dataset,
+ requested_paths: list[str],
+ respath_by_status: dict[str, list[str]],
+ dir_fail_msg: str,
+ noinfo_dir_msg: str,
+ noinfo_file_msg: str,
+ noinfo_status: str = 'notneeded',
+ **kwargs: Any
+) -> Iterator[dict[str, Any]]:
"""Helper to yield results based on what information git annex did no give us.
The helper assumes that the annex command returned without an error code,
diff --git a/datalad/local/run_procedure.py b/datalad/local/run_procedure.py
index 3c3051534b..610b9c0bf0 100644
--- a/datalad/local/run_procedure.py
+++ b/datalad/local/run_procedure.py
@@ -48,6 +48,17 @@
split_cmdline,
)
+if sys.version_info < (3, 9):
+ from importlib_resources import (
+ as_file,
+ files,
+ )
+else:
+ from importlib.resources import (
+ as_file,
+ files,
+ )
+
lgr = logging.getLogger('datalad.local.run_procedures')
@@ -148,23 +159,17 @@ def _get_procedure_implementation(name='*', ds=None):
# 3. check extensions for procedure
from datalad.support.entrypoints import iter_entrypoints
- # delay heavy import until here
- from pkg_resources import (
- resource_filename,
- resource_isdir,
- )
+
for epname, epmodule, _ in iter_entrypoints('datalad.extensions'):
- # use of '/' here is OK wrt to platform compatibility
- if resource_isdir(epmodule, 'resources/procedures'):
- for m, n in _get_file_match(
- resource_filename(epmodule, 'resources/procedures'),
- name):
- yield (m, n,) + _get_proc_config(n)
+ res = files(epmodule) / "resources" / "procedures"
+ if res.is_dir():
+ with as_file(res) as p:
+ for m, n in _get_file_match(p, name):
+ yield (m, n,) + _get_proc_config(n)
# 4. at last check datalad itself for procedure
- for m, n in _get_file_match(
- resource_filename('datalad', 'resources/procedures'),
- name):
- yield (m, n,) + _get_proc_config(n)
+ with as_file(files("datalad") / "resources" / "procedures") as p:
+ for m, n in _get_file_match(p, name):
+ yield (m, n,) + _get_proc_config(n)
def _guess_exec(script_file):
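
A sketch of the `importlib.resources` pattern that replaces `pkg_resources` above: `files()` yields a traversable for a package, and `as_file()` guarantees a real filesystem path even for zipped installs:

```python
import sys

if sys.version_info < (3, 9):
    from importlib_resources import as_file, files  # backport from setup.py
else:
    from importlib.resources import as_file, files

res = files("datalad") / "resources" / "procedures"
if res.is_dir():
    # as_file() materializes the directory on disk if needed
    with as_file(res) as p:
        for proc in sorted(p.iterdir()):
            print(proc.name)  # the shipped procedure scripts
```
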
diff --git a/datalad/runner/runner.py b/datalad/runner/runner.py
index ebaec91670..32c8074ee3 100644
--- a/datalad/runner/runner.py
+++ b/datalad/runner/runner.py
@@ -63,14 +63,33 @@ def __init__(self,
def _get_adjusted_env(self,
env: dict | None = None,
cwd: str | PathLike | None = None,
- copy: bool = True):
- """Return an adjusted copy of an execution environment
+ copy: bool = True
+ ) -> dict | None:
+ """Return an adjusted execution environment
- Or return an unaltered copy of the environment, if no adjustments
- need to be made.
+        This method adjusts the environment provided in `env` to
+        reflect the configuration of the runner. It returns an
+        altered copy, or alters and returns the original if `copy`
+        is `False`.
+
+ Parameters
+ ----------
+ env
+ The environment that should be adjusted
+
+        cwd: str | PathLike | None (default: None)
+          If not None, this value will be stored in the
+          environment variable 'PWD'.
+
+        copy: bool (default: True)
+          If True, the returned environment will be a
+          copy of `env`; otherwise the passed-in environment
+          is modified. Note: if `env` is not `None` and
+          `cwd` is `None` and `copy` is `True`, the
+          returned dictionary is still a copy.
"""
- env = env.copy() if env else None
- if cwd and env is not None:
+ env = env.copy() if env is not None and copy is True else env
+ if cwd is not None and env is not None:
# If an environment and 'cwd' is provided, ensure the 'PWD' in the
# environment is set to the value of 'cwd'.
env['PWD'] = str(cwd)
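
The regression test below exercises exactly this; for reference, a sketch of the copy semantics (the runner class is re-exported via `datalad.cmd`, the environment values are illustrative):

```python
from datalad.cmd import WitlessRunner

runner = WitlessRunner()
env = {'SOME_KEY': 'value'}

# copy=True (the default): the original mapping stays untouched
adjusted = runner._get_adjusted_env(env=env, cwd='/tmp/work', copy=True)
assert adjusted is not env and adjusted['PWD'] == '/tmp/work'
assert 'PWD' not in env

# copy=False: the passed-in mapping itself is modified and returned
adjusted = runner._get_adjusted_env(env=env, cwd='/tmp/work', copy=False)
assert adjusted is env and env['PWD'] == '/tmp/work'
```
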
diff --git a/datalad/runner/tests/test_witless_runner.py b/datalad/runner/tests/test_witless_runner.py
index 61257210d5..6e0e5a2d60 100644
--- a/datalad/runner/tests/test_witless_runner.py
+++ b/datalad/runner/tests/test_witless_runner.py
@@ -335,9 +335,38 @@ def test_path_to_str_conversion() -> None:
cwd=test_path,
env=dict(some_key="value")
)
+ assert adjusted_env is not None
assert str(test_path) == adjusted_env['PWD']
+def test_env_copying() -> None:
+    # Regression test to ensure environments are only copied
+    # if `copy=True` is given to `Runner._get_adjusted_env`.
+    # Also test that 'PWD' is set whenever a non-`None` `cwd`
+    # value is given to `Runner._get_adjusted_env`.
+ runner = Runner()
+ for original_env in (None, dict(some_key='value')):
+ for cwd in (None, Path('a/b/c')):
+ for do_copy in (True, False):
+ adjusted_env = runner._get_adjusted_env(
+ cwd=cwd,
+ env=original_env,
+ copy=do_copy
+ )
+ if original_env is None:
+ assert adjusted_env is None
+ else:
+ assert adjusted_env is not None
+ if do_copy is True:
+ assert adjusted_env is not original_env
+ else:
+ assert adjusted_env is original_env
+ if cwd is None:
+ assert 'PWD' not in adjusted_env
+ else:
+ assert 'PWD' in adjusted_env
+
+
@with_tempfile(mkdir=True)
def test_environment(temp_dir_path: str = "") -> None:
# Ensure that the subprocess sees a string in `$PWD`, even if a Path-object
diff --git a/datalad/support/external_versions.py b/datalad/support/external_versions.py
index 36c1797b1e..3abe25cf6c 100644
--- a/datalad/support/external_versions.py
+++ b/datalad/support/external_versions.py
@@ -8,18 +8,19 @@
# ## ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ##
"""Module to help maintain a registry of versions for external modules etc
"""
+import os.path as op
import re
import sys
-import os.path as op
+from itertools import chain
from os import linesep
-from itertools import chain
from looseversion import LooseVersion
-from datalad.log import lgr
# import version helper from config to have only one implementation
# config needs this to avoid circular imports
from datalad.config import get_git_version as __get_git_version
+from datalad.log import lgr
+
from .exceptions import (
CapturedException,
CommandError,
@@ -46,14 +47,15 @@ def __cmp__(self, other):
# Custom handlers
#
from datalad.cmd import (
- WitlessRunner,
GitWitlessRunner,
StdOutErrCapture,
+ WitlessRunner,
)
from datalad.support.exceptions import (
MissingExternalDependency,
OutdatedExternalDependency,
)
+
_runner = WitlessRunner()
_git_runner = GitWitlessRunner()
@@ -156,7 +158,10 @@ def get_rsync_version():
# that of the debian package it's installed with. Reason is in gh-7320,
# which results in the need to detect a patched-by-ubuntu version of rsync
# and therefore the package version, not the result of `rsync --version`.
- from datalad.utils import on_linux, get_linux_distribution
+ from datalad.utils import (
+ get_linux_distribution,
+ on_linux,
+ )
if on_linux:
dist = get_linux_distribution()[0]
if dist in ['debian', 'ubuntu']:
@@ -239,12 +244,16 @@ def _deduce_version(klass, value):
version = getattr(value, attr)
break
- # try pkg_resources
+ # try importlib.metadata
if version is None and hasattr(value, '__name__'):
pkg_name = klass._PYTHON_PACKAGES.get(value.__name__, value.__name__)
try:
- import pkg_resources
- version = pkg_resources.get_distribution(pkg_name).version
+ if sys.version_info < (3, 10):
+ import importlib_metadata as im
+ else:
+ import importlib.metadata as im
+
+ version = im.version(pkg_name)
except Exception:
pass
diff --git a/datalad/support/gitrepo.py b/datalad/support/gitrepo.py
index 88e5d2ee4d..b095d4e3a5 100644
--- a/datalad/support/gitrepo.py
+++ b/datalad/support/gitrepo.py
@@ -3513,7 +3513,7 @@ def save_(self, message: Optional[str] = None, paths: Optional[list[Path]] = Non
action='delete',
refds=self.pathobj,
type=props.get('type'),
- path=p,
+ path=str(p),
status='ok',
logger=lgr)
@@ -3716,7 +3716,7 @@ def _save_add(self, files: dict[str, Any], git_opts: Optional[list[str]] = None)
# get all the entries
for r in self._process_git_get_output(*add_out):
yield get_status_dict(
- action=r.get('command', 'add'),
+ action=str(r.get('command', 'add')),
refds=self.pathobj,
type='file',
path=(self.pathobj / ut.PurePosixPath(r['file']))
@@ -3772,7 +3772,7 @@ def _save_add_submodules(self, paths: list[Path] | dict[Path, dict]) -> Iterator
yield get_status_dict(
action='add_submodule',
ds=self,
- path=path,
+ path=str(path),
status='error',
message=('cannot add subdataset %s with no commits', subm),
logger=lgr)
diff --git a/datalad/support/tests/test_repo_save.py b/datalad/support/tests/test_repo_save.py
index 692c67e525..4973e46f9c 100644
--- a/datalad/support/tests/test_repo_save.py
+++ b/datalad/support/tests/test_repo_save.py
@@ -52,7 +52,7 @@ def _test_save_all(path, repocls):
# make sure we get a 'delete' result for each deleted file
eq_(
set(r['path'] for r in res if r['action'] == 'delete'),
- {k for k, v in orig_status.items()
+ {str(k) for k, v in orig_status.items()
if k.name in ('file_deleted', 'file_staged_deleted')}
)
saved_status = ds.repo.status(untracked='all')
diff --git a/datalad/tests/test_archives.py b/datalad/tests/test_archives.py
index 04055dcc6e..ade7a70caa 100644
--- a/datalad/tests/test_archives.py
+++ b/datalad/tests/test_archives.py
@@ -87,17 +87,17 @@ def check_decompress_file(leading_directories, path=None):
path_archive_obscure = op.join(outdir, fn_archive_obscure)
if leading_directories == 'strip':
- assert_false(op.exists(path_archive_obscure))
+ assert not op.exists(path_archive_obscure)
testpath = outdir
elif leading_directories is None:
- assert_true(op.exists(path_archive_obscure))
+ assert op.exists(path_archive_obscure)
testpath = path_archive_obscure
else:
raise NotImplementedError("Dunno about this strategy: %s"
% leading_directories)
- assert_true(op.exists(op.join(testpath, '3.txt')))
- assert_true(op.exists(op.join(testpath, fn_in_archive_obscure)))
+ assert op.exists(op.join(testpath, '3.txt'))
+ assert op.exists(op.join(testpath, fn_in_archive_obscure))
with open(op.join(testpath, '3.txt')) as f:
eq_(f.read(), '3 load')
diff --git a/docs/source/changelog.rst b/docs/source/changelog.rst
index 6cfc9d0c06..205cead09d 100644
--- a/docs/source/changelog.rst
+++ b/docs/source/changelog.rst
@@ -2,9 +2,96 @@
Change log
**********
+0.19.3 (2023-08-10)
+===================
+
+Bug Fixes
+---------
+
+- Type annotate ``get_status_dict`` and note that it can be passed an
+  Exception or a CapturedException, which is not an Exception subclass. `PR
+  #7403 <https://github.com/datalad/datalad/pull/7403>`__ (by
+  `@yarikoptic <https://github.com/yarikoptic>`__)
+
+- BF: create-sibling-gitlab used to raise a TypeError when attempting a
+  recursive operation in a dataset with uninstalled subdatasets. It now
+  yields an impossible result instead. `PR
+  #7430 <https://github.com/datalad/datalad/pull/7430>`__ (by
+  `@adswa <https://github.com/adswa>`__)
+
+- Pass the branch option into the recursive call within Install, for the
+  case when install is invoked with URL(s). Fixes
+  `#7461 <https://github.com/datalad/datalad/issues/7461>`__ via `PR
+  #7463 <https://github.com/datalad/datalad/pull/7463>`__ (by
+  `@yarikoptic <https://github.com/yarikoptic>`__)
+
+- Allow for reckless=ephemeral clones when a relative path is used for
+  the original location. Fixes
+  `#7469 <https://github.com/datalad/datalad/issues/7469>`__ via `PR
+  #7472 <https://github.com/datalad/datalad/pull/7472>`__ (by
+  `@yarikoptic <https://github.com/yarikoptic>`__)
+
+Documentation
+-------------
+
+- Fix a property name and default costs described in the “getting
+  subdatasets” section of the ``get`` documentation. Fixes
+  `#7458 <https://github.com/datalad/datalad/issues/7458>`__ via `PR
+  #7460 <https://github.com/datalad/datalad/pull/7460>`__ (by
+  `@mslw <https://github.com/mslw>`__)
+
+Internal
+--------
+
+- Copy an adjusted environment only if requested to do so. `PR
+  #7399 <https://github.com/datalad/datalad/pull/7399>`__ (by
+  `@christian-monch <https://github.com/christian-monch>`__)
+
+- Eliminate uses of ``pkg_resources``. Fixes
+  `#7435 <https://github.com/datalad/datalad/issues/7435>`__ via `PR
+  #7439 <https://github.com/datalad/datalad/pull/7439>`__ (by
+  `@jwodder <https://github.com/jwodder>`__)
+
+Tests
+-----
+
+- Disable VCR taping for some S3 tests where they fail due to known
+  issues. `PR #7467 <https://github.com/datalad/datalad/pull/7467>`__
+  (by `@yarikoptic <https://github.com/yarikoptic>`__)
+
+.. _section-1:
+
+0.19.2 (2023-07-03)
+===================
+
+.. _bug-fixes-1:
+
+Bug Fixes
+---------
+
+- Remove surrounding quotes in output filenames even for newer versions
+  of annex. Fixes
+  `#7440 <https://github.com/datalad/datalad/issues/7440>`__ via `PR
+  #7443 <https://github.com/datalad/datalad/pull/7443>`__ (by
+  `@yarikoptic <https://github.com/yarikoptic>`__)
+
+.. _documentation-1:
+
+Documentation
+-------------
+
+- DOC: clarify description of the “install” interface to reflect its
+  convoluted behavior. `PR
+  #7445 <https://github.com/datalad/datalad/pull/7445>`__ (by
+  `@yarikoptic <https://github.com/yarikoptic>`__)
+
+.. _section-2:
+
0.19.1 (2023-06-26)
===================
+.. _internal-1:
+
Internal
--------
@@ -14,6 +101,8 @@ Internal
#7372 `__ (by
`@yarikoptic `__)
+.. _tests-1:
+
Tests
-----
@@ -22,7 +111,7 @@ Tests
`PR #7372 `__ (by
`@yarikoptic `__)
-.. _section-1:
+.. _section-3:
0.19.0 (2023-06-14)
===================
@@ -43,6 +132,8 @@ Enhancements and New Features
`@jsheunis `__ and
`@adswa `__)
+.. _bug-fixes-2:
+
Bug Fixes
---------
@@ -62,6 +153,8 @@ Dependencies
#7330 `__ (by
`@mslw `__)
+.. _documentation-2:
+
Documentation
-------------
@@ -69,7 +162,7 @@ Documentation
#7310 `__ (by
`@jsheunis `__)
-.. _tests-1:
+.. _tests-2:
Tests
-----
@@ -79,12 +172,12 @@ Tests
#7261 `__ (by
`@yarikoptic `__)
-.. _section-2:
+.. _section-4:
0.18.5 (2023-06-13)
===================
-.. _bug-fixes-1:
+.. _bug-fixes-3:
Bug Fixes
---------
@@ -107,7 +200,7 @@ Bug Fixes
#7418 `__ (by
`@yarikoptic `__)
-.. _documentation-1:
+.. _documentation-3:
Documentation
-------------
@@ -117,7 +210,7 @@ Documentation
#7412 `__ (by
`@jwodder `__)
-.. _internal-1:
+.. _internal-2:
Internal
--------
@@ -128,7 +221,7 @@ Internal
#7392 `__ (by
`@yarikoptic `__)
-.. _tests-2:
+.. _tests-3:
Tests
-----
@@ -142,12 +235,12 @@ Tests
#7422 `__ (by
`@yarikoptic `__)
-.. _section-3:
+.. _section-5:
0.18.4 (2023-05-16)
===================
-.. _bug-fixes-2:
+.. _bug-fixes-4:
Bug Fixes
---------
@@ -158,7 +251,7 @@ Bug Fixes
#7357 `__ (by
`@bpoldrack `__)
-.. _documentation-2:
+.. _documentation-4:
Documentation
-------------
@@ -169,7 +262,7 @@ Documentation
#7385 `__ (by
`@mslw `__)
-.. _internal-2:
+.. _internal-3:
Internal
--------
@@ -178,7 +271,7 @@ Internal
#7341 `__ (by
`@jwodder `__)
-.. _tests-3:
+.. _tests-4:
Tests
-----
@@ -192,12 +285,12 @@ Tests
snapshots.d.o
- use specific miniconda installer for py 3.7.
-.. _section-4:
+.. _section-6:
0.18.3 (2023-03-25)
===================
-.. _bug-fixes-3:
+.. _bug-fixes-5:
Bug Fixes
---------
@@ -241,7 +334,7 @@ Bug Fixes
#7355 `__ (by
`@yarikoptic `__)
-.. _documentation-3:
+.. _documentation-5:
Documentation
-------------
@@ -251,7 +344,7 @@ Documentation
#7289 `__ (by
`@mslw `__)
-.. _internal-3:
+.. _internal-4:
Internal
--------
@@ -281,7 +374,7 @@ Internal
#7339 `__ (by
`@jwodder `__)
-.. _tests-4:
+.. _tests-5:
Tests
-----
@@ -300,12 +393,12 @@ Tests
#7353 `__ (by
`@yarikoptic `__)
-.. _section-5:
+.. _section-7:
0.18.2 (2023-02-27)
===================
-.. _bug-fixes-4:
+.. _bug-fixes-6:
Bug Fixes
---------
@@ -337,7 +430,7 @@ Dependencies
#7263 `__ (by
`@yarikoptic `__)
-.. _internal-4:
+.. _internal-5:
Internal
--------
@@ -346,7 +439,7 @@ Internal
tox.ini. `PR #7271 `__
(by `@yarikoptic `__)
-.. _tests-5:
+.. _tests-6:
Tests
-----
@@ -357,12 +450,12 @@ Tests
#7260 `__ (by
`@yarikoptic `__)
-.. _section-6:
+.. _section-8:
0.18.1 (2023-01-16)
===================
-.. _bug-fixes-5:
+.. _bug-fixes-7:
Bug Fixes
---------
@@ -374,7 +467,7 @@ Bug Fixes
#7249 `__ (by
`@bpoldrack `__)
-.. _documentation-4:
+.. _documentation-6:
Documentation
-------------
@@ -392,7 +485,7 @@ Performance
#7250 `__ (by
`@bpoldrack `__)
-.. _section-7:
+.. _section-9:
0.18.0 (2022-12-31)
===================
@@ -467,7 +560,7 @@ Enhancements and New Features
#7235 `__ (by
`@bpoldrack `__)
-.. _bug-fixes-6:
+.. _bug-fixes-8:
Bug Fixes
---------
@@ -479,7 +572,7 @@ Bug Fixes
#6952 `__ (by
`@adswa `__)
-.. _documentation-5:
+.. _documentation-7:
Documentation
-------------
@@ -514,7 +607,7 @@ Documentation
#7204 `__ (by
`@yarikoptic `__)
-.. _internal-5:
+.. _internal-6:
Internal
--------
@@ -584,7 +677,7 @@ Performance
#7230 `__ (by
`@yarikoptic `__)
-.. _tests-6:
+.. _tests-7:
Tests
-----
@@ -598,7 +691,7 @@ Tests
`PR #7176 `__ (by
`@adswa `__)
-.. _section-8:
+.. _section-10:
0.17.10 (2022-12-14)
====================
@@ -622,7 +715,7 @@ Enhancements and New Features
#7210 `__ (by
`@bpoldrack `__)
-.. _bug-fixes-7:
+.. _bug-fixes-9:
Bug Fixes
---------
@@ -680,7 +773,7 @@ Bug Fixes
#7226 `__ (by
`@bpoldrack `__)
-.. _documentation-6:
+.. _documentation-8:
Documentation
-------------
@@ -691,7 +784,7 @@ Documentation
#7155 `__ (by
`@bpoldrack `__)
-.. _internal-6:
+.. _internal-7:
Internal
--------
@@ -707,7 +800,7 @@ Internal
#7161 `__ (by
`@bpoldrack `__)
-.. _tests-7:
+.. _tests-8:
Tests
-----
@@ -729,12 +822,12 @@ Tests
#7209 `__ (by
`@bpoldrack `__)
-.. _section-9:
+.. _section-11:
0.17.9 (2022-11-07)
===================
-.. _bug-fixes-8:
+.. _bug-fixes-10:
Bug Fixes
---------
@@ -774,7 +867,7 @@ Dependencies
#7136 `__ (by
`@bpoldrack `__)
-.. _internal-7:
+.. _internal-8:
Internal
--------
@@ -790,7 +883,7 @@ Internal
#7118 `__ (by
`@yarikoptic `__)
-.. _tests-8:
+.. _tests-9:
Tests
-----
@@ -810,12 +903,12 @@ Tests
#7130 `__ (by
`@yarikoptic `__)
-.. _section-10:
+.. _section-12:
0.17.8 (2022-10-24)
===================
-.. _bug-fixes-9:
+.. _bug-fixes-11:
Bug Fixes
---------
@@ -857,12 +950,12 @@ Bug Fixes
#7103 `__ (by
`@mslw `__)
-.. _section-11:
+.. _section-13:
0.17.7 (2022-10-14)
===================
-.. _bug-fixes-10:
+.. _bug-fixes-12:
Bug Fixes
---------
@@ -881,7 +974,7 @@ Bug Fixes
`PR #7075 `__ (by
`@yarikoptic `__)
-.. _internal-8:
+.. _internal-9:
Internal
--------
@@ -909,7 +1002,7 @@ Internal
#7082 `__ (by
`@jwodder `__)
-.. _tests-9:
+.. _tests-10:
Tests
-----
@@ -918,12 +1011,12 @@ Tests
pass. `PR #7002 `__ (by
`@bpoldrack `__)
-.. _section-12:
+.. _section-14:
0.17.6 (2022-09-21)
===================
-.. _bug-fixes-11:
+.. _bug-fixes-13:
Bug Fixes
---------
@@ -955,7 +1048,7 @@ Bug Fixes
`PR #7049 `__ (by
`@yarikoptic `__)
-.. _internal-9:
+.. _internal-10:
Internal
--------
@@ -968,7 +1061,7 @@ Internal
#7024 `__ (by
`@jwodder `__)
-.. _tests-10:
+.. _tests-11:
Tests
-----
@@ -1042,7 +1135,7 @@ Bug Fix
`#6978 `__
(`@christian-monch `__)
-.. _tests-11:
+.. _tests-12:
Tests
-----
@@ -1207,7 +1300,7 @@ Pushed to ``maint``
- DOC: fix capitalization of service names
(`@aqw `__)
-.. _tests-12:
+.. _tests-13:
Tests
-----
@@ -1366,7 +1459,7 @@ Deprecations and removals
`#6273 `__ (by
@jwodder)
-.. _bug-fixes-12:
+.. _bug-fixes-14:
Bug Fixes
---------
@@ -1386,7 +1479,7 @@ Bug Fixes
prior recording a new dataset state. Fixes #4967
`#6795 `__ (by @mih)
-.. _documentation-7:
+.. _documentation-9:
Documentation
-------------
@@ -1400,7 +1493,7 @@ Documentation
``addurls``. `#6684 `__
(by @jdkent)
-.. _internal-10:
+.. _internal-11:
Internal
--------
@@ -2014,7 +2107,7 @@ Deprecations and removals
commands. `#6564 `__
(by @mih)
-.. _bug-fixes-13:
+.. _bug-fixes-15:
Bug Fixes
---------
@@ -2137,7 +2230,7 @@ Bug Fixes
`#6603 `__ (by
@yarikoptic)
-.. _documentation-8:
+.. _documentation-10:
Documentation
-------------
@@ -2198,7 +2291,7 @@ Documentation
now `#6436 `__ (by
@yarikoptic)
-.. _internal-11:
+.. _internal-12:
Internal
--------
@@ -2310,7 +2403,7 @@ Internal
previous implementations.
`#6591 `__ (by @mih)
-.. _tests-13:
+.. _tests-14:
Tests
-----
@@ -2556,7 +2649,7 @@ Bug Fix
`#6140 `__
(`@bpoldrack `__)
-.. _tests-14:
+.. _tests-15:
Tests
-----
@@ -2628,7 +2721,7 @@ Pushed to ``maint``
- CI: Enable new codecov uploader in Appveyor CI
(`@adswa `__)
-.. _internal-12:
+.. _internal-13:
Internal
--------
@@ -2644,7 +2737,7 @@ Internal
`#6072 `__
(`@jwodder `__)
-.. _documentation-9:
+.. _documentation-11:
Documentation
-------------
@@ -2653,7 +2746,7 @@ Documentation
`#6065 `__
(`@mih `__)
-.. _tests-15:
+.. _tests-16:
Tests
-----
@@ -2722,7 +2815,7 @@ Bug Fix
`#6007 `__
(`@mih `__)
-.. _tests-16:
+.. _tests-17:
Tests
-----
@@ -2784,7 +2877,7 @@ Pushed to ``maint``
- Discontinue testing of hirni extension
(`@mih `__)
-.. _internal-13:
+.. _internal-14:
Internal
--------
@@ -2793,7 +2886,7 @@ Internal
`#5980 `__
(`@jwodder `__)
-.. _documentation-10:
+.. _documentation-12:
Documentation
-------------
@@ -2802,7 +2895,7 @@ Documentation
`#5998 `__
(`@mih `__)
-.. _tests-17:
+.. _tests-18:
Tests
-----
@@ -3158,7 +3251,7 @@ Fixes
``annex get`` and ``annex copy`` calls.
(`#5904 `__)
-.. _tests-18:
+.. _tests-19:
Tests
-----
@@ -3229,7 +3322,7 @@ Pushed to ``maint``
- RF(BF?)+DOC: provide User-Agent to entire session headers + use those
if provided (`@yarikoptic `__)
-.. _internal-14:
+.. _internal-15:
Internal
--------
@@ -3250,7 +3343,7 @@ Internal
(`@adswa `__
`@yarikoptic `__)
-.. _tests-19:
+.. _tests-20:
Tests
-----
@@ -3319,7 +3412,7 @@ Bug Fix
`#5776 `__
(s.heunis@fz-juelich.de)
-.. _internal-15:
+.. _internal-16:
Internal
--------
@@ -3331,7 +3424,7 @@ Internal
available `#5818 `__
(`@yarikoptic `__)
-.. _tests-20:
+.. _tests-21:
Tests
-----
@@ -3359,7 +3452,7 @@ Authors: 4
0.14.6 (Sun Jun 27 2021)
========================
-.. _internal-16:
+.. _internal-17:
Internal
--------
@@ -3470,7 +3563,7 @@ Pushed to ``maint``
- MNT: Post-release dance (`@kyleam `__)
-.. _internal-17:
+.. _internal-18:
Internal
--------
@@ -3483,7 +3576,7 @@ Internal
`#5649 `__
(`@kyleam `__)
-.. _tests-21:
+.. _tests-22:
Tests
-----
diff --git a/docs/source/design/result_records.rst b/docs/source/design/result_records.rst
index 19fe4adbb6..a9e7ddf2e1 100644
--- a/docs/source/design/result_records.rst
+++ b/docs/source/design/result_records.rst
@@ -177,8 +177,10 @@ A string label categorizing the state of an entity. Common values are:
``error_message``
-----------------
-List of any error messages that were captured or produced while achieving a
-result.
+An error message that was captured or produced while achieving a result.
+
+An error message can be a string or a tuple. In the case of a tuple, the
+second item can contain values for ``%``-expansion of the message string.
``exception``
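
For instance, a hypothetical record with a tuple-form message and its expansion:

```python
# string form is used verbatim
res = {'error_message': 'something failed'}

# tuple form: a %-template followed by its expansion value(s)
res = {'error_message': ('could not process %s: %s', '/data/ds/file', 'timeout')}
template, *values = res['error_message']
print(template % tuple(values))  # could not process /data/ds/file: timeout
```
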
diff --git a/setup.py b/setup.py
index b0becce7bf..0f211bad61 100755
--- a/setup.py
+++ b/setup.py
@@ -29,6 +29,7 @@
'colorama; platform_system=="Windows"',
'distro; python_version >= "3.8"',
'importlib-metadata >=3.6; python_version < "3.10"',
+ 'importlib-resources >= 3.0; python_version < "3.9"',
'iso8601',
'humanize',
'fasteners>=0.14',
diff --git a/tox.ini b/tox.ini
index 2cafed99e9..f409ebc4d8 100644
--- a/tox.ini
+++ b/tox.ini
@@ -22,7 +22,7 @@ deps =
codespell~=2.0
pylint~=2.15
commands =
- codespell
+ codespell
# pylinting limited set of known obvious issues only
pylint -d all -e W1202 datalad setup.py
@@ -41,6 +41,7 @@ commands =
datalad/api.py \
datalad/cmd.py \
datalad/downloaders/providers.py \
+ datalad/interface/results.py \
datalad/runner \
datalad/support/annex_utils.py \
datalad/support/ansi_colors.py \