Merge pull request #142 from euroargodev/mat-owc
API to create source file and data for OWC software
gmaze authored Jan 14, 2022
2 parents 29164b3 + 4d915d9 commit b5dba6d
Showing 27 changed files with 1,929 additions and 462 deletions.
5 changes: 5 additions & 0 deletions .codespellrc
@@ -0,0 +1,5 @@
[codespell]
skip = *.nc,*.ipynb,./local_work,./float_source,./binder,./.github,*.log,./.git,./docs/_build,./docs/_static
count =
quiet-level = 3
ignore-words-list = PRES, pres
4 changes: 2 additions & 2 deletions .gitignore
@@ -189,10 +189,10 @@ fabric.properties
# Android studio 3.1+ serialized cache file
.idea/caches/build_file_checksums.ser

#pytest quai20
#pytest and misc
.vscode/
.pytest_cache
knotebooks/
argopy/tests/cov.xml
argopy/tests/dummy_fileA.txt
float_source
float_source/
24 changes: 13 additions & 11 deletions HOW_TO_RELEASE.md
@@ -1,28 +1,30 @@
1. [ ] Make sure that all CI tests are passed with **free* environments
1. [ ] Run codespell ``codespell -q 3``

2. [ ] Update ``./requirements.txt`` and ``./docs/requirements.txt`` with CI free environments dependencies versions
2. [ ] Make sure that all CI tests are passed with **free* environments

3. [ ] Update ``./ci/requirements/py*-dev.yml`` with last free environments dependencies versions
3. [ ] Update ``./requirements.txt`` and ``./docs/requirements.txt`` with CI free environments dependencies versions

4. [ ] Make sure that all CI tests are passed with **dev* environments
4. [ ] Update ``./ci/requirements/py*-dev.yml`` with last free environments dependencies versions

5. [ ] Increase release version in ``./setup.py`` file
5. [ ] Make sure that all CI tests are passed with **dev* environments

6. [ ] Update date and release version in ``./docs/whats-new.rst``
6. [ ] Increase release version in ``./setup.py`` file

7. [ ] On the master branch, commit the release in git:
7. [ ] Update date and release version in ``./docs/whats-new.rst``

8. [ ] On the master branch, commit the release in git:

```git commit -a -m 'Release v0.X.Y'```

8. [ ] Tag the release:
9. [ ] Tag the release:

```git tag -a v0.X.Y -m 'v0.X.Y'```

9. [ ] Push it online:
10. [ ] Push it online:

```git push origin v0.X.Y```
```git push origin v0.X.Y```

10. [ ] Issue the release on GitHub. Click on "Draft a new release" at
11. [ ] Issue the release on GitHub. Click on "Draft a new release" at
https://github.com/euroargodev/argopy/releases. Type in the version number, but
don't bother to describe it -- we maintain that on the docs instead.

25 changes: 6 additions & 19 deletions README.md
Expand Up @@ -9,7 +9,11 @@
## Install


Install the last release with pip:
Install the last release with conda:
```bash
conda install -c conda-forge argopy
```
or pip:
```bash
pip install argopy
```
@@ -113,21 +117,4 @@ See the [documentation page for more examples](https://argopy.readthedocs.io/en/

## Development roadmap

Our next big steps:
- [ ] To provide Bio-geochemical variables ([#22](https://github.com/euroargodev/argopy/issues/22), [#77](https://github.com/euroargodev/argopy/issues/77), [#81](https://github.com/euroargodev/argopy/issues/81))
- [ ] To develop expert methods related to Quality Control of the data with other python softwares like:
- [ ] [pyowc](https://github.com/euroargodev/argodmqc_owc): [#33](https://github.com/euroargodev/argodmqc_owc/issues/33), [#53](https://github.com/euroargodev/argodmqc_owc/issues/53)
- [ ] [bgcArgoDMQC](https://github.com/ArgoCanada/bgcArgoDMQC): [#37](https://github.com/ArgoCanada/bgcArgoDMQC/issues/37)

We aim to provide high level helper methods to load Argo data and meta-data from:
- [x] Ifremer erddap
- [x] local copy of the GDAC ftp folder
- [x] Index files (local and online)
- [x] Argovis
- [ ] Online GDAC ftp

We also aim to provide high level helper methods to visualise and plot Argo data and meta-data:
- [x] Map with trajectories
- [x] Histograms for meta-data
- [ ] Waterfall plots
- [ ] T/S diagram
See milestone here: https://github.com/euroargodev/argopy/milestone/3
37 changes: 21 additions & 16 deletions argopy/options.py
@@ -65,32 +65,37 @@ def validate_ftp(this_path):


class set_options:
"""Set options for argopy.
"""Set options for argopy
List of options:
- `dataset`: Define the Dataset to work with.
Default: `phy`. Possible values: `phy`, `bgc` or `ref`.
- `src`: Source of fetched data.
Default: `erddap`. Possible values: `erddap`, `localftp`, `argovis`
- `local_ftp`: Absolute path to a local GDAC ftp copy.
Default: `.`
- `cachedir`: Absolute path to a local cache directory.
Default: `~/.cache/argopy`
- `mode`: User mode.
Default: `standard`. Possible values: `standard` or `expert`.
- `api_timeout`: Define the time out of internet requests to web API, in seconds.
Default: 60
- `trust_env`: Allow for local environment variables to be used by fsspec to connect to the internet. Get
proxies information from HTTP_PROXY / HTTPS_PROXY environment variables if this option is True (False by
default). Also can get proxy credentials from ~/.netrc file if present.
- ``dataset``: Define the Dataset to work with.
Default: ``phy``.
Possible values: ``phy``, ``bgc`` or ``ref``.
- ``src``: Source of fetched data.
Default: ``erddap``.
Possible values: ``erddap``, ``localftp``, ``argovis``
- ``local_ftp``: Absolute path to a local GDAC ftp copy.
Default: None
- ``cachedir``: Absolute path to a local cache directory.
Default: ``~/.cache/argopy``
- ``mode``: User mode.
Default: ``standard``.
Possible values: ``standard`` or ``expert``.
- ``api_timeout``: Define the time out of internet requests to web API, in seconds.
Default: 60
- ``trust_env``: Allow for local environment variables to be used by fsspec to connect to the internet.
Get proxies information from HTTP_PROXY / HTTPS_PROXY environment variables if this option is True (
False by default). Also can get proxy credentials from ~/.netrc file if present.
You can use `set_options` either as a context manager:
>>> import argopy
>>> with argopy.set_options(src='localftp'):
>>> ds = argopy.DataFetcher().float(3901530).to_xarray()
Or to set global options:
>>> argopy.set_options(src='localftp')
"""
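The docstring above shows `set_options` used both as a context manager and as a global setter. The underlying pattern (save the old values, apply the new ones, restore on exit) can be sketched independently; `OPTIONS` and `set_options_sketch` below are illustrative names, not argopy internals:

```python
# Minimal sketch of the options pattern: a dict of global options plus a
# class that updates them and restores the previous values on __exit__.
# Illustrative only -- not argopy's actual implementation.
OPTIONS = {"src": "erddap", "mode": "standard"}

class set_options_sketch:
    def __init__(self, **kwargs):
        self._old = {k: OPTIONS[k] for k in kwargs}  # remember previous values
        OPTIONS.update(kwargs)

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        # Context-manager use restores the old values here;
        # "global" use simply never calls __exit__.
        OPTIONS.update(self._old)

with set_options_sketch(src="localftp"):
    assert OPTIONS["src"] == "localftp"
assert OPTIONS["src"] == "erddap"  # restored after the with-block
```

The same save/restore design appears in xarray's `set_options`, which is why the class works both inside and outside a `with` statement.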
2 changes: 1 addition & 1 deletion argopy/tests/__init__.py
@@ -187,7 +187,7 @@ def test_wrapper(fix):
pass
except ServerDisconnectedError as e:
# We can't do anything about this !
warnings.warn("\nWe were disconnected from server !\n%s" % str(e.args))
warnings.warn("\n We were disconnected from server !\n%s" % str(e.args))
pass
except ClientResponseError as e:
# The server is sending back an error when creating the response
28 changes: 28 additions & 0 deletions argopy/tests/test_utilities.py
@@ -23,6 +23,9 @@
format_oneline, is_indexbox,
check_wmo, is_wmo,
wmo2box,
modified_environ,
wrap_longitude,
toYearFraction, YearFraction_to_datetime,
TopoFetcher
)
from argopy.errors import InvalidFetcherAccessPoint, FtpPathError
@@ -487,6 +490,14 @@ def test_check_wmo():
assert check_wmo(np.array((12345, 1234567), dtype='int')) == [12345, 1234567]


def test_modified_environ():
os.environ["DUMMY_ENV_ARGOPY"] = 'initial'
with modified_environ(DUMMY_ENV_ARGOPY='toto'):
assert os.environ['DUMMY_ENV_ARGOPY'] == 'toto'
assert os.environ['DUMMY_ENV_ARGOPY'] == 'initial'
os.environ.pop('DUMMY_ENV_ARGOPY')
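`modified_environ`, exercised above, temporarily overrides environment variables and restores them afterwards. A self-contained sketch of that pattern, with behaviour inferred from the test (not argopy's actual code):

```python
import os
from contextlib import contextmanager

@contextmanager
def modified_environ_sketch(**update):
    """Temporarily set environment variables, restoring the old state on exit."""
    saved = {k: os.environ.get(k) for k in update}  # None marks "did not exist"
    os.environ.update(update)
    try:
        yield
    finally:
        for k, v in saved.items():
            if v is None:
                os.environ.pop(k, None)  # remove variables we introduced
            else:
                os.environ[k] = v        # restore the previous value

os.environ["DUMMY_ENV_ARGOPY"] = "initial"
with modified_environ_sketch(DUMMY_ENV_ARGOPY="toto"):
    assert os.environ["DUMMY_ENV_ARGOPY"] == "toto"
assert os.environ["DUMMY_ENV_ARGOPY"] == "initial"
os.environ.pop("DUMMY_ENV_ARGOPY")
```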


def test_wmo2box():
with pytest.raises(ValueError):
wmo2box(12)
@@ -507,6 +518,23 @@ def complete_box(b):
assert is_box(complete_box(wmo2box(7501)))


def test_wrap_longitude():
assert wrap_longitude(np.array([-20])) == 340
assert wrap_longitude(np.array([40])) == 40
assert np.all(np.equal(wrap_longitude(np.array([340, 20])), np.array([340, 380])))
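The expected values above imply that `wrap_longitude` maps longitudes into [0, 360) and then removes jumps larger than 180° so a trajectory stays continuous (hence `[340, 20]` becoming `[340, 380]`). A sketch consistent with these assertions, using NumPy's `unwrap` with its `period` argument (NumPy >= 1.21); this is an inferred reimplementation, not necessarily argopy's:

```python
import numpy as np

def wrap_longitude_sketch(lon):
    lon = np.asarray(lon, dtype=float) % 360   # map into [0, 360)
    return np.unwrap(lon, period=360)          # lift >180 deg jumps for continuity

assert np.allclose(wrap_longitude_sketch([-20]), [340])
assert np.allclose(wrap_longitude_sketch([40]), [40])
assert np.allclose(wrap_longitude_sketch([340, 20]), [340, 380])
```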


def test_toYearFraction():
assert toYearFraction(pd.to_datetime('202001010000')) == 2020
assert toYearFraction(pd.to_datetime('202001010000', utc=True)) == 2020
assert toYearFraction(pd.to_datetime('202001010000')+pd.offsets.DateOffset(years=1)) == 2021


def test_YearFraction_to_datetime():
assert YearFraction_to_datetime(2020) == pd.to_datetime('202001010000')
assert YearFraction_to_datetime(2020+1) == pd.to_datetime('202101010000')
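`toYearFraction` and `YearFraction_to_datetime` convert between timestamps and decimal years. A sketch that reproduces the boundary cases tested above (the function names here are placeholders; argopy's implementation may differ in details such as timezone handling):

```python
import pandas as pd

def to_year_fraction_sketch(dt):
    dt = pd.to_datetime(dt)
    if dt.tzinfo is not None:
        dt = dt.tz_convert(None)                   # normalize aware timestamps to naive UTC
    start = pd.Timestamp(dt.year, 1, 1)
    end = pd.Timestamp(dt.year + 1, 1, 1)
    return dt.year + (dt - start) / (end - start)  # fraction of the year elapsed

def year_fraction_to_datetime_sketch(yf):
    year = int(yf)
    start = pd.Timestamp(year, 1, 1)
    end = pd.Timestamp(year + 1, 1, 1)
    return start + (yf - year) * (end - start)

assert to_year_fraction_sketch("202001010000") == 2020
assert year_fraction_to_datetime_sketch(2020) == pd.to_datetime("202001010000")
```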


@requires_connection
def test_TopoFetcher():
box = [81, 123, -67, -54]
115 changes: 106 additions & 9 deletions argopy/tests/test_xarray.py
@@ -1,9 +1,14 @@
import os
import pytest
import warnings
import numpy as np
import tempfile
import xarray as xr

import argopy
from argopy import DataFetcher as ArgoDataFetcher
from argopy.errors import InvalidDatasetStructure
from . import requires_connected_erddap_phy
from argopy.errors import InvalidDatasetStructure, OptionValueError
from . import requires_connected_erddap_phy, requires_localftp


@pytest.fixture(scope="module")
@@ -19,7 +24,8 @@ def ds_pts():
data[user_mode] = (
ArgoDataFetcher(src="erddap", mode=user_mode)
.region([-75, -55, 30.0, 40.0, 0, 100.0, "2011-01-01", "2011-01-15"])
.to_xarray()
.load()
.data
)
except Exception as e:
warnings.warn("Error when fetching tests data: %s" % str(e.args))
@@ -60,12 +66,6 @@ def test_interpolation_expert(self, ds_pts):
ds = ds_pts["expert"].argo.point2profile()
assert "PRES_INTERPOLATED" in ds.argo.interp_std_levels([20, 30, 40, 50]).dims

def test_points_error(self, ds_pts):
"""Try to interpolate points, not profiles"""
ds = ds_pts["standard"]
with pytest.raises(InvalidDatasetStructure):
ds.argo.interp_std_levels([20, 30, 40, 50])

def test_std_error(self, ds_pts):
"""Try to interpolate on a wrong axis"""
ds = ds_pts["standard"].argo.point2profile()
@@ -77,6 +77,48 @@ def test_std_error(self, ds_pts):
ds.argo.interp_std_levels(12)


@requires_connected_erddap_phy
class Test_groupby_pressure_bins:
def test_groupby_ds_type(self, ds_pts):
"""Run with success for standard/expert mode and point/profile"""
for user_mode, this in ds_pts.items():
for format in ["point", "profile"]:
if format == 'profile':
that = this.argo.point2profile()
else:
that = this.copy()
bins = np.arange(0.0, np.max(that["PRES"]) + 10.0, 10.0)
assert "STD_PRES_BINS" in that.argo.groupby_pressure_bins(bins).coords

def test_bins_error(self, ds_pts):
"""Try to groupby over invalid bins """
ds = ds_pts["standard"]
with pytest.raises(ValueError):
ds.argo.groupby_pressure_bins([100, 20, 30, 40, 50]) # un-sorted
with pytest.raises(ValueError):
ds.argo.groupby_pressure_bins([-20, 20, 30, 40, 50]) # Negative values

def test_axis_error(self, ds_pts):
"""Try to group by using invalid pressure axis """
ds = ds_pts["standard"]
bins = np.arange(0.0, np.max(ds["PRES"]) + 10.0, 10.0)
with pytest.raises(ValueError):
ds.argo.groupby_pressure_bins(bins, axis='invalid')

def test_empty_result(self, ds_pts):
"""Try to groupby over bins without data"""
ds = ds_pts["standard"]
with pytest.warns(Warning):
out = ds.argo.groupby_pressure_bins([10000, 20000])
assert out == None

def test_all_select(self, ds_pts):
ds = ds_pts["standard"]
bins = np.arange(0.0, np.max(ds["PRES"]) + 10.0, 10.0)
for select in ["shallow", "deep", "middle", "random", "min", "max", "mean", "median"]:
assert "STD_PRES_BINS" in ds.argo.groupby_pressure_bins(bins).coords
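These tests exercise `groupby_pressure_bins`, which regroups point measurements into user-defined pressure bins (with a `select` strategy choosing which sample represents each bin). The core binning step can be illustrated with `np.digitize`; this sketches the idea only, not argopy's implementation:

```python
import numpy as np

pres = np.array([5.0, 12.0, 25.0, 37.0])  # point pressures (dbar)
bins = np.arange(0.0, 50.0, 10.0)         # left bin edges: [0, 10, 20, 30, 40]
idx = np.digitize(pres, bins) - 1         # bin index for each point
std_pres = bins[idx]                      # standard pressure = left edge of its bin

assert np.all(std_pres == [0.0, 10.0, 20.0, 30.0])
```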


@requires_connected_erddap_phy
class Test_teos10:
def test_teos10_variables_default(self, ds_pts):
@@ -126,3 +168,58 @@ def test_teos10_invalid_variable(self, ds_pts):
that = that.argo.point2profile()
with pytest.raises(ValueError):
that.argo.teos10(["InvalidVariable"])


@requires_localftp
class Test_create_float_source:
local_ftp = argopy.tutorial.open_dataset("localftp")[0]

def is_valid_mdata(self, this_mdata):
"""Validate structure of the output dataset """
check = []
# Check for dimensions:
check.append(argopy.utilities.is_list_equal(['m', 'n'], list(this_mdata.dims)))
# Check for coordinates:
check.append(argopy.utilities.is_list_equal(['m', 'n'], list(this_mdata.coords)))
# Check for data variables:
check.append(np.all(
[v in this_mdata.data_vars for v in ['PRES', 'TEMP', 'PTMP', 'SAL', 'DATES', 'LAT', 'LONG', 'PROFILE_NO']]))
check.append(np.all(
[argopy.utilities.is_list_equal(['n'], this_mdata[v].dims) for v in ['LONG', 'LAT', 'DATES', 'PROFILE_NO']
if v in this_mdata.data_vars]))
check.append(np.all(
[argopy.utilities.is_list_equal(['m', 'n'], this_mdata[v].dims) for v in ['PRES', 'TEMP', 'SAL', 'PTMP'] if
v in this_mdata.data_vars]))
return np.all(check)

def test_error_user_mode(self):
with argopy.set_options(local_ftp=self.local_ftp):
with pytest.raises(InvalidDatasetStructure):
ds = ArgoDataFetcher(src="localftp", mode='standard').float([6901929, 2901623]).load().data
ds.argo.create_float_source()

def test_opt_force(self):
with argopy.set_options(local_ftp=self.local_ftp):
expert_ds = ArgoDataFetcher(src="localftp", mode='expert').float([2901623]).load().data

with pytest.raises(OptionValueError):
expert_ds.argo.create_float_source(force='dummy')

ds_float_source = expert_ds.argo.create_float_source(path=None, force='default')
assert np.all([k in np.unique(expert_ds['PLATFORM_NUMBER']) for k in ds_float_source.keys()])
assert np.all([isinstance(ds_float_source[k], xr.Dataset) for k in ds_float_source.keys()])
assert np.all([self.is_valid_mdata(ds_float_source[k]) for k in ds_float_source.keys()])

ds_float_source = expert_ds.argo.create_float_source(path=None, force='raw')
assert np.all([k in np.unique(expert_ds['PLATFORM_NUMBER']) for k in ds_float_source.keys()])
assert np.all([isinstance(ds_float_source[k], xr.Dataset) for k in ds_float_source.keys()])
assert np.all([self.is_valid_mdata(ds_float_source[k]) for k in ds_float_source.keys()])

def test_filecreate(self):
with argopy.set_options(local_ftp=self.local_ftp):
expert_ds = ArgoDataFetcher(src="localftp", mode='expert').float([6901929, 2901623]).load().data

N_file = len(np.unique(expert_ds['PLATFORM_NUMBER']))
with tempfile.TemporaryDirectory() as folder_output:
expert_ds.argo.create_float_source(path=folder_output)
assert len(os.listdir(folder_output)) == N_file
