Update demo notebook 3 #2

Merged
merged 7 commits on Sep 16, 2024
24 changes: 24 additions & 0 deletions .coveragerc
@@ -0,0 +1,24 @@
[html]
directory = coverage

[run]
source =
instageo
omit =
*/__init__.py
*_test.py

[report]
omit =
__init__.py
*_test.py
*app.py

exclude_lines =
# Don't complain if tests don't hit defensive assertion code:
raise AssertionError
raise NotImplementedError

# Don't complain if non-runnable code isn't run:
if 0:
if __name__ == .__main__.:
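
For reference, the patterns under exclude_lines are regular expressions matched line by line; any source line that matches is left out of the coverage report. A minimal sketch of the kind of code each pattern excludes (the function below is illustrative, not taken from this repo):

def get_backend(name: str) -> str:
    """Illustrative function showing lines this config excludes from coverage."""
    if name == "gdal":
        return "gdal"
    if 0:  # never-executed debug branch: excluded by the "if 0:" pattern
        print("debugging")
    raise AssertionError  # excluded by the "raise AssertionError" pattern


if __name__ == "__main__":  # excluded by the "if __name__ == .__main__.:" pattern
    print(get_backend("gdal"))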
37 changes: 37 additions & 0 deletions .github/workflows/tests_and_linters.yaml
@@ -0,0 +1,37 @@
name: Tests and Linters 🧪

on: [push, pull_request]

jobs:
tests-and-linters:
name: "Python 3.10 on Ubuntu Latest"
runs-on: ubuntu-latest

steps:
- name: Install dependencies for viewer test
run: sudo apt-get update && sudo apt-get install -y xvfb
- name: Checkout your repo 📦
uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: "3.10"

- name: Install python dependencies 🔧
run: |
pip install --upgrade pip
pip install -r requirements.txt \
-r instageo/model/requirements.txt \
-r instageo/data/requirements.txt \
-r instageo/apps/requirements.txt

- name: Run linters 🖌️
run: pre-commit run --all-files --verbose
- name: Set PYTHONPATH
run: echo "PYTHONPATH=$PYTHONPATH:$(pwd)" >> $GITHUB_ENV
- name: Run tests 🧪
env:
EARTHDATA_USERNAME: ${{ secrets.EARTHDATA_USERNAME }}
EARTHDATA_PASSWORD: ${{ secrets.EARTHDATA_PASSWORD }}
run: pytest --cov --cov-config=.coveragerc --cov-report=html --cov-report=term-missing --cov-fail-under=50 -m "not auth"
# - name: Test build docs 📖
# run: mkdocs build --verbose --site-dir docs_public
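
The -m "not auth" flag deselects any test marked with the auth marker, presumably those exercising authenticated Earthdata downloads, so the suite can still pass on runners where the EARTHDATA secrets resolve to nothing. A marked test would look roughly like the following sketch; the test name and body are made-up placeholders:

import os

import pytest


@pytest.mark.auth  # deselected on CI by: pytest -m "not auth"
def test_earthdata_download():
    """Placeholder for a test that needs Earthdata credentials."""
    assert os.environ["EARTHDATA_USERNAME"]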
66 changes: 25 additions & 41 deletions .gitignore
@@ -20,6 +20,7 @@ parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
@@ -49,7 +50,6 @@ coverage.xml
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
@@ -72,7 +72,6 @@ instance/
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
@@ -83,35 +82,9 @@ profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock

# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/latest/usage/project/#working-with-version-control
.pdm.toml
.pdm-python
.pdm-build/

# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
.python-version

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
@@ -148,15 +121,26 @@ dmypy.json
# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/
# PyCharm stuff
.idea/

# VSCode stuff
.vscode

# Docker stuff
.devcontainer

# Cython debug symbols
cython_debug/
# MacBook Finder
.DS_Store

# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/
# Experiment results
outputs/


*.pyc
__pycache__/
*lightning_logs
*prithvi/
nul/
notebooks/
!notebooks/*.ipynb
88 changes: 88 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,88 @@
# Pre-commit hooks for repo.

# Packages:
# pre-commit: General-purpose formatting hooks.
# black: Strict Python code formatting.
# pyupgrade: Upgrades syntax to newer versions of the language.
# isort: Sorts imports.
# flake8: Checks code follows the PEP 8 style guide.
# mypy: Static type checking.
# conventional-pre-commit: Commit message format checker.
# blacken-docs: Applies black formatting to code blocks in docs.
# pydocstyle: Checks docstring style.

default_stages: ["commit", "commit-msg", "push"]
default_language_version:
python: python3.10

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.5.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
exclude_types: [image]
- id: check-merge-conflict
- id: debug-statements
- id: mixed-line-ending
- id: check-yaml

- repo: https://github.com/psf/black
rev: 23.11.0
hooks:
- id: black
exclude_types: [image]
language_version: python3

- repo: https://github.com/asottile/pyupgrade
rev: v3.15.0
hooks:
- id: pyupgrade

- repo: https://github.com/timothycrosley/isort
rev: 5.12.0
hooks:
- id: isort
args: ["--profile", "black", "--filter-files"]

- repo: https://github.com/PyCQA/flake8
rev: 6.1.0
hooks:
- id: flake8
args:
[
"--max-line-length=100",
"--extend-ignore=E203,BLK100",
"--exclude=*tests*",
]

- repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.7.0
hooks:
- id: mypy
exclude: ^docs/|test
args: [--config-file=setup.cfg]

- repo: https://github.com/compilerla/conventional-pre-commit
rev: v3.0.0
hooks:
- id: conventional-pre-commit
stages: [commit-msg]

- repo: https://github.com/asottile/blacken-docs
rev: 1.16.0
hooks:
- id: blacken-docs
additional_dependencies: [black>=22.1.0]
language_version: python3

- repo: https://github.com/pycqa/pydocstyle
rev: 6.3.0
hooks:
- id: pydocstyle
name: Checking docstring style.
args:
[
"--convention=google",
"--match=^((?!test).)*$",
]
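
Two hooks here deserve a note: conventional-pre-commit rejects commit messages that do not follow the Conventional Commits format (for example, "feat: add coverage config"), and pydocstyle with --convention=google enforces Google-style docstrings on non-test modules. For reference, a conforming docstring looks like the sketch below; the function itself is illustrative, not part of this PR:

import numpy as np


def scale_band(band: np.ndarray, factor: float = 1e-4) -> np.ndarray:
    """Scale a raster band by a constant factor.

    Args:
        band (np.ndarray): Input band as an array of pixel values.
        factor (float): Multiplier applied to every pixel.

    Returns:
        np.ndarray: The scaled band.
    """
    return band * factor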
11 changes: 5 additions & 6 deletions instageo/model/dataloader.py
@@ -123,9 +123,8 @@ def process_and_augment(
label = None
# convert to PIL for easier transforms
ims = [Image.fromarray(im) for im in ims]
if not (y is None):
label = y.copy()
label = Image.fromarray(label.squeeze())
if y is not None:
label = Image.fromarray(y.copy().squeeze())
if train:
ims, label = random_crop_and_flip(ims, label, im_size)
ims, label = normalize_and_convert_to_tensor(ims, label, mean, std, temporal_size)
@@ -245,8 +244,8 @@ def get_raster_data(
data = src.read()
if (not is_label) and bands:
data = data[bands, ...]
# For some reasons, some few HLS tiles are not scaled. In the following lines,
# we find and scale them
# For some reason, a few HLS v2.0 tiles are not scaled.
# In the following lines, we find and scale them.
bands = []
for band in data:
if band.max() > 10:
@@ -342,7 +341,7 @@ def __init__(
constant_multiplier: float,
bands: List[int] | None = None,
):
"""Dataset Class for loading and preprocessing Sentinel11Floods dataset.
"""Dataset Class for loading and preprocessing the dataset.

Args:
filename (str): Filename of the CSV file containing data paths.
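On the scaling fix in get_raster_data above: HLS v2.0 surface-reflectance products are normally distributed with a 1e-4 scale factor already applied, so valid reflectance values sit roughly in [0, 1], and a band whose maximum exceeds 10 is taken to hold raw values. A minimal sketch of that heuristic under those assumptions (the helper name and hard-coded factor are illustrative, not code from this PR):

import numpy as np


def maybe_scale_hls_band(band: np.ndarray, scale: float = 1e-4) -> np.ndarray:
    """Scale a band that looks like raw (unscaled) HLS values."""
    # Pre-scaled reflectance stays near [0, 1]; raw values run well above 10.
    return band * scale if band.max() > 10 else band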
1 change: 1 addition & 0 deletions instageo/model/requirements.txt
@@ -5,3 +5,4 @@ einops
tensorboard
omegaconf
hydra-core
torchvision
24 changes: 18 additions & 6 deletions instageo/model/run.py
Expand Up @@ -85,6 +85,23 @@ def get_device() -> str:
return device


def custom_collate_fn(
    batch: list[tuple[torch.Tensor, torch.Tensor]]
) -> tuple[torch.Tensor, torch.Tensor]:
    """Collate function for the test DataLoader.

    Each dataset item already carries a leading batch-like dimension, so the
    per-item features and labels are concatenated along dim 0 rather than
    stacked.

    Args:
        batch (list[tuple[Tensor, Tensor]]): A list of (features, labels) tuples.

    Returns:
        Tuple of (x, y) concatenated into separate tensors.
    """
    data = torch.cat([a[0] for a in batch], 0)
    labels = torch.cat([a[1] for a in batch], 0)
    return data, labels


def create_dataloader(
dataset: Dataset,
batch_size: int,
@@ -522,12 +539,7 @@ def main(cfg: DictConfig) -> None:
constant_multiplier=cfg.dataloader.constant_multiplier,
)
test_loader = create_dataloader(
test_dataset,
batch_size=batch_size,
collate_fn=lambda x: (
torch.cat([a[0] for a in x], 0),
torch.cat([a[1] for a in x], 0),
),
test_dataset, batch_size=batch_size, collate_fn=custom_collate_fn
)
model = PrithviSegmentationModule.load_from_checkpoint(
checkpoint_path,
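To illustrate the refactor above, the toy batch below (shapes invented for the example, with custom_collate_fn from the earlier hunk in scope) shows that samples are concatenated along the first dimension rather than stacked:

import torch

batch = [
    (torch.zeros(2, 6, 224, 224), torch.zeros(2, 224, 224)),
    (torch.ones(3, 6, 224, 224), torch.ones(3, 224, 224)),
]
data, labels = custom_collate_fn(batch)
assert data.shape == (5, 6, 224, 224)
assert labels.shape == (5, 224, 224)

Beyond readability, a named module-level function, unlike a lambda, can be pickled, which matters if the test DataLoader is ever run with num_workers > 0 under a spawn start method.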