DM-45657: Add tests for consdb-hinfo and consdb-pqserver (#33, +200 −65 lines)

Merged 8 commits on Aug 15, 2024. Changes from all commits are shown below.
26 changes: 6 additions & 20 deletions .github/workflows/pytest.yaml
@@ -14,25 +14,11 @@ jobs:
     runs-on: ubuntu-latest

     steps:
-      - uses: actions/checkout@v4
+      - name: Checkout code
+        uses: actions/checkout@v4

-      - name: Setup Python
-        uses: actions/setup-python@v5
-        with:
-          python-version: "3.11"
-          cache: "pip"
+      - name: Build docker image
+        run: docker build -f Dockerfile.pytest -t pytest_image .
Contributor comment:

Wondering if this could be done in a container in the workflow: https://docs.github.com/en/actions/writing-workflows/workflow-syntax-for-github-actions#jobsjob_idcontainer
instead of building a new container and running in it.

The advantage would be that more of the test code is in the workflow file, where it's more visible, than in another Dockerfile.
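For reference, the reviewer's `container:` suggestion would look roughly like this — a sketch only, assuming the test image were published somewhere pullable (the `ghcr.io/lsst-dm/consdb-pytest` image name is hypothetical):

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    # Every step runs inside this container; no docker build/run steps needed.
    container:
      image: ghcr.io/lsst-dm/consdb-pytest:latest  # hypothetical published image
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Test with pytest
        run: pytest --color=yes .
```

One trade-off: `jobs.<job_id>.container` requires a pre-built image from a registry, so the Dockerfile.pytest build would have to move into a separate publish step.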


-      - name: Editable mode install
-        run: |
-          python -m pip install uv
-          uv pip install --system pytest pytest-cov pytest-html pytest-asyncio httpx
-          uv pip install --system -e .
-      - name: Test with pytest
-        run: |
-          pytest --cov=./ --cov-report=html --html=pytest_report.html --self-contained-html
-
-      - name: Upload coverage to codecov
-        uses: codecov/codecov-action@v4
-        with:
-          files: ./coverage.xml
-          token: ${{ secrets.CODECOV_TOKEN }}
+      - name: Run docker image
+        run: docker run --rm -e PYTEST_ADDOPTS="--color=yes" pytest_image
11 changes: 8 additions & 3 deletions Dockerfile.hinfo
@@ -1,9 +1,14 @@
 ARG OBS_LSST_VERSION=w_2024_21
 FROM lsstsqre/centos:7-stack-lsst_distrib-${OBS_LSST_VERSION}
 USER lsst

 RUN source loadLSST.bash && mamba install aiokafka httpx
-RUN source loadLSST.bash && pip install kafkit
-COPY python/lsst/consdb/hinfo.py python/lsst/consdb/utils.py ./hinfo/
+RUN source loadLSST.bash && pip install kafkit aiokafka httpx
Contributor comment:

We generally try to avoid pip installing into conda environments whenever possible. Although it works pretty well nowadays, there's still some iffiness about the integration.

Collaborator (author) reply:

I tried to do this using only mamba, but it wouldn't cooperate. I think it's because we have to rely on newer versions of aiokafka that have not made their way to conda. My understanding is generally you can get away with mixing pip and conda as long as you never use conda again once you've used pip.
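The rule of thumb in the reply above (conda first, pip last, and no conda again afterwards) translates into a layer ordering like this — a sketch reusing this PR's package names to show the ordering, not a drop-in replacement:

```dockerfile
# 1. Do all conda/mamba installs first, while conda still owns the environment.
RUN source loadLSST.bash && mamba install aiokafka httpx
# 2. pip comes last: it may shadow or overwrite conda-managed packages,
#    so no mamba/conda install should appear in any later layer.
RUN source loadLSST.bash && pip install kafkit
```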


+WORKDIR /home/lsst/
+COPY --chown=lsst . ./consdb/
Contributor comment:

This seems like it copies too much for the hinfo service.

+WORKDIR /home/lsst/consdb/
+RUN source /opt/lsst/software/stack/loadLSST.bash && pip install -e .
Contributor comment:

I'm not sure why this is needed as well as the install above.


# Environment variables that must be set:
# INSTRUMENT: LATISS, LSSTComCam, LSSTCam
@@ -16,4 +21,4 @@ COPY python/lsst/consdb/hinfo.py python/lsst/consdb/utils.py ./hinfo/
# KAFKA_GROUP_ID: name of consumer group, default is "consdb-consumer"
# KAFKA_USERNAME: username for SASL_PLAIN authentication, default is "consdb"

-ENTRYPOINT [ "bash", "-c", "source loadLSST.bash; setup obs_lsst; python ./hinfo/hinfo.py" ]
+ENTRYPOINT [ "bash", "-c", "source /opt/lsst/software/stack/loadLSST.bash; setup obs_lsst; python -m lsst.consdb.hinfo" ]
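The switch from `python ./hinfo/hinfo.py` to `python -m lsst.consdb.hinfo` pairs with the `from .utils import ...` change in hinfo.py: relative imports resolve only when the module runs as part of a package, not as a loose script. A self-contained demonstration of that difference (the `pkg`/`mod` names are invented for illustration):

```python
import subprocess
import sys
import tempfile
from pathlib import Path

# Build a throwaway package containing a relative import, mirroring
# hinfo.py's "from .utils import ..." pattern.
with tempfile.TemporaryDirectory() as tmp:
    pkg = Path(tmp) / "pkg"
    pkg.mkdir()
    (pkg / "__init__.py").write_text("")
    (pkg / "utils.py").write_text("VALUE = 42\n")
    (pkg / "mod.py").write_text("from .utils import VALUE\nprint(VALUE)\n")

    # Running as a module (python -m) resolves the relative import.
    ok = subprocess.run(
        [sys.executable, "-m", "pkg.mod"], cwd=tmp, capture_output=True, text=True
    )
    print("as module:", ok.stdout.strip())  # as module: 42

    # Running the file directly fails: there is no known parent package.
    bad = subprocess.run(
        [sys.executable, str(pkg / "mod.py")], capture_output=True, text=True
    )
    print("as script ok?", bad.returncode == 0)  # as script ok? False
```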
12 changes: 12 additions & 0 deletions Dockerfile.pytest
@@ -0,0 +1,12 @@
+ARG OBS_LSST_VERSION=w_2024_21
+FROM lsstsqre/centos:7-stack-lsst_distrib-${OBS_LSST_VERSION}
+USER lsst
+RUN source loadLSST.bash && mamba install aiokafka httpx
+RUN source loadLSST.bash && pip install kafkit aiokafka httpx pytest-asyncio
+
+WORKDIR /home/lsst/
+COPY --chown=lsst . ./consdb/
+WORKDIR /home/lsst/consdb/
+RUN source /opt/lsst/software/stack/loadLSST.bash && pip install -e .
+
+ENTRYPOINT [ "/bin/bash", "-c", "source /opt/lsst/software/stack/loadLSST.bash; setup obs_lsst; pytest ." ]
27 changes: 18 additions & 9 deletions python/lsst/consdb/hinfo.py
@@ -19,7 +19,8 @@
 from lsst.resources import ResourcePath
 from sqlalchemy import MetaData, Table
 from sqlalchemy.dialects.postgresql import insert
-from utils import setup_logging, setup_postgres
+
+from .utils import setup_logging, setup_postgres

if TYPE_CHECKING:
import lsst.afw.cameraGeom # type: ignore
@@ -442,14 +443,14 @@ def get_kafka_config() -> KafkaConfig:

 logger = setup_logging("consdb.hinfo")

-instrument = os.environ["INSTRUMENT"]
-logger.info(f"Instrument = {instrument}")
+instrument = os.environ.get("INSTRUMENT", "")
+logger.info(f"{instrument=}")
 bucket_prefix = os.environ.get("BUCKET_PREFIX", "")
 if bucket_prefix:
     os.environ["LSST_DISABLE_BUCKET_VALIDATION"] = "1"


-engine = setup_postgres()
+engine = None


@dataclass
@@ -480,10 +481,7 @@ def __init__(self, instrument_name, translator, instrument_mapping, det_mapping,
#################


-async def main() -> None:
-    """Handle Header Service largeFileObjectAvailable messages."""
-    global logger, instrument, bucket_prefix, TOPIC_MAPPING
-
+def get_instrument_dict(instrument: str) -> dict:
     if instrument == "LATISS":
         instrument_dict = {
             "O": Instrument(
@@ -531,6 +529,15 @@ async def main() -> None:
     else:
         raise ValueError("Unrecognized instrument: {instrument}")

+    return instrument_dict
+
+
+async def main() -> None:
+    """Handle Header Service largeFileObjectAvailable messages."""
+    global logger, instrument, bucket_prefix, TOPIC_MAPPING
+
+    instrument_dict = get_instrument_dict(instrument)
+
     topic = f"lsst.sal.{TOPIC_MAPPING[instrument]}.logevent_largeFileObjectAvailable"
     kafka_config = get_kafka_config()
     async with httpx.AsyncClient() as client:
@@ -570,4 +577,6 @@ async def main() -> None:
         await consumer.stop()


-asyncio.run(main())
+if __name__ == "__main__":
+    engine = setup_postgres()
+    asyncio.run(main())
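Moving `setup_postgres()` under the `__main__` guard (with `engine = None` at module scope) is what lets pytest import the module without a live database. The same deferred-initialization idea can be sketched as a lazy accessor; `connect()` here is a stand-in for illustration, not the PR's actual API:

```python
# Module-level state stays None at import time, so importing this module
# (as a test would) never opens a database connection.
engine = None
connect_calls = 0


def connect():
    """Stand-in for the real setup_postgres(); tests can monkeypatch this."""
    global connect_calls
    connect_calls += 1
    return object()


def get_engine():
    """Create the engine on first use instead of at import time."""
    global engine
    if engine is None:
        engine = connect()
    return engine


# Import time: nothing has connected yet.
assert engine is None and connect_calls == 0

# First use connects exactly once; later calls reuse the same engine.
first = get_engine()
assert get_engine() is first and connect_calls == 1
```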