
Fix CI #741

Closed · wants to merge 30 commits

Conversation

mikemhenry (Contributor)

Description

Provide a brief description of the PR's purpose here.

Todos

  • Implement feature / fix bug
  • Add tests
  • Update documentation as needed
  • Update changelog to summarize changes in behavior, enhancements, and bugfixes implemented in this PR

Status

  • Ready to go

Changelog message


ijpulidos (Contributor)

Locally it's failing for me with scipy 1.14 but not with previous versions. Checking that in CI now.

ijpulidos (Contributor) commented Aug 16, 2024

And now it seems that nose is using outdated methods from the inspect module that have been deprecated for a while and were removed in Python 3.11, as far as I can see. It seems like migrating to pytest is going to be required now!

https://github.com/choderalab/openmmtools/actions/runs/10424368899/job/28873034656#step:6:64
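For reference, a minimal illustration (not from the CI log) of the removed API and its modern replacements; nose's internals call inspect.getargspec, which Python 3.11 dropped:

import inspect

def f(x, y=2, *args, **kwargs):
    pass

# inspect.getargspec(f)           # deprecated for years, removed in Python 3.11
print(inspect.getfullargspec(f))  # closest drop-in replacement
print(inspect.signature(f))       # preferred modern API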

ijpulidos (Contributor)

Related issue #661

ijpulidos mentioned this pull request Aug 16, 2024
mikemhenry (Contributor, Author)

Do we want to support Windows? I am also not sure if this is a "real" error or not, but IMHO we should catch ZeroDivisionError and set self._timing_data["ns_per_day"] = NaN or something:

================================== FAILURES ===================================
______ TestHarmonicOscillatorsSAMSSampler.test_without_unsampled_states _______
[gw2] win32 -- Python 3.10.0 C:\Users\runneradmin\micromamba\envs\openmmtools-test\python.exe

self = <test_sampling.TestHarmonicOscillatorsSAMSSampler object at 0x000001A6B7C13880>

    def test_without_unsampled_states(self):
        """Test multistate sampler on a harmonic oscillator without unsampled endstates"""
>       self.run(include_unsampled_states=False)

openmmtools\tests\test_sampling.py:315: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
openmmtools\tests\test_sampling.py:199: in run
    simulation.run()
C:\Users\runneradmin\micromamba\envs\openmmtools-test\lib\site-packages\openmmtools\multistate\multistatesampler.py:798: in run
    self._update_timing(iteration_time, partial_total_time, run_initial_iteration, iteration_limit)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <instance of SAMSSampler>, iteration_time = 0.0, partial_total_time = 0.0
run_initial_iteration = 0, iteration_limit = 5000

    def _update_timing(self, iteration_time, partial_total_time, run_initial_iteration, iteration_limit):
        """
        Function that computes and transmits timing information to reporter.
    
        Parameters
        ----------
        iteration_time : float
            Time took in the iteration.
        partial_total_time : float
            Partial total time elapsed.
        run_initial_iteration : int
            Iteration where to start/resume the simulation.
        iteration_limit : int
            Hard limit on number of iterations to be run by the sampler.
        """
        self._timing_data["iteration_seconds"] = iteration_time
        self._timing_data["average_seconds_per_iteration"] = \
            partial_total_time / (self._iteration - run_initial_iteration)
        estimated_timedelta_remaining = datetime.timedelta(
            seconds=self._timing_data["average_seconds_per_iteration"] * (iteration_limit - self._iteration)
        )
        estimated_finish_date = datetime.datetime.now() + estimated_timedelta_remaining
        self._timing_data["estimated_time_remaining"] = str(estimated_timedelta_remaining)  # Putting it in dict as str
        self._timing_data["estimated_localtime_finish_date"] = estimated_finish_date.strftime("%Y-%b-%d-%H:%M:%S")
        total_time_in_seconds = datetime.timedelta(
            seconds=self._timing_data["average_seconds_per_iteration"] * iteration_limit
        )
        self._timing_data["estimated_total_time"] = str(total_time_in_seconds)
    
        # Estimate performance
        moves_iterator = self._flatten_moves_iterator()
        # Only consider "dynamic" moves (timestep and n_steps attributes)
        moves_times = [move.timestep.value_in_unit(unit.nanosecond) * move.n_steps for move in moves_iterator if
                       hasattr(move, "timestep") and hasattr(move, "n_steps")]
        iteration_simulated_nanoseconds = sum(moves_times)
        seconds_in_a_day = (1 * unit.day).value_in_unit(unit.seconds)
>       self._timing_data["ns_per_day"] = iteration_simulated_nanoseconds / (
                    self._timing_data["average_seconds_per_iteration"] / seconds_in_a_day)
E       ZeroDivisionError: float division by zero

C:\Users\runneradmin\micromamba\envs\openmmtools-test\lib\site-packages\openmmtools\multistate\multistatesampler.py:1779: ZeroDivisionError
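A standalone sketch of the NaN fallback suggested above; the helper name is hypothetical, and the real fix would live inside _update_timing:

import math

def ns_per_day(simulated_ns, average_seconds_per_iteration, seconds_in_a_day=86400.0):
    """Estimate simulated ns/day, falling back to NaN for zero-length iterations."""
    try:
        return simulated_ns / (average_seconds_per_iteration / seconds_in_a_day)
    except ZeroDivisionError:
        # On Windows the timer can report 0.0 s for a fast iteration,
        # which is exactly the failure in the traceback above.
        return math.nan

print(ns_per_day(0.002, 0.0))  # nan instead of a crash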

codecov bot commented Aug 20, 2024

Codecov Report

Attention: Patch coverage is 95.00000% with 27 lines in your changes missing coverage. Please review.

Project coverage is 85.09%. Comparing base (9fc8ab7) to head (e54b815).
Report is 14 commits behind head on main.


ijpulidos (Contributor)

I'd vote for making Windows support not required. As in, we can make the Windows CI job optional.

ijpulidos (Contributor) commented Aug 20, 2024

More importantly, I think we are not running a big chunk of the tests after changing to pytest. Locally I'm getting the following:

  • With nose on main branch: Ran 2044 tests in 2222.346s
  • With pytest (this branch): 342 passed, 14 skipped, 1 xfailed, 1495 warnings in 973.57s (0:16:13)

I'm not sure pytest and nose count tests in the same way, but I think this is something we should double-check before merging these changes. I can dig into exactly which tests are missing, but maybe it's related to that xfailed one?

EDITED: I ran pytest with -n 1, so the timings are also significantly different, which might be suspicious.
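One way to pin down the missing tests is to compare what each runner collects without executing anything (a sketch; the name formats differ, so the lists need some massaging before diffing, and nose's --with-doctest plus yield-style generator tests, which modern pytest no longer supports, are common sources of count differences):

❯ nosetests --collect-only -v openmmtools/tests 2> nose_tests.txt
❯ pytest --collect-only -q openmmtools/tests > pytest_tests.txt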

ijpulidos (Contributor)

> Do we want to support Windows? I am also not sure if this is a "real" error or not, but IMHO we should catch ZeroDivisionError and set self._timing_data["ns_per_day"] = NaN or something

Hmm, good question. I don't know whether catching the ZeroDivisionError would mask other errors, but it's definitely something to think about.

mikemhenry (Contributor, Author)

---------- coverage: platform linux, python 3.11.0-final-0 -----------
Name                                           Stmts   Miss  Cover
------------------------------------------------------------------
openmmtools/__init__.py                            8      8     0%
openmmtools/alchemy/__init__.py                    1      1     0%
openmmtools/alchemy/alchemy.py                   864    864     0%
openmmtools/cache.py                             288    288     0%
openmmtools/constants.py                          11     11     0%
openmmtools/forcefactories.py                     69     69     0%
openmmtools/forces.py                            361    361     0%
openmmtools/integrators.py                       764    764     0%
openmmtools/mcmc.py                              463    463     0%
openmmtools/multistate/__init__.py                 7      7     0%
openmmtools/multistate/multistateanalyzer.py    1016   1016     0%
openmmtools/multistate/multistatereporter.py     712    712     0%
openmmtools/multistate/multistatesampler.py      735    735     0%
openmmtools/multistate/paralleltempering.py       57     57     0%
openmmtools/multistate/pymbar.py                  25     25     0%
openmmtools/multistate/replicaexchange.py        118    118     0%
openmmtools/multistate/sams.py                   287    287     0%
openmmtools/multistate/utils.py                   68     68     0%
openmmtools/respa.py                              33     33     0%
openmmtools/scripts/__init__.py                    0      0   100%
openmmtools/scripts/test_openmm_platforms.py     257    257     0%
openmmtools/sobol.py                             159    159     0%
openmmtools/states.py                           1188   1188     0%
openmmtools/storage/__init__.py                    2      2     0%
openmmtools/storage/iodrivers.py                 713    713     0%
openmmtools/storage/storageinterface.py          138    138     0%
openmmtools/testsystems.py                      1422   1422     0%
openmmtools/utils/__init__.py                      2      2     0%
openmmtools/utils/equilibration.py                74     74     0%
openmmtools/utils/utils.py                       361    361     0%
------------------------------------------------------------------
TOTAL                                          10203  10203     0%
Coverage XML written to file coverage.xml

I've seen this before; I think it has to do with not doing a dev install.
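If that is the cause, the usual fix (a sketch, assuming coverage is measuring the installed copy of the package rather than the checkout) is an editable install before running the tests:

❯ python -m pip install -e . --no-deps
❯ pytest -n 1 --cov=openmmtools --cov-report term openmmtools/tests/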

ijpulidos (Contributor) commented Aug 21, 2024

I ran the tests locally and this is what I'm getting for the coverage reports. They seem fine as far as I can tell (nose ran on the main branch, pytest on this branch):

❯ nosetests openmmtools/tests --nocapture --cover-tests --with-coverage --cover-package=openmmtools --verbosity=2 --with-timer --with-doctest -a '!slow'

Name                                           Stmts   Miss  Cover
------------------------------------------------------------------
openmmtools/__init__.py                            7      0   100%
openmmtools/_version.py                          279    279     0%
openmmtools/alchemy/__init__.py                    1      0   100%
openmmtools/alchemy/alchemy.py                   890    115    87%
openmmtools/cache.py                             288     22    92%
openmmtools/constants.py                          11      2    82%
openmmtools/forcefactories.py                     70     12    83%
openmmtools/forces.py                            362     24    93%
openmmtools/integrators.py                       765     47    94%
openmmtools/mcmc.py                              463     38    92%
openmmtools/multistate/__init__.py                 8      0   100%
openmmtools/multistate/multistateanalyzer.py    1017    512    50%
openmmtools/multistate/multistatereporter.py     713     74    90%
openmmtools/multistate/multistatesampler.py      736    134    82%
openmmtools/multistate/paralleltempering.py       58      8    86%
openmmtools/multistate/pymbar.py                  25      8    68%
openmmtools/multistate/replicaexchange.py        119     32    73%
openmmtools/multistate/sams.py                   288     92    68%
openmmtools/multistate/utils.py                   70     26    63%
openmmtools/respa.py                              33      6    82%
openmmtools/scripts/__init__.py                    0      0   100%
openmmtools/scripts/test_openmm_platforms.py     258    236     9%
openmmtools/sobol.py                             157     58    63%
openmmtools/states.py                           1189     69    94%
openmmtools/storage/__init__.py                    3      0   100%
openmmtools/storage/iodrivers.py                 714     66    91%
openmmtools/storage/storageinterface.py          137     10    93%
openmmtools/testsystems.py                      1421    145    90%
openmmtools/utils/__init__.py                      2      0   100%
openmmtools/utils/equilibration.py                74     12    84%
openmmtools/utils/utils.py                       362     42    88%
------------------------------------------------------------------
TOTAL                                          10520   2069    80%


❯ pytest -n 1 -v --cov-report term --cov=openmmtools --color=yes openmmtools/tests/

---------- coverage: platform linux, python 3.9.16-final-0 -----------
Name                                           Stmts   Miss  Cover
------------------------------------------------------------------
openmmtools/__init__.py                            9      0   100%
openmmtools/alchemy/__init__.py                    1      0   100%
openmmtools/alchemy/alchemy.py                   890    115    87%
openmmtools/cache.py                             288     22    92%
openmmtools/constants.py                          11      2    82%
openmmtools/forcefactories.py                     70     12    83%
openmmtools/forces.py                            362     24    93%
openmmtools/integrators.py                       765     48    94%
openmmtools/mcmc.py                              463     52    89%
openmmtools/multistate/__init__.py                 8      0   100%
openmmtools/multistate/multistateanalyzer.py    1017    512    50%
openmmtools/multistate/multistatereporter.py     713     74    90%
openmmtools/multistate/multistatesampler.py      736    134    82%
openmmtools/multistate/paralleltempering.py       58      8    86%
openmmtools/multistate/pymbar.py                  25      8    68%
openmmtools/multistate/replicaexchange.py        119     32    73%
openmmtools/multistate/sams.py                   288     92    68%
openmmtools/multistate/utils.py                   70     26    63%
openmmtools/respa.py                              33      6    82%
openmmtools/scripts/__init__.py                    0      0   100%
openmmtools/scripts/test_openmm_platforms.py     258    236     9%
openmmtools/sobol.py                             157     58    63%
openmmtools/states.py                           1189     69    94%
openmmtools/storage/__init__.py                    3      0   100%
openmmtools/storage/iodrivers.py                 714     66    91%
openmmtools/storage/storageinterface.py          137     10    93%
openmmtools/testsystems.py                      1421    145    90%
openmmtools/utils/__init__.py                      2      0   100%
openmmtools/utils/equilibration.py                74     12    84%
openmmtools/utils/utils.py                       362     42    88%
------------------------------------------------------------------
TOTAL                                          10243   1805    82%

ijpulidos (Contributor)

Oh, never mind, I do see a difference in the coverage of openmmtools/mcmc.py.

mikemhenry (Contributor, Author)

Slow tests passed! It took 3.5 hours, so ~$1.82 per run

ijpulidos (Contributor)

@mikemhenry nice! We would need to separate the ruff/black/formatter changes into a separate commit; that way it will be much easier for me to review, and hopefully we can also ignore those commits so that we don't change the blame, which I think is important to maintain.

mikemhenry (Contributor, Author)

@ijpulidos when we merge this in, it will be squashed into a single commit, which we can add to .git-blame-ignore-revs: https://docs.github.com/en/repositories/working-with-files/using-files/viewing-a-file#ignore-commits-in-the-blame-view

> We would need to separate the ruff/black/formatter changes into a separate commit

I will see if I can rebase things that way

mikemhenry (Contributor, Author)

Or maybe it would be better to squash everything into two commits, automated refactor and non-automated, and only add the automated-refactor commit to .git-blame-ignore-revs, then merge in the PR instead of squash-and-merge.

ijpulidos (Contributor)

> Or maybe it would be better to squash everything into two commits, automated refactor and non-automated, and only add the automated-refactor commit to .git-blame-ignore-revs, then merge in the PR instead of squash-and-merge.

Yes, that would definitely make the review easier. Considering it's such a big set of changes otherwise, I would just review the non-automated commit.

mikemhenry (Contributor, Author)

I will get this rebased into a few smaller commits. I don't think I can get it down to two, but I will tag and link the non-automated commits when it is ready.

mikemhenry (Contributor, Author)

Okay, I've rebased a few commits onto a different branch (in case I made a mistake, I didn't want to wreck all of my work). Once you are good with it, I will add a .git-blame-ignore-revs and merge in that branch.

Commits to review:
c710d68
c8882e1
be3287b
c5c9acc

Commits to add to .git-blame-ignore-revs (automated formatting stuff)
562e0d3
7ebe1ab
e7144c1 # this one is where I updated versioneer
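For reference, a sketch of how that file gets wired up. GitHub picks up a root-level .git-blame-ignore-revs on the default branch automatically, while local git blame needs the config below; the file requires full 40-character SHAs, so the short hashes above would have to be expanded:

❯ git config blame.ignoreRevsFile .git-blame-ignore-revs

# .git-blame-ignore-revs (one full SHA per line)
<full SHA of 562e0d3>  # automated formatting
<full SHA of 7ebe1ab>  # automated formatting
<full SHA of e7144c1>  # versioneer update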

ijpulidos (Contributor) left a comment

This is great! Finally we can move to a modern unit testing infrastructure and GPU CI. There are just minor non-blocking comments.

I think we have talked about how nose counts tests differently from pytest, and the coverage information should account for this. We want to make sure we are doing as well as or better than before in terms of coverage.

Great work! LGTM!

Resolved review threads:

  • openmmtools/tests/test_sampling.py
  • openmmtools/tests/test_storage_interface.py
  • openmmtools/tests/test_utils.py
  • .github/workflows/CI.yml
  • devtools/conda-envs/test_env.yaml
@@ -18,15 +18,14 @@ dependencies:
   - python
   - pyyaml
-  - scipy
+  - scipy <1.14
ijpulidos (Contributor)

Non-blocking comment: why is it that we are not supporting scipy 1.14+? I wonder if we then have to restrict the version on the conda-forge package, and also patch the currently released packages so that older versions don't get installed when environments with scipy 1.14 are resolved. I hope that makes sense.

mikemhenry (Contributor, Author)

I will relax the pin and see if we have any issues. I am pretty sure it was actually something to do with nose testing or a numba thing.

ijpulidos (Contributor)

Yes, that should be a quick test, great!

ijpulidos (Contributor)

Oh, I think it was probably related to #743, which should already have been fixed on the OpenMM side. But if we are supporting older OpenMM releases then we might still need the pin.

mikemhenry (Contributor, Author)

We will need the pin until conda-forge/openmm-feedstock#139 gets merged and we patch the older releases.

Resolved review thread: .github/workflows/gpu-runner.yaml
ijpulidos added this to the 0.23.2 milestone Sep 12, 2024
mikemhenry added a commit that referenced this pull request Sep 13, 2024
Update to PyTest (Rebased Edition)
See #741
mikemhenry (Contributor, Author)

Done in #746

mikemhenry closed this Sep 13, 2024