All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
- Added a Docker-based SLURM cluster in the CI pipeline for testing the plugin.
- Ignoring the `tests/docker_tests` directory from pytest.
- `ssh_key_file` and `cert_file` paths will now be expanded and resolved to absolute paths instead of being required to be absolute in the first place.
- Added an `ignore_versions` parameter to `SlurmExecutor` to allow ignoring the versions of `python`, `covalent`, and `cloudpickle` when submitting jobs on the remote machine (see the sketch below).
- Added further improved error messages in the Slurm script.
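A minimal sketch of how the new `ignore_versions` parameter might be combined with the relaxed key/certificate paths. The connection details are placeholders, and treating `ignore_versions` as a list of package names is an assumption, not confirmed by this changelog:

```python
import covalent as ct
from covalent_slurm_plugin import SlurmExecutor

executor = SlurmExecutor(
    address="slurm-login.example.com",      # placeholder host
    username="user",                        # placeholder user
    ssh_key_file="~/.ssh/slurm_key",        # now expanded and resolved to an absolute path
    cert_file="~/.ssh/slurm_key-cert.pub",  # likewise no longer required to be absolute
    ignore_versions=["python", "covalent", "cloudpickle"],  # assumed to accept package names
)

@ct.electron(executor=executor)
def add(a, b):
    return a + b
```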
- Added a Dockerfile and its dependent files so that the plugin can be tested using a Docker container running Slurm.
- Added a README.md for the Docker tests with steps to manually test the plugin.
- Changed the strict version pin of `aiofiles` to `>=` instead of `==`.
- Excluding tests from the built package.
- Removed python version pin for pre-commit hooks.
- add a new `variables` parameter for environment variables (see the sketch below)
- add a new error-catching python execution script (add new module)
- add checks inside submit script for `covalent` and `cloudpickle` versions
- clean up job script creation (add new module)
- export `COVALENT_CONFIG_DIR=/tmp` inside sbatch script to enable filelock
- update plugin defaults to use `BaseModel` instead of `dict`
- change to actually get errors from these checks
- use `Path` everywhere instead of `os.path` operations
- allow `poll_freq >= 10` seconds, instead of 60 seconds
- misc. cleanups and refactoring
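A short sketch of the `variables` and `poll_freq` changes listed above; treating `variables` as a mapping of environment variable names to values is an assumption, and the connection arguments are placeholders:

```python
from covalent_slurm_plugin import SlurmExecutor

executor = SlurmExecutor(
    address="slurm-login.example.com",   # placeholder host
    username="user",                     # placeholder user
    ssh_key_file="~/.ssh/slurm_key",     # placeholder key path
    variables={"OMP_NUM_THREADS": "4"},  # assumed: exported as environment variables for the job
    poll_freq=10,                        # polling can now be as frequent as every 10 s (was 60 s minimum)
)
```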
- Aesthetics and string formatting
- remove addition of `COVALENT_CONFIG_DIR=/tmp` to sbatch script
- Removed the `sshproxy` interface.
- Updated the init signature: kwargs replaced with the parent's for better documentation.
- Updated license to Apache
- Add missing `,` to README.
- A new config variable, `bashrc_path`, which is the path to the bashrc script to source.
- Removed automatic sourcing of `$HOME/.bashrc` from the SLURM submit script.
- Does not put conda-related lines in the SLURM script if `conda_env` is set to `False` or `""`.
- Changed default config value of `conda_env` from `None` to `""` (see the sketch below).
- A proper `ValueError` will now be raised if `ssh_key_file` is not supplied.
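A minimal sketch combining `bashrc_path` with the new `conda_env` behavior; the values are placeholders and anything beyond what the entries above state is an assumption:

```python
from covalent_slurm_plugin import SlurmExecutor

executor = SlurmExecutor(
    address="slurm-login.example.com",  # placeholder host
    username="user",                    # placeholder user
    ssh_key_file="~/.ssh/slurm_key",    # omitting this now raises a ValueError
    bashrc_path="~/.bashrc",            # sourced explicitly, since $HOME/.bashrc is no longer sourced automatically
    conda_env="",                       # "" (the new default) skips the conda-related lines in the SLURM script
)
```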
- A new kwarg, `use_srun`, that allows the user to specify whether to use `srun` when running the pickled Python function.
- Added docstring for `sshproxy`.
- A new kwarg, `create_unique_workdir`, that will create unique subfolders of the form `<DISPATCH ID>/node_<NODE ID>` within `remote_workdir` if set to `True`.
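A sketch of the `use_srun` and `create_unique_workdir` kwargs described above, with placeholder connection and workdir values:

```python
from covalent_slurm_plugin import SlurmExecutor

executor = SlurmExecutor(
    address="slurm-login.example.com",        # placeholder host
    username="user",                          # placeholder user
    ssh_key_file="~/.ssh/slurm_key",          # placeholder key path
    remote_workdir="/scratch/user/covalent",  # placeholder working directory
    use_srun=False,              # run the pickled Python function without srun
    create_unique_workdir=True,  # results go under <DISPATCH ID>/node_<NODE ID> within remote_workdir
)
```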
- Fixed a bug where `cleanup = False` would be ignored.
- Fixed a bug where Covalent would crash if `cache_dir` was not present.
- Updated pre-commit hooks
- Moved executor validations out of constructor
- Fixed license CI workflow
- Basic support for NERSC's `sshproxy` tool, which uses MFA to generate SSH keys
- Added instructions to the README for the remote machine's dependencies.
- Automatically apply the `"parsable": ""` option by default if not set by the user.
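A hedged sketch of what this looks like from the user's side, assuming sbatch directives are supplied through an `options` dictionary (the parameter name is an assumption, not stated in this entry):

```python
from covalent_slurm_plugin import SlurmExecutor

executor = SlurmExecutor(
    address="slurm-login.example.com",  # placeholder host
    username="user",                    # placeholder user
    ssh_key_file="~/.ssh/slurm_key",    # placeholder key path
    options={
        "partition": "debug",     # placeholder sbatch options
        "cpus-per-task": "8",
        # "parsable": "" is applied automatically if the user does not set it
    },
)
```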
- Modified executor to use `srun` in the Slurm script, instead of injecting Python code and calling Python directly.
- Added new parameters to `SlurmExecutor` to allow finer control of jobs via options for `srun` and in-script commands (see README.md).
- Added `srun_append` parameter allowing insertion of an intermediate command (see README.md).
- Added `prerun_commands` and `postrun_commands` parameters allowing execution of in-script shell commands before and after the workflow submission via `srun` (see README.md).
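A sketch of the in-script controls described above; `srun_append`, `prerun_commands`, and `postrun_commands` are named in these entries, while the `srun_options` name and all values are assumptions and placeholders (see README.md for the authoritative usage):

```python
from covalent_slurm_plugin import SlurmExecutor

executor = SlurmExecutor(
    address="slurm-login.example.com",  # placeholder host
    username="user",                    # placeholder user
    ssh_key_file="~/.ssh/slurm_key",    # placeholder key path
    srun_options={"ntasks": "1"},                   # assumed name for the per-srun options
    srun_append="nsys profile",                     # intermediate command inserted before the Python call
    prerun_commands=["module load cuda"],           # shell commands run before the srun submission
    postrun_commands=["echo finished >> run.log"],  # shell commands run after the srun submission
)
```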
- Added a new kwarg, `cert_file`, to `SlurmExecutor` that allows for a certificate file to be passed.
- Changed the `_client_connect` function to output only the connection object, since the first positional argument cannot be used.
- Added Alejandro to paul blart group
- Changed `BaseAsyncExecutor` to `AsyncBaseExecutor`
- Updated covalent version to `>=0.202.0,<1`
- Added license workflow
- Enabled Codecov
- `SlurmExecutor` can now be imported directly from `covalent_slurm_plugin`
- Added several debug log statements to track progress when debugging
- `asyncssh` added as a requirement
- Added support for performing cleanup on the remote machine (default is `True`) once execution completes
- Added `slurm_path` for users to provide a path for Slurm commands if they aren't detected automatically (see the sketch below)
- Default values set for some `SlurmExecutor` initialization parameters
- Since there were several ssh calls, the `asyncssh` module is now used for a uniform interface to run ssh commands on the remote machine
- File transfer to and from the remote machine is now done using `scp` instead of `rsync`
- Fixed the `run` method to return only `result` instead of also returning `stdout` and `stderr`, which are now printed directly as appropriate
- Updated tests to reflect the above changes
- Updated `covalent` version to `stable`
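A sketch of the `slurm_path` and remote-cleanup behavior mentioned above, with placeholder values:

```python
from covalent_slurm_plugin import SlurmExecutor

executor = SlurmExecutor(
    address="slurm-login.example.com",  # placeholder host
    username="user",                    # placeholder user
    ssh_key_file="~/.ssh/slurm_key",    # placeholder key path
    slurm_path="/opt/slurm/bin",        # where sbatch/squeue live if they are not detected automatically
    cleanup=True,                       # remove remote job files once execution completes (the default)
)
```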
- Restore `cache_dir` parameter to constructor
- Banner file extension
- Updated readme banner
- Fixed test references to conda
- Slurm executor is now async aware. Internal subprocess calls are now awaited.
- Tests have been updated to reflect above changes.
- New logo to reflect revamp in UI.
- Reverted some changes in slurm.py.
- Handle exceptions correctly
- Workflows are fixed
- Unit tests written and added to the .github workflows.
- The function is deserialized before sending to the remote machine. This allows the remote machine to execute the function in a "vanilla" Python, without needing Covalent to be installed.
- The args and kwargs inputs to the function to be executed are pickled into the same file as the function, for transport to the remote machine.
- Fixed the full local path to where result files were being copied back from the remote machine
- Pass a dictionary to `self.get_status` instead of a `str`
- The Python version on the remote machine only has to match the Python version which created the function to be executed down to the minor version, e.g., matching 3.8 instead of matching 3.8.13.
- Modified `slurm.py` to be compatible with the refactored Covalent codebase.
- Add the `time` module import back to `slurm.py`
- Updated how the Slurm job ID is retrieved from `proc.stdout` using regex
- Changed global variable `executor_plugin_name` -> `EXECUTOR_PLUGIN_NAME` in executors to conform with PEP8.
- Enabled PyPI upload
- Core files for this repo.
- CHANGELOG.md to track changes (this file).
- Semantic versioning in VERSION.
- CI pipeline job to enforce versioning.