In this changelog, we (@ElArkk and @ericmjl) would like to acknowledge contributors who have helped us with anything on the project, big or small.
- 12 August 2022: Fixed `jax` dependency imports, by @aaroncsolomon.
- 23 December 2020: Snuck in a fix for incorrect logger info, by @ericmjl.
- 23 December 2020: Fixed a bug with NaN values in `grad` (issue #94), by @ericmjl.
    - h/t @r-karimi for discovering the bug.
- 14 December 2020: Evotune log format bugfix, by @ericmjl and @ElArkk.
    - Reported by @jmahenriques in issue #93.
- 2 December 2020: Implementation of the embedding layer, by @ElArkk.
    - The initial amino acid (AA) embedding layer is now trainable and of flexible size.
    - Pre-existing and dumped weights are stored in `pkl` format.
    - `get_reps` accepts variable-size mLSTMs (see the sketch after this list).
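  For illustration, a minimal sketch of generating representations with `get_reps`; the default call reflects the documented three-tuple return, while the `params` and `mlstm_size` keyword names in the commented line are assumptions based on this entry rather than confirmed API, so check the current docstring:

  ```python
  from jax_unirep import get_reps

  sequences = ["MKVLITAAG", "MKALIVLGL"]

  # Default usage: the 1900-unit mLSTM from the original UniRep paper.
  h_avg, h_final, c_final = get_reps(sequences)

  # Hypothetical: a smaller mLSTM with custom (e.g. evotuned) weights.
  # The `params` and `mlstm_size` kwarg names are assumptions, not confirmed API.
  # h_avg, h_final, c_final = get_reps(sequences, params=my_params, mlstm_size=256)
  ```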
- 20 November 2020: Major rework of the fitting API, plus bugfixes, by @ericmjl.
    - Custom model architectures can now be passed to `fit` (see the sketch after this list).
    - Refactored many utility functions in the evotuning process for better readability.
    - Fixed an oscillating output bug in `get_reps` (thank you @hhefzi, @tanggis and @hypostulate!).
    - Updated confusing logging statements regarding length and random batching.
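  As an illustrative sketch of the reworked API, assuming the documented `fit(sequences, n_epochs=...)` entry point; the `model_func` keyword in the commented line is an assumption based on this entry, not a confirmed signature:

  ```python
  from jax_unirep import fit

  sequences = ["MKVLITAAG", "MKALIVLGL"]

  # Evotune with the default mLSTM architecture.
  tuned_params = fit(sequences, n_epochs=5)

  # Hypothetical: supply a custom model architecture instead of the default.
  # The `model_func` kwarg name is an assumption based on this changelog entry.
  # tuned_params = fit(sequences, n_epochs=5, model_func=my_model_func)
  ```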
- 29 August 2020: Fixed `setup.py` so that PEP 517 calls such as `pip install .` work, by @konstin.
- 29 August 2020: Require Python 3.6 instead of 3.7, by @konstin.
- 20 April 2020: Fixed a major bug causing negative and NaN losses due to a softmax issue, by @ivanjayapurna.
- 20 April 2020: Overhauled `evotuning.py` (also by @ivanjayapurna), with major changes including:
    - Option to supply an out-of-domain holdout set and print params as training progresses.
    - Evotuning without Optuna, by directly calling the `fit` function.
    - Added an `avg_loss()` function that calculates and writes training and holdout set losses to a log file (the number and length of batches are also calculated and written to the log file).
    - Introduced `epochs_per_print` to periodically calculate losses and dump parameters.
    - Implemented AdamW in JAX and switched the optimizer to AdamW (see the sketch after this list).
    - Added an option to change the number of folds in the Optuna K-fold cross-validation.
    - Updated the `evotuning-prototype.py` example script.
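  For context, AdamW is Adam with decoupled weight decay (Loshchilov & Hutter). Below is an illustrative single-step sketch in JAX, not necessarily the exact implementation that landed in `evotuning.py`:

  ```python
  import jax.numpy as jnp

  def adamw_step(param, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8, wd=1e-2):
      """One AdamW update: Adam moment estimates plus decoupled weight decay."""
      m = b1 * m + (1 - b1) * grad       # first-moment (mean) estimate
      v = b2 * v + (1 - b2) * grad ** 2  # second-moment (variance) estimate
      m_hat = m / (1 - b1 ** t)          # bias correction for step t (1-indexed)
      v_hat = v / (1 - b2 ** t)
      # Weight decay is applied directly to the parameter,
      # not folded into the gradient as in L2-regularized Adam.
      param = param - lr * (m_hat / (jnp.sqrt(v_hat) + eps) + wd * param)
      return param, m, v
  ```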
- 30 March 2020: Code fixes for correctness and readability, plus a parameter dumping function, by @ivanjayapurna.
- 28 June 2020: Improvements to evotuning ergonomics, by @ericmjl.
    - Added a pre-commit configuration.
    - Added an installation script that simplifies installing jax on GPU.
    - Provided backend specification of the device (GPU/CPU); see the sketch after this list.
    - Switched preparation of sequences as input-output pairs to run exclusively on the CPU, for speed.
    - Added ergonomic UI features (progress bars!) that improve the user experience.
    - Added docs on the recommended batch size and its relationship to GPU RAM consumption.
    - Switched from exact calculation of train/holdout loss to an estimated calculation.
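  A minimal sketch of backend specification in JAX, in the spirit of the CPU-only sequence preparation above; `jax.jit`'s `backend` argument selects the platform, though the project's own wrapper may differ:

  ```python
  import jax
  import jax.numpy as jnp

  # Pin a computation to a given platform regardless of the default device;
  # jax.jit's `backend` argument accepts "cpu", "gpu", or "tpu".
  prepare_pairs = jax.jit(lambda seq: (seq[:-1], seq[1:]), backend="cpu")

  inputs, outputs = prepare_pairs(jnp.arange(10))
  ```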
- 9 July 2020: Added a progress bar to the sequence sampler, by @ericmjl.