Releases: BerkeleyLab/fiats
Shorter test-suite execution time
This release updates the test for the new experimental capability for training neural networks. The current unit tests verify convergence for training single-hidden-layer networks using gradient descent with updates averaged across mini-batches of input/output pairs. Future work will include verifying the training of deep neural networks and introducing stochastic gradient descent.
This release replaces the deleted 0.6.1 release. Relative to release 0.6.1, the current release provides
- Nearly an order of magnitude reduction in execution time for the mini-batch training example and for the similar unit test,
- A fix for the training example to invoke the correct object definition function for the involved `trainable_engine_t` object, and
- A fix for the `inference_engine_t` type-bound procedure `to_json()` to eliminate an erroneous trailing comma that led to invalid JSON output for networks with a single hidden layer (illustrated below).
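For context on the `to_json()` fix: standard JSON forbids a trailing comma after the last element of an array, so a single-hidden-layer network whose serialized layer list ended in a comma produced output that JSON parsers reject. The fragments below are purely illustrative; the key names are hypothetical, not the actual fiats schema.

```
{"hidden_layers": [ {"neurons": 4}, ]}   <-- trailing comma after the lone element: invalid JSON
{"hidden_layers": [ {"neurons": 4} ]}    <-- comma removed: valid JSON
```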
What's Changed
Full Changelog: 0.6.1...0.6.2
Experimental training capability
This is the first release with an experimental capability for training neural networks. The current unit tests verify convergence for single-hidden-layer networks using gradient descent with updates averaged across mini-batches of input/output pairs. Future work will include verifying the training of deep neural networks and introducing stochastic gradient descent.
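The following is a minimal sketch of the update rule described above, not the fiats `trainable_engine_t` API: gradients are computed for every input/output pair in a mini-batch, averaged, and applied as a single update. The model (one linear neuron), data, initial weights, and learning rate are all illustrative.

```fortran
! Sketch of gradient descent with updates averaged across a mini-batch,
! fitting a single linear neuron to targets drawn from y = 2x + 1 under a
! mean-squared-error loss.  All values are illustrative.
program minibatch_sketch
  implicit none
  integer, parameter :: batch_size = 4
  real :: x(batch_size) = [0., 1., 2., 3.]
  real :: y(batch_size) = [1., 3., 5., 7.]    ! targets from y = 2x + 1
  real :: w = 0.5, b = 0.0, eta = 0.1         ! initial weight, bias, learning rate
  real :: y_hat(batch_size), dL_dw, dL_db
  integer :: step

  do step = 1, 1000
    y_hat = w*x + b                             ! forward pass over the whole mini-batch
    dL_dw = sum(2.*(y_hat - y)*x) / batch_size  ! gradient of the mean squared error w.r.t. w
    dL_db = sum(2.*(y_hat - y))   / batch_size  ! gradient of the mean squared error w.r.t. b
    w = w - eta*dL_dw                           ! one update per mini-batch, not per pair
    b = b - eta*dL_db
  end do

  print *, "learned weight and bias:", w, b     ! converges toward 2.0 and 1.0
end program minibatch_sketch
```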
What's Changed
- CI: install `gfortran` on macOS by @everythingfunctional in #48
- Encapsulate all output and store unactivated weighted/biased neuron output for training by @rouson in #47 (see the sketch after this list)
- Add activation-function derivative functions by @rouson in #46
- fix(setup.sh): brew install netcdf-fortran by @rouson in #45
- fix(netcdf-interfaces): patch for fpm v >= 0.8 by @rouson in #49
- Test: add test for single-layer perceptron by @rouson in #50
- Add back-propagation by @rouson in #51
- Export everything via one common module by @rouson in #52
- Add nominally complete training algorithm by @rouson in #53
- doc(README): mention training & additional future work by @rouson in #55
- Group input/output pairs into mini-batches by @rouson in #54
- doc(README): mention experimental training feature by @rouson in #56
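Relating #47 and #46 above (the sketch referenced in the list): back-propagation needs the unactivated neuron output because the activation function's derivative is evaluated at that stored value. A minimal, self-contained sketch, not the fiats API, with an assumed sigmoid activation and squared-error loss:

```fortran
! Sketch of why a forward pass stores the unactivated (weighted/biased)
! neuron output z: back-propagation evaluates the activation derivative at z.
module backprop_sketch_m
  implicit none
contains
  elemental function sigmoid(z) result(a)
    real, intent(in) :: z
    real :: a
    a = 1. / (1. + exp(-z))
  end function
  elemental function sigmoid_derivative(z) result(dadz)
    real, intent(in) :: z
    real :: dadz
    dadz = sigmoid(z) * (1. - sigmoid(z))   ! d(sigmoid)/dz expressed via sigmoid(z)
  end function
end module backprop_sketch_m

program backprop_sketch
  use backprop_sketch_m, only : sigmoid, sigmoid_derivative
  implicit none
  real :: w(2) = [0.8, -0.3], b = 0.1      ! one neuron with two inputs (illustrative)
  real :: x(2) = [1., 2.], y = 1.          ! one input/output pair (illustrative)
  real :: z, a, delta

  z = dot_product(w, x) + b                ! unactivated neuron output, kept for training
  a = sigmoid(z)                           ! activated output
  delta = (a - y) * sigmoid_derivative(z)  ! error term for loss 0.5*(a - y)**2 uses the stored z
  print *, "gradient w.r.t. weights:", delta * x
  print *, "gradient w.r.t. bias:   ", delta
end program backprop_sketch
```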
New Contributors
- @everythingfunctional made their first contribution in #48
Full Changelog: 0.5.0...0.6.0
0.5.0
Bug fixes and more stringent unit testing
What's Changed
- More stringent unit test for elemental inference by @rouson in #39
- Fix multi-output network reads from JSON files by @rouson in #40
Full Changelog: 0.4.0...0.4.1
0.4.0 Support networks with skip connections (bypass)
What's Changed
- Update README.md by @rouson in #31
- Feature: add skip-connection network support and corresponding unit tests by @rouson in #32
- More comprehensive unit testing by @rouson in #33
- Feature: Enforce invariant `inference_engine_t` self-consistency by @rouson in #34
- Feature: read metadata, including skip connection specification, if present by @rouson in #38
Full Changelog: 0.2.1...0.4.0
Elemental inference, real kind parameter, & layer/neuron count fix
What's Changed
- Feature: make `infer` generic and add elemental inference procedure by @rouson in #27 (see the sketch after this list)
- Feature: use real kind parameters everywhere by @rouson in #28
- Feature: make file_t constructor elemental by @rouson in #29
- Fix layer_t count_neuron & count_layers type-bound procedures by @rouson in #30
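As referenced in the list above, here is a minimal sketch, not the fiats API, of what an elemental inference procedure enables: a procedure written for scalar arguments can be applied element-wise to arrays of inputs. The module, function name, and `tanh` activation are illustrative.

```fortran
! Sketch of Fortran elemental semantics: the same scalar procedure applies
! element-wise to arrays, which is convenient for batched inference.
module elemental_sketch_m
  implicit none
contains
  elemental function activate(x) result(y)
    real, intent(in) :: x
    real :: y
    y = tanh(x)
  end function
end module elemental_sketch_m

program elemental_demo
  use elemental_sketch_m, only : activate
  implicit none
  real :: batch(4) = [-2., -1., 1., 2.]
  print *, activate(0.5)     ! scalar call
  print *, activate(batch)   ! same procedure applied to every array element
end program elemental_demo
```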
Full Changelog: 0.2.0...0.3.0
Fix layer_t/neuron_t count_layers and count_neurons type-bound procedures
What's Changed
- Feature: make `infer` generic and add elemental inference procedure by @rouson in #27
- Feature: use real kind parameters everywhere by @rouson in #28
- Feature: make file_t constructor elemental by @rouson in #29
- Fix layer_t count_neuron & count_layers type-bound procedures by @rouson in #30
Full Changelog: 0.2.0...0.2.1
0.2.0 JSON file read/write capability, more accurate default inference strategy
0.1.2 Bug fix and documentation edits
0.1.1
What's Changed
- Prepare open-source release by @rouson in #18
  - Add LICENSE.txt file with copyright notice and license agreement
  - Add statement referring to the license at the top of each source file
  - Add build instructions to the README.md
  - Add a basic `ford` project file
  - Set up the CI to post the `ford` documentation to GitHub Pages
- Add asymmetric network test by @rouson in #19
  - The new test uses a network that encodes a two-input/one-output digital logic circuit that performs operations equivalent to "XOR AND input-2" (see the truth table after this list).
- Fix asymmetric-network test of `matmul`-based inference by @rouson in #20
  - Addresses an issue that the new asymmetric test exposed.
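Reading the description above as output = (input-1 XOR input-2) AND input-2, the asymmetric target function that the test network encodes is:

| input-1 | input-2 | output |
|--------:|--------:|-------:|
| 0 | 0 | 0 |
| 0 | 1 | 1 |
| 1 | 0 | 0 |
| 1 | 1 | 0 |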
Full Changelog: 0.1.0...0.1.1