
Releases: BerkeleyLab/fiats

Shorter test-suite execution time

25 May 19:28
6e0ea8f

This release updates the unit tests for the new experimental capability for training neural networks. The current unit tests verify convergence when training single-hidden-layer networks using gradient descent with updates averaged across mini-batches of input/output pairs. Future work will include verifying the training of deep neural networks and introducing stochastic gradient descent.
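
The training scheme the tests exercise — full forward/backward passes with weight updates averaged over a mini-batch — can be sketched outside the library. The following is a minimal NumPy illustration of that idea for a single-hidden-layer network on XOR data; it is not the Fiats `trainable_engine_t` API, and all names and hyperparameters here are illustrative.

```python
import numpy as np

# Illustrative sketch (not the Fiats API): gradient-descent training of a
# single-hidden-layer network, with updates averaged across a mini-batch.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy mini-batch: the four XOR input/output pairs.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # hidden layer (8 neurons)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # output layer
eta, n = 1.0, len(X)

for step in range(20000):
    # Forward pass over the whole mini-batch at once.
    a1 = sigmoid(X @ W1 + b1)
    a2 = sigmoid(a1 @ W2 + b2)
    loss = float(((a2 - Y) ** 2).mean())
    if step == 0:
        loss_first = loss
    # Backward pass; dividing by the batch size n averages the updates.
    d2 = (a2 - Y) * a2 * (1.0 - a2)
    d1 = (d2 @ W2.T) * a1 * (1.0 - a1)
    W2 -= eta * (a1.T @ d2) / n; b2 -= eta * d2.mean(axis=0)
    W1 -= eta * (X.T @ d1) / n;  b1 -= eta * d1.mean(axis=0)

print(f"MSE: {loss_first:.3f} -> {loss:.3f}")
```

Averaging over the batch (rather than summing) keeps the effective learning rate independent of the batch size, which is one common convention for mini-batch updates.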

This release replaces the deleted 0.6.1 release. Relative to release 0.6.1, the current release provides

  1. Nearly an order-of-magnitude reduction in execution time for the mini-batch training example and the corresponding unit test,
  2. A fix so the training example invokes the correct object-definition function for its trainable_engine_t object, and
  3. A fix for the inference_engine_t type-bound procedure to_json() to eliminate an erroneous trailing comma that produced invalid JSON output for networks with a single hidden layer.
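
The trailing-comma defect in item 3 is a common pitfall of hand-rolled serialization. A minimal Python sketch of how such a bug arises and one idiomatic fix — the function and field names here are hypothetical, not the actual to_json() internals:

```python
import json

def layers_to_json_buggy(layers):
    # Hypothetical sketch: appending "," after every element also emits one
    # after the final element, producing e.g. '[{...},]' -- invalid JSON.
    out = "["
    for i, layer in enumerate(layers):
        out += json.dumps(layer)
        if i < len(layers):      # off-by-one: should be len(layers) - 1
            out += ","
    return out + "]"

def layers_to_json_fixed(layers):
    # Joining pre-serialized elements sidesteps last-comma bookkeeping.
    return "[" + ",".join(json.dumps(layer) for layer in layers) + "]"

single = [{"neurons": 3}]                 # one hidden layer
print(layers_to_json_buggy(single))       # ends in ",]" -- json.loads fails
print(layers_to_json_fixed(single))       # valid JSON
```

Trailing commas are forbidden in JSON arrays and objects (RFC 8259), so a strict parser rejects the buggy output even though some lenient readers accept it.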

What's Changed

  • Fix json output, actually shorten test-suite runtime by @rouson in #58

Full Changelog: 0.6.1...0.6.2

Experimental training capability

24 May 23:23
ed0afa5

This is the first release with an experimental capability for training neural networks. The current unit tests verify convergence for single-hidden-layer networks using gradient descent with updates averaged across mini-batches of input/output pairs. Future work will include verifying the training of deep neural networks and introducing stochastic gradient descent.

What's Changed

  • CI: install gfortran on macOS by @everythingfunctional in #48
  • Encapsulate all output and store unactivated weighted/biased neuron output for training by @rouson in #47
  • Add activation-function derivative functions by @rouson in #46
  • fix(setup.sh): brew install netcdf-fortran by @rouson in #45
  • fix(netcdf-interfaces): patch for fpm v >= 0.8 by @rouson in #49
  • Test: add test for single-layer perceptron by @rouson in #50
  • Add back-propagation by @rouson in #51
  • Export everything via one common module by @rouson in #52
  • Add nominally complete training algorithm by @rouson in #53
  • doc(README): mention training & additional future work by @rouson in #55
  • Group input/output pairs into mini-batches by @rouson in #54
  • doc(README): mention experimental training feature by @rouson in #56

Full Changelog: 0.5.0...0.6.0

0.5.0

20 Feb 01:04
bef79dc

What's Changed

  • Documentation: link to nexport and ICAR in README.md by @rouson in #41
  • Extensible metadata, swish activation function, explicit inference strategy, update documentation by @rouson in #42
  • Fix JSON reader and writer by @rouson in #43

Full Changelog: 0.4.1...0.5.0

Bug fixes and more stringent unit testing

15 Feb 01:00
cd9ad56

What's Changed

  • More stringent unit test for elemental inference by @rouson in #39
  • Fix multi-output network reads from JSON files by @rouson in #40

Full Changelog: 0.4.0...0.4.1

0.4.0 Support networks with skip connections (bypass)

14 Feb 07:32
9a1095e

What's Changed

  • Update README.md by @rouson in #31
  • Feature: add skip-connection network support and corresponding unit tests by @rouson in #32
  • More comprehensive unit testing by @rouson in #33
  • Feature: Enforce invariant inference_engine_t self-consistency by @rouson in #34
  • Feature: read metadata, including skip connection specification, if present by @rouson in #38

Full Changelog: 0.2.1...0.4.0

Elemental inference, real kind parameter, & layer/neuron count fix

03 Feb 07:09
59ac14b

What's Changed

  • Feature: make infer generic and add elemental inference procedure by @rouson in #27
  • Feature: use real kind parameters everywhere by @rouson in #28
  • Feature: make file_t constructor elemental by @rouson in #29
  • Fix layer_t count_neuron & count_layers type-bound procedures by @rouson in #30

Full Changelog: 0.2.0...0.3.0

Fix layer_t/neuron_t count_layers and count_neurons type-bound procedures

03 Feb 05:36
59ac14b

What's Changed

  • Feature: make infer generic and add elemental inference procedure by @rouson in #27
  • Feature: use real kind parameters everywhere by @rouson in #28
  • Feature: make file_t constructor elemental by @rouson in #29
  • Fix layer_t count_neuron & count_layers type-bound procedures by @rouson in #30

Full Changelog: 0.2.0...0.2.1

0.2.0 JSON file read/write capability, more accurate default inference strategy

23 Jan 00:24
790cff0

What's Changed

Full Changelog: 0.1.2...0.2.0

0.1.2 Bug fix and documentation edits

02 Dec 08:41
08bd468

What's Changed

  • fix(README.md): correct typo, eliminate redundancy by @rouson in #21
  • Fix handling of example-program input arguments in run-fpm.sh script by @rouson in #22
  • Increment version number to 0.1.2 by @rouson in #23

Full Changelog: 0.1.1...0.1.2

0.1.1

29 Nov 03:53
6addc82

What's Changed

  • Prepare open-source release by @rouson in #18
    • Add LICENSE.txt file with copyright notice and license agreement
    • Add statement referring to the license at the top of each source file
    • Add build instructions to the README.md
    • Add a basic ford project file
    • Set up the CI to post the ford documentation to GitHub Pages
  • Add asymmetric network test by @rouson in #19
    • The new test uses a network that encodes a two-input/one-output digital logic circuit that performs operations equivalent to "XOR AND input-2".
  • Fix asymmetric-network test of matmul-based inference by @rouson in #20
    • Addresses an issue that the new asymmetric test exposed.
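
The "XOR AND input-2" circuit from PR #19 is small enough to spell out. A plain-Python truth table, independent of the library, shows why it makes a useful asymmetric test case:

```python
# Truth table for the two-input/one-output circuit "XOR AND input-2"
# used by the asymmetric-network test (PR #19): out = (a XOR b) AND b.
def circuit(a, b):
    return (a ^ b) & b

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", circuit(a, b))
```

Only the input pair (0, 1) yields 1, so the target function is not symmetric in its inputs; a network that accidentally treats the two inputs interchangeably cannot fit it, which is what made this test effective at exposing the issue fixed in PR #20.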

Full Changelog: 0.1.0...0.1.1