Releases: modern-fortran/neural-fortran
neural-fortran-0.10.0
This release adds get_num_params, get_params, and set_params methods to the network and layer derived types, allowing you to more easily get and set network parameters from custom Fortran code or other libraries. See the example to learn how it works.
Thanks to Christopher Zapart @jvo203 for this feature contribution.
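Below is a minimal sketch of how the new accessors might be used. The nf module and the input/dense layer constructors follow the README; the exact signatures of the new methods are assumed from their names and may differ.

```fortran
program get_set_params_sketch
  ! Sketch only: read all network parameters into a flat array, modify them,
  ! and write them back. Assumes the nf module and the input/dense layer
  ! constructors from the README; exact signatures may differ.
  use nf, only: network, input, dense
  implicit none

  type(network) :: net
  real, allocatable :: params(:)

  ! A small fully connected network: 3 inputs -> 5 hidden -> 2 outputs
  net = network([input(3), dense(5), dense(2)])

  print '(a, i0)', 'Number of parameters: ', net % get_num_params()

  ! All weights and biases, flattened into a single rank-1 array
  params = net % get_params()

  ! Illustration: nudge every parameter, then load the values back in
  params = params + 1e-3
  call net % set_params(params)
end program get_set_params_sketch
```

A flat parameter vector like this makes it easier to drive the network from an external optimizer or to checkpoint weights outside of neural-fortran.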
What's Changed
- Fix CMake build by @milancurcic in #110
- Get set network parameters by @milancurcic in #111
Full Changelog: v0.9.0...v0.10.0
neural-fortran-0.9.0
This release introduces the backward passes for the conv2d and maxpool2d layers and enables the training of convolutional networks.
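With the backward passes in place, a convolutional network can be trained end to end. Here is a minimal sketch of a single training step; the layer constructors follow the README, while the forward/backward/update calls are assumptions based on the repository's training examples and their signatures may differ.

```fortran
program cnn_training_sketch
  ! Sketch only: one gradient-descent step on a small convolutional network.
  ! Layer constructors follow the README; the forward/backward/update calls
  ! are assumptions based on the repository examples and may differ.
  use nf, only: network, input, conv2d, maxpool2d, flatten, dense
  implicit none

  type(network) :: net
  real :: image(1, 28, 28)   ! single-channel 28 x 28 input
  real :: label(10)          ! one-hot encoded target

  net = network([ &
    input([1, 28, 28]), &
    conv2d(filters=8, kernel_size=3, activation='relu'), &
    maxpool2d(pool_size=2), &
    flatten(), &
    dense(10) &
  ])

  call random_number(image)
  label = 0
  label(3) = 1

  ! Forward pass, backward pass (gradient computation), and parameter update
  call net % forward(image)
  call net % backward(label)
  call net % update(1e-3)
end program cnn_training_sketch
```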
What's Changed
- CNN backward pass by @milancurcic in #99
Full Changelog: v0.8.0...v0.9.0
neural-fortran-0.8.0
This release introduces the reshape layer for connecting rank-1 layers to rank-3 layers, including the capability to read Keras's Reshape layer from h5 files.
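For example, a flat input layer can now feed a convolutional stack by placing a reshape layer between them. This is a sketch assuming the constructor names from the README; exact signatures may differ.

```fortran
program reshape_layer_sketch
  ! Sketch only: bridge a rank-1 input to rank-3 convolutional layers with
  ! the new reshape layer. Constructor names follow the README and may differ.
  use nf, only: network, input, reshape, conv2d, maxpool2d, flatten, dense
  implicit none

  type(network) :: net

  ! The flat 784-element input is reshaped to a 1 x 28 x 28 image before
  ! it reaches the convolutional layers, mirroring Keras's Reshape layer.
  net = network([ &
    input(784), &
    reshape([1, 28, 28]), &
    conv2d(filters=8, kernel_size=3, activation='relu'), &
    maxpool2d(pool_size=2), &
    flatten(), &
    dense(10) &
  ])
end program reshape_layer_sketch
```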
What's Changed
- Bump version in fpm.toml by @milancurcic in #96
- Implement the reshape layer by @milancurcic in #97
Full Changelog: v0.7.0...v0.8.0
neural-fortran-0.7.0
What's Changed
- CI by @milancurcic in #87
- Batch inference by @milancurcic in #90
- Rename output -> predict for consistency with Keras by @milancurcic in #92
- Update README by @milancurcic in #93
- Fix accidental TOC reorder from a previous PR by @milancurcic in #94
Full Changelog: v0.6.0...v0.7.0
neural-fortran-0.6.0
What's Changed
- Update fpm instructions in light of the required HDF5 dependency by @milancurcic in #80
- Add CITATION.cff by @milancurcic in #81
- fixed a memory leak when reading the JSON by @jacobwilliams in #83
- Update contributors list by @milancurcic in #84
- Read Conv2D, MaxPooling2D, and Flatten layers from Keras by @milancurcic in #85
New Contributors
- @jacobwilliams made their first contribution in #83
Full Changelog: v0.5.0...v0.6.0
neural-fortran-0.5.0
What's Changed
- Support for loading Keras models by @milancurcic in #79. Many thanks to @scivision for a complete CMake overhaul and for assisting with adding h5fortran as a dependency. This is an experimental and minimally tested feature that supports only input and dense layers saved in a Keras HDF5 file. See the mnist_from_keras example to see how it works; a brief sketch follows this list.
- HDF5, h5fortran, and json-fortran are now required dependencies. You must provide HDF5 to the build system yourself; the latter two are handled automatically.
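A sketch of loading a Keras model, assuming the network constructor accepts the path to a Keras HDF5 file as in the mnist_from_keras example; the file name here is hypothetical and the exact entry point may differ.

```fortran
program keras_load_sketch
  ! Sketch only: construct a network from a Keras HDF5 file. Assumes the
  ! network constructor accepts a file path, as in the mnist_from_keras
  ! example; the file name is hypothetical.
  use nf, only: network
  implicit none

  type(network) :: net

  ! Only input and dense layers are supported in this release.
  net = network('my_keras_model.h5')
end program keras_load_sketch
```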
Full Changelog: v0.4.0...v0.5.0
neural-fortran-0.4.0
What's Changed
- Add note about downloading MNIST data without curl by @milancurcic in #67
- Forward pass for the conv2d layer by @milancurcic in #65
- Forward pass for a max-pooling layer by @milancurcic in #66
- Fix layers summary table by @milancurcic in #70
- fix #72 - use dir tree to expose user API by @rouson in #74
- Clean up example and add a note to emphasize the user API by @milancurcic in #76
- Fix CMake build for the new src directory structure by @milancurcic in #77
- Implement a flatten layer by @milancurcic in #75
Full Changelog: v0.3.0...v0.4.0
neural-fortran-0.3.0
neural-fortran-0.2.0
What's Changed
- Refactor: move procedure definitions to submodules by @rouson in #51
- Delete CAF C preprocessor macro by @rouson in #52
- Download mnist.tar.gz if it is missing by @rouson in #55
- Remove instructions about downloading MNIST data by @milancurcic in #56
- Workaround for submodule-related bug in GFortran-9 by @milancurcic in #59
Full Changelog: v0.1.0...v0.2.0
neural-fortran-0.1.0
This release is a snapshot of neural-fortran before the refactor to submodules (see #51), which raises the minimum required compiler versions.