This project contains a set of examples that demonstrate the use of the SameDiff API. SameDiff is our automatic differentiation / deep learning framework. It uses a graph-based (define-then-run) approach, similar to TensorFlow's graph mode; eager execution (in the style of TensorFlow 2.x eager mode or PyTorch) is planned. SameDiff can import TensorFlow frozen models in the .pb (protobuf) format; import for ONNX, TensorFlow SavedModel and Keras models is planned. Note that Deeplearning4j also has full SameDiff support for easily writing custom layers and loss functions. Examples of importing TF models can be found here
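To give a feel for the define-then-run style before diving into the examples, here is a minimal sketch of building and executing a tiny SameDiff graph. The variable names and values are illustrative only; see Ex1_SameDiff_Basics.java for the full walkthrough.

```java
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class SameDiffBasicsSketch {
    public static void main(String[] args) {
        // Step 1: define the graph symbolically (no computation happens here)
        SameDiff sd = SameDiff.create();
        SDVariable x = sd.var("x", Nd4j.ones(DataType.FLOAT, 2, 2));
        SDVariable y = x.mul(2.0).add(1.0);   // y = 2*x + 1

        // Step 2: run the graph (forward pass); every element of y is 3.0
        INDArray out = y.eval();
        System.out.println(out);
    }
}
```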
Note that neural networks can also be built using the higher-level MultiLayerNetwork and ComputationGraph DL4J APIs, as described here
Go back to the main repository page to explore other features/functionality of the Eclipse Deeplearning4J ecosystem. File an issue here to request new features.
The examples in this project and what they demonstrate are briefly described below. This is also the recommended order to explore them in.
- Ex1_SameDiff_Basics.java: The SameDiff class, variables, functions and the forward pass
- Ex2_LinearRegression.java: Placeholders, forward pass and gradient calculations on a simple linear regression graph
- Ex3_Variables.java: Alternate ways to create variables
- MNISTFeedforward.java: Create, train, evaluate, save and load a basic feedforward network using SameDiff
- MNISTCNN.java: The same as the above, but with a CNN
- CustomListenerExample.java: Implement a basic custom listener that records the values of two variables for comparison or printing later
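The placeholder-and-gradient workflow covered by Ex2_LinearRegression can be sketched roughly as follows. This is a hedged outline, not the example's actual code: the variable names, shapes and random data below are illustrative assumptions.

```java
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

import java.util.HashMap;
import java.util.Map;

public class LinearRegressionSketch {
    public static void main(String[] args) {
        SameDiff sd = SameDiff.create();

        // Placeholders: shapes with -1 allow a variable minibatch size
        SDVariable x = sd.placeHolder("x", DataType.FLOAT, -1, 3);
        SDVariable label = sd.placeHolder("label", DataType.FLOAT, -1, 1);

        // Trainable parameters of the linear model: pred = x*w + b
        SDVariable w = sd.var("w", Nd4j.rand(DataType.FLOAT, 3, 1));
        SDVariable b = sd.var("b", Nd4j.zeros(DataType.FLOAT, 1));
        SDVariable pred = x.mmul(w).add("pred", b);

        // Mean squared error loss, built from basic ops
        SDVariable diff = pred.sub(label);
        SDVariable loss = diff.mul(diff).mean();
        sd.setLossVariables(loss);

        // Supply placeholder values, then run forward and backward
        Map<String, INDArray> ph = new HashMap<>();
        ph.put("x", Nd4j.rand(DataType.FLOAT, 4, 3));
        ph.put("label", Nd4j.rand(DataType.FLOAT, 4, 1));

        INDArray predictions = pred.eval(ph);                       // forward pass
        Map<String, INDArray> grads = sd.calculateGradients(ph, "w", "b"); // gradients

        System.out.println(predictions);
        System.out.println(grads.get("w"));
    }
}
```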
DL4J has supported custom layers for a long time. However, using SameDiff layers has some advantages described here.
- Ex1BasicSameDiffLayerExample.java Implement a custom DL4J layer using SameDiff.
- Ex2LambdaLayer.java Implement a simple custom DL4J lambda layer using SameDiff.
- Ex3LambdaVertex.java Implement a simple custom DL4J lambda vertex using SameDiff.
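As a taste of the lambda-layer approach, here is a minimal sketch of a parameterless DL4J lambda layer defined with SameDiff ops. The class name and the specific function it computes are illustrative assumptions; see Ex2LambdaLayer.java for the worked example.

```java
import org.deeplearning4j.nn.conf.layers.samediff.SameDiffLambdaLayer;
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;

// A lambda layer has no trainable parameters of its own: the layer's
// behaviour is defined entirely by the SameDiff ops in defineLayer.
public class ScaleShiftLambdaLayer extends SameDiffLambdaLayer {
    @Override
    public SDVariable defineLayer(SameDiff sd, SDVariable layerInput) {
        // out = 2*x + 1, applied elementwise to the layer input.
        // Gradients are derived automatically by SameDiff.
        return layerInput.mul(2.0).add(1.0);
    }
}
```

Such a layer can then be dropped into a MultiLayerNetwork or ComputationGraph configuration like any other DL4J layer.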