Releases · chainer/onnx-chainer
v1.2.2a3
v1.1.1a2
v1.1.1a1
This is the release of onnx-chainer v1.1.1a1. This version supports ONNX v1.1.1 and has been tested with MXNet 1.2.0 and TVM at commit ebdde3c277a9807a67b233cecfaf6d9f96c0c1bc.
Main Update
- Refactored `export.py` and switched to `chainer.FunctionHook` to catch the arguments of every function called in a network being converted to ONNX.
- Added two compatibility tests with other frameworks. The exported ONNX files of VGG16 and ResNet50 written in Chainer are now confirmed to produce the same outputs after importing them with MXNet and NNVM/TVM.
- We added two examples of
v1.0.0a1
v0.2.1b4
v0.2.1b3
ONNX-Chainer
This is an add-on package for ONNX support by Chainer.
Requirements
- onnx==0.2.1
- chainer>=3.1.0
Installation
See INSTALL.md
Quick Start
```python
import numpy as np
import chainer.links as L
import onnx_chainer

model = L.VGG16Layers()

# Pseudo input
x = np.zeros((1, 3, 224, 224), dtype=np.float32)

onnx_chainer.export(model, x, filename='VGG16.onnx')
```
Supported Functions
Currently, 50 Chainer functions can be exported to the ONNX format.
Activation
- ELU
- HardSigmoid
- LeakyReLU
- LogSoftmax
- PReLUFunction
- ReLU
- Sigmoid
- Softmax
- Softplus
- Tanh
Array
Connection
- Convolution2DFunction
- ConvolutionND
- Deconvolution2DFunction
- DeconvolutionND
- EmbedIDFunction *3
- LinearFunction
Math
- Add
- Absolute
- Div
- Mul
- Neg
- PowVarConst
- Sub
- Clip
- Exp
- Identity
- MatMul *4
- Maximum
- Minimum
- Sqrt
- SquaredDifference
- Sum
Noise
- Dropout *5
Pooling
- AveragePooling2D
- AveragePoolingND
- MaxPooling2D
- MaxPoolingND
Normalization
- BatchNormalization
- FixedBatchNormalization
- LocalResponseNormalization
*1: mode should be either 'constant', 'reflect', or 'edge'
*2: ONNX doesn't support multiple constant values for the Pad operation
*3: Current ONNX doesn't support ignore_label for EmbedID
*4: Current ONNX doesn't support transpose options for matmul ops
*5: In test mode, dropout layers are not included in the exported file
v0.2.1b1
Added many functions. Now it supports:
Activation
- ELU
- HardSigmoid
- LeakyReLU
- LogSoftmax
- PReLUFunction
- ReLU
- Sigmoid
- Softmax
- Softplus
- Tanh
Array
- Cast
- Concat
- Depth2Space
- Pad
- Reshape
- Space2Depth
- SplitAxis
- Squeeze
- Tile
- Transpose
Connection
- Convolution2DFunction
- LinearFunction
Pooling
- AveragePooling2D
- MaxPooling2D
Normalization
- BatchNormalization
- FixedBatchNormalization
Math
- Add
- Sub
- Mul
- Neg
- Absolute
- Div