Releases · google/jaxonnxruntime
v0.3.0
- Create a call_torch API to make PyTorch a frontend for JAX. The call_torch API performs two tasks:
  (1) Convert a torch nn.Module or TorchScript model to ONNX format via torch.onnx.export.
  (2) Use jaxonnxruntime to convert the ONNX model into a JAX function.
  We hook up all the necessary components to make life easier for PyTorch end users; the input format follows the torch.onnx.export API (a usage sketch is shown after this list).
  See the tutorial: https://github.com/google/jaxonnxruntime/blob/main/docs/experimental_call_torch_tutorial.ipynb
- Improve the tests to cover more corner cases and make them more robust.
- Support more models and tests from the ONNX backend test suite. See the full list of supported models here: https://github.com/google/jaxonnxruntime/blob/6c2880f5e13beb9038017745eec253cb7731cd31/tests/onnx_models_test.py#L117-L159
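Below is a minimal usage sketch of the call_torch flow described above. The import path jaxonnxruntime.experimental.call_torch and the exact signature and return convention of call_torch.call_torch are assumptions taken from the linked tutorial; consult the notebook for the authoritative API.

```python
# Minimal sketch, not the authoritative API: the import path and the
# (function, params) return convention are assumptions based on the
# call_torch tutorial; check the linked notebook for exact details.
import torch
import jax.numpy as jnp

from jaxonnxruntime.experimental import call_torch  # assumed import path


class TinyModel(torch.nn.Module):

  def forward(self, x):
    return torch.nn.functional.relu(x) + 1.0


torch_model = TinyModel()
# Inputs follow the torch.onnx.export convention (a tuple of example args).
example_args = (torch.randn(2, 3),)

# Steps (1) + (2): export to ONNX internally, then build a pure JAX function.
jax_fn, jax_params = call_torch.call_torch(torch_model, example_args)

# Call the resulting JAX function on JAX arrays.
outputs = jax_fn(jax_params, [jnp.ones((2, 3), dtype=jnp.float32)])
```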
v0.2.0: Support more ops and ONNX Model Zoo models
We bump the version in version.py to "0.2.0".
Changes include:
- Add support for more ONNX ops and ONNX Model Zoo models (42).
- Run the Meta LLaMA model through the end-to-end flow (a sketch of the generic flow is shown after this list).
- Add documentation on how to contribute a new op.
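As a companion to the items above, here is a minimal sketch of the generic ONNX-to-JAX flow used to run Model Zoo models end to end. The entry point call_onnx.call_onnx_model, its signature, and its (function, params) return values are assumptions; see the repository tests and docs for the exact API.

```python
# Minimal sketch under assumptions: call_onnx.call_onnx_model and its
# (function, params) return convention may differ in detail from the
# released API; treat this as illustrative only.
import onnx
import jax.numpy as jnp

from jaxonnxruntime import call_onnx  # assumed import path

# Load an ONNX model, e.g. one downloaded from the ONNX Model Zoo.
onnx_model = onnx.load('model.onnx')
dummy_inputs = [jnp.ones((1, 3, 224, 224), dtype=jnp.float32)]

# Trace the ONNX graph into a JAX function plus its parameters (weights).
model_fn, model_params = call_onnx.call_onnx_model(onnx_model, dummy_inputs)

# The result is a pure JAX callable, so it can also be wrapped in jax.jit.
outputs = model_fn(model_params, dummy_inputs)
```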
v0.1.0
This version passes all 'real' model tests provided by ONNX.