
Releases: google/jaxonnxruntime

v0.3.0

18 Aug 18:00
  1. Create a call_torch API that makes PyTorch a frontend for JAX. The call_torch API performs two tasks:
    (1) Convert a torch nn.Module or TorchScript module to the ONNX format via torch.onnx.export.
    (2) Use jaxonnxruntime to convert the ONNX model into a JAX function.
    We hook up all the necessary components to make life easier for PyTorch end users. The input format follows the torch.onnx.export API; a minimal sketch of this flow appears after this list.

See https://github.com/google/jaxonnxruntime/blob/main/docs/experimental_call_torch_tutorial.ipynb for a full tutorial.

  2. Improve the tests to cover more corner cases and make them more robust.

  3. Support more models and tests from the ONNX backend test suite. See the full list of supported models at https://github.com/google/jaxonnxruntime/blob/6c2880f5e13beb9038017745eec253cb7731cd31/tests/onnx_models_test.py#L117-L159
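
The following minimal sketch illustrates the two-step flow that call_torch wraps: export a toy nn.Module to ONNX with torch.onnx.export, then convert the resulting ONNX model into a JAX function with jaxonnxruntime. The toy model and the call_onnx.call_onnx_model call shown here are illustrative assumptions; consult the tutorial linked above for the exact call_torch API.

    # Minimal sketch of the two-step flow that call_torch wraps.
    # NOTE: call_onnx.call_onnx_model and its signature are assumed here for
    # illustration; see the linked tutorial for the exact call_torch API.
    import io

    import jax.numpy as jnp
    import onnx
    import torch

    from jaxonnxruntime import call_onnx  # assumed module path


    class TinyModel(torch.nn.Module):  # toy model for illustration
      def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 2)

      def forward(self, x):
        return torch.relu(self.linear(x))


    torch_model = TinyModel().eval()
    example_input = torch.randn(1, 4)

    # Step 1: export the nn.Module to ONNX; inputs follow the torch.onnx.export format.
    buffer = io.BytesIO()
    torch.onnx.export(torch_model, (example_input,), buffer)
    onnx_model = onnx.load_model_from_string(buffer.getvalue())

    # Step 2: convert the ONNX model into a JAX function and run it.
    jax_inputs = [jnp.asarray(example_input.numpy())]
    jax_fn, model_params = call_onnx.call_onnx_model(onnx_model, jax_inputs)
    outputs = jax_fn(model_params, jax_inputs)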

v0.2.0: Support more ops and ONNX Model Zoo models

25 Jul 15:38

We bump version.py to "0.2.0".

Changes include:

  1. Add support for more ONNX ops and 42 ONNX Model Zoo models (see the sketch after this list).
  2. Run the Meta LLaMA model through the end-to-end flow.
  3. Add documentation on how to contribute a new op.
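
As a hedged usage sketch (not part of the release notes themselves), the snippet below shows how a Model Zoo model might be run through jaxonnxruntime. The "mnist" hub model, its input shape, and the call_onnx.call_onnx_model signature are illustrative assumptions.

    # Hedged sketch: run an ONNX Model Zoo model with jaxonnxruntime.
    # The "mnist" hub model, the input shape, and the call_onnx API shown here
    # are illustrative assumptions, not details taken from the release notes.
    import jax.numpy as jnp
    import numpy as np
    import onnx.hub

    from jaxonnxruntime import call_onnx  # assumed module path

    onnx_model = onnx.hub.load("mnist")  # fetch a model from the ONNX Model Zoo
    dummy_input = [jnp.asarray(np.random.rand(1, 1, 28, 28).astype(np.float32))]
    model_fn, model_params = call_onnx.call_onnx_model(onnx_model, dummy_input)
    logits = model_fn(model_params, dummy_input)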

v0.1.0

20 Jun 22:58
Pre-release

This version passes all "real" model tests provided by ONNX.