This repository houses a collection of popular machine learning models written in the Ivy framework.
Code written in Ivy is compatible with PyTorch, TensorFlow, JAX and NumPy, so these models can be integrated into a working pipeline built on any of these standard ML frameworks.
The purpose of this repository is to provide reference Ivy implementations of common machine learning models, and to demonstrate how to write custom models in Ivy.
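As a taste of what a custom model looks like, here is a minimal sketch using Ivy's stateful `ivy.Module` API (the tiny architecture and layer sizes are illustrative assumptions, not a model from this repository):

```python
import ivy

class TinyClassifier(ivy.Module):
    """Illustrative two-layer classifier; not part of ivy_models."""

    def __init__(self, input_dim=784, hidden_dim=64, num_classes=10):
        # Define sub-layers before initialising the base Module,
        # so Ivy can collect their variables.
        self.linear0 = ivy.Linear(input_dim, hidden_dim)
        self.linear1 = ivy.Linear(hidden_dim, num_classes)
        ivy.Module.__init__(self)

    def _forward(self, x):
        x = ivy.relu(self.linear0(x))
        return ivy.softmax(self.linear1(x))
```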
Check out our demos to see these models in action. In particular, the UNet and AlexNet demos show models from this repository in use.
The models can be loaded with pretrained weights; we have tests to ensure that our models give the same outputs as the reference implementations. Models can also be initialised with random weights by passing `pretrained=False` to the loading function.
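For example, a randomly initialised AlexNet can be loaded as follows (a minimal sketch, assuming the `alexnet` loader from the quickstart below accepts the keyword directly):

```python
from ivy_models import alexnet

# Random weights instead of the pretrained ones:
model = alexnet(pretrained=False)
```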
To learn more about Ivy, check out unify.ai, our Docs, and our GitHub.
```bash
git clone https://github.com/unifyai/models
cd models
pip install .
pip install -r requirements.txt  # not redundant: installs the latest ivy code, which is a dependency 😄
```
```python
import ivy
from ivy_models import alexnet

ivy.set_backend("torch")
model = alexnet()
```
The pretrained AlexNet model is now ready to use and is compatible with any other PyTorch code. See this demo for more details.
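For instance, the model can be dropped straight into PyTorch code like so (a minimal sketch: the 224x224 ImageNet-style input shape and the output structure are assumptions on our part):

```python
import torch

# Dummy batch of one RGB image; with the torch backend set above,
# the model consumes and produces torch-native tensors.
img = torch.randn(1, 3, 224, 224)
logits = model(img)
print(logits.shape)  # expected: a (1, num_classes) batch of class scores
```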
The models are contained in the `ivy_models` folder.
The functions that automatically load the pretrained weights are found at the end of each model's `model_name.py` file; some models are available in multiple sizes.
The layers are sometimes kept in a separate file, usually named `layers.py`.
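As an illustration, a typical model directory might look like this (a hypothetical layout; the exact files vary per model):

```
ivy_models/
└── alexnet/
    ├── alexnet.py   # model definition, with the weight-loading function at the end
    └── layers.py    # custom layers, when kept in a separate file
```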
Off-the-shelf models for a variety of domains.
Ivy Libraries
There are a host of derived libraries written in Ivy, in the areas of mechanics, 3D vision, robotics, gym environments, neural memory, pre-trained models + implementations, and builder tools with trainers, data loaders and more. Click on the icons below to learn more!
```bibtex
@article{lenton2021ivy,
  title={Ivy: Templated deep learning for inter-framework portability},
  author={Lenton, Daniel and Pardo, Fabio and Falck, Fabian and James, Stephen and Clark, Ronald},
  journal={arXiv preprint arXiv:2102.02886},
  year={2021}
}
```