- California, USA
- https://www.linkedin.com/in/dheerajperi/
Stars
Cosmos is a world model development platform consisting of world foundation models, tokenizers, and a video processing pipeline to accelerate the development of Physical AI at Robotics & AV labs. C…
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
TAO Toolkit deep learning networks with PyTorch backend
functorch provides JAX-like composable function transforms for PyTorch.
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (V…
LeViT: a Vision Transformer in ConvNet's Clothing for Faster Inference
Visualizer for neural network, deep learning and machine learning models
CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
PyTorch C++ implementations of several deep learning papers
Motion Retargeting Video Subjects
NVIDIA's Deep Imagination Team's PyTorch Library
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
PyTorch implementation of SwAV: https://arxiv.org/abs/2006.09882
PyTorch implementation of the REMIND method from our ECCV-2020 paper "REMIND Your Neural Network to Prevent Catastrophic Forgetting"
Usable implementation of "Bootstrap Your Own Latent" self-supervised learning, from DeepMind, in PyTorch
[CVPR 2021] VirTex: Learning Visual Representations from Textual Annotations
A toolkit to optimize Keras and TensorFlow ML models for deployment, including quantization and pruning.
The Triton Inference Server provides an optimized cloud and edge inferencing solution.
Official repository for the "Big Transfer (BiT): General Visual Representation Learning" paper.
PyTorch implementation of Contrastive Learning methods
PyTorch implementation of MoCo: https://arxiv.org/abs/1911.05722