Lists (6)
Audio
All repos connected to audio processing, like STT/TTS and so on.
Graphs
and more
Interpretability
Research dedicated to figuring out how language models work internally: what weights store what information, etc.
NLP
NLP, chatbots and more
RL
All repos that use RL.
Web Parsing
Web page classification, element classification, boilerplate removal, etc.

Starred repositories
The official code repo and data hub of the top_nsigma sampling strategy for LLMs.
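The core idea commonly attributed to top-nsigma sampling is to keep only tokens whose logits lie within n standard deviations of the maximum logit before sampling. A minimal NumPy sketch of that idea follows; the function name, defaults, and exact thresholding rule are illustrative assumptions, and the official repo's formulation may differ in detail.

```python
import numpy as np

def top_n_sigma_sample(logits, n=1.0, temperature=1.0, rng=None):
    """Sample a token id, keeping only logits within n standard
    deviations of the maximum logit (illustrative sketch; the exact
    rule in the official top_nsigma repo may differ)."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature
    # Threshold: max logit minus n times the std of all logits.
    threshold = logits.max() - n * logits.std()
    # Mask out everything below the threshold.
    masked = np.where(logits >= threshold, logits, -np.inf)
    # Softmax over the surviving tokens only.
    probs = np.exp(masked - masked.max())
    probs /= probs.sum()
    return int(rng.choice(len(logits), p=probs))
```

With a sharply peaked distribution the filter collapses to greedy decoding (only the top token survives the cutoff), while a flat distribution keeps many candidates — the appeal of a sigma-based cutoff is that it adapts to the spread of the logits rather than using a fixed top-k or top-p budget.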
LLM Frontend for Power Users.
Code repository for "The Clock and the Pizza: Two Stories in Mechanistic Explanation of Neural Networks"
[NeurIPS 2024 Spotlight] Code and data for the paper "Finding Transformer Circuits with Edge Pruning".
An open-source deep research clone: an AI agent that reasons over large amounts of web data extracted with Firecrawl.
The hub for EleutherAI's work on interpretability and learning dynamics
Magic to turn Cursor/Windsurf into 90% of Devin.
SLOP detector and analyzer, based on a dictionary, for ShareGPT JSON and text.
Unofficial implementation of Titans, SOTA memory for transformers, in Pytorch
Arrakis is a library to conduct, track and visualize mechanistic interpretability experiments.
Dockerized FastAPI wrapper for Kokoro-82M text-to-speech model w/CPU ONNX and NVIDIA GPU PyTorch support, handling, and auto-stitching
Use PEFT or Full-parameter to finetune 450+ LLMs (Qwen2.5, InternLM3, GLM4, Llama3.3, Mistral, Yi1.5, Baichuan2, DeepSeek-R1, ...) and 150+ MLLMs (Qwen2.5-VL, Qwen2-Audio, Llama3.2-Vision, Llava, I…
An OAI compatible exllamav2 API that's both lightweight and fast
Convert PDF to markdown + JSON quickly with high accuracy
Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs
Convenient wrapper for fine-tuning and inference of Large Language Models (LLMs) with several quantization techniques (GPTQ, bitsandbytes)
A fast inference library for running LLMs locally on modern consumer-class GPUs
Modeling, training, eval, and inference code for OLMo
[ICLR 2025] Monet: Mixture of Monosemantic Experts for Transformers