Interactively explore safetensors and onnx networks in your CLI.
### Get nnli 🎉
**From Cargo**

```bash
cargo install nnli
```
**From GitHub**

```bash
git clone https://github.com/drbh/nnli.git
cd nnli
cargo install --path .
```
**Check version**

```bash
nnli --version
```
### Print a local model

```bash
nnli print --path <PATH TO MODEL FILE OR MODEL ID>

# if the model is in your HF cache
nnli print --path microsoft/Phi-3-mini-4k-instruct

# when there is more than one cached revision, specify the revision
nnli print --path microsoft/Phi-3-mini-4k-instruct@d269012bea6fbe38ce7752c8940fea010eea3383

# or pass the full path to the snapshot
nnli print --path ~/.cache/huggingface/hub/models--microsoft--Phi-3-mini-4k-instruct/snapshots/d269012bea6fbe38ce7752c8940fea010eea3383/
```
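If you want to locate the cached snapshot directory yourself, the Hugging Face cache follows a predictable layout: `models--{org}--{name}/snapshots/{revision}`. A minimal sketch of deriving that path, assuming the default cache location (the `model` and `revision` variables here are illustrative, not nnli options):

```bash
# Build the HF cache snapshot path from a model id and revision.
# bash string substitution turns "/" in the model id into "--",
# matching the on-disk directory naming convention.
model="microsoft/Phi-3-mini-4k-instruct"
revision="d269012bea6fbe38ce7752c8940fea010eea3383"
path="$HOME/.cache/huggingface/hub/models--${model//\//--}/snapshots/${revision}"
echo "$path"
```

The resulting directory, if it exists in your cache, can be passed directly to `nnli print --path`.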
This app is a work in progress, and there is a lot of room for improvement on both the code and user experience (UX) fronts.
### Features

- read onnx models via `candle-onnx`
- display nodes in tui via `ratatui`
- extract and display node details in pane
- improve color scheme and UI layout
- improve navigation
- upload to crates.io
- better install instructions
- build releases
- improve details output to show all relevant data
- highlight I/O of node on left
- add command to show only unique operations
- add commands to see other useful stats
- support safetensors files
- support file or directory input as `--path`