# 2. Build neural network inference library
At the time of writing, there are four back-ends available for running the neural network inference:
| Back-end           | GPU Support         | CPU Support | Inference Speed | Effort to install |
| ------------------ | ------------------- | ----------- | --------------- | ----------------- |
| TensorRT (default) | ✔️                  | ❌          | 🔥🔥🔥🔥        |                   |
| OpenVino           | (✔️) not tested yet | ✔️          | 🔥🔥🔥          |                   |
| MXNet              | ✔️                  | ✔️          | 🔥🔥            |                   |
| Torch              | ✔️                  | ✔️          | 🔥              |                   |
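
The table above is only a comparison; the back-end itself is chosen when configuring the CrazyAra build, and typically only the library belonging to the selected back-end needs to be installed beforehand. The sketch below illustrates what such a configure step can look like. The directory layout and the `-DBACKEND_*` option names are assumptions for illustration only, so check the project's `CMakeLists.txt` and the following build-instruction pages for the actual option names and required library paths.

```bash
# Illustrative sketch only: selecting an inference back-end at configure time.
# NOTE: the -DBACKEND_* option names and the directory layout are assumptions,
# not taken from this wiki; look up the real CMake options in CMakeLists.txt.

git clone https://github.com/QueensGambit/CrazyAra.git
cd CrazyAra/engine && mkdir -p build && cd build

# TensorRT back-end (default, GPU only)
cmake -DCMAKE_BUILD_TYPE=Release -DBACKEND_TENSORRT=ON ..

# Alternatively, e.g. the MXNet back-end (GPU or CPU):
# cmake -DCMAKE_BUILD_TYPE=Release -DBACKEND_MXNET=ON ..

make -j"$(nproc)"
```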