- 1. Description
- 2. Current Support Platform
- 3. Pretrained Model
- 4. Convert to RKNN
- 5. Python Demo
- 6. Android Demo
- 7. Linux Demo
- 8. Expected Results
## 1. Description

The model used in this example comes from the following open source project:

https://github.com/airockchip/YOLOX
## 2. Current Support Platform

RK3566, RK3568, RK3588, RK3562, RK1808, RV1109, RV1126
## 3. Pretrained Model

Download link:

Download with shell command:

```shell
cd model
./download_model.sh
```
Note: The model provided here is an optimized model, which is different from the official original model. Take yolox_s.onnx as an example to show the difference between them.
- The comparison of their output information is as follows. The left is the official original model, and the right is the optimized model. As shown in the figure, the original single output is split into three outputs.
- We remove the subgraph following the three concat nodes in the model and keep the outputs of these three concat nodes ([1,85,80,80], [1,85,40,40], [1,85,20,20]).
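The subgraph removed after the concat nodes essentially corresponds to YOLOX's grid decoding, so post-processing has to reproduce it on the three raw outputs. Below is a minimal NumPy sketch of that decode (illustrative only, not this demo's actual code; it assumes the standard YOLOX scheme `xy = (pred_xy + grid) * stride`, `wh = exp(pred_wh) * stride`, and that the objectness/class channels are already sigmoid-activated in the exported graph):

```python
import numpy as np

def decode_yolox_output(out, stride):
    """Decode one raw YOLOX head output of shape [1, 85, H, W].

    Channels: 0-3 box (x, y, w, h), 4 objectness, 5-84 class scores
    (assumed already sigmoid-activated, as in a standard YOLOX export).
    Returns centers/sizes in input-image pixels plus the score tensors.
    """
    _, c, h, w = out.shape
    # Grid of cell indices for this feature map, raveled in the same
    # row-major (y*W + x) order as reshape() flattens the spatial dims.
    gy, gx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    pred = out[0].reshape(c, -1)                # [85, H*W]
    grid = np.stack([gx.ravel(), gy.ravel()])   # [2, H*W]

    cxcy = (pred[0:2] + grid) * stride          # box centers in pixels
    wh = np.exp(pred[2:4]) * stride             # box sizes in pixels
    obj = pred[4]                               # objectness, [H*W]
    cls = pred[5:]                              # class scores, [80, H*W]
    return cxcy, wh, obj, cls

# e.g. the [1,85,80,80] output corresponds to stride 8 on a 640x640 input
dummy = np.zeros((1, 85, 80, 80), dtype=np.float32)
cxcy, wh, obj, cls = decode_yolox_output(dummy, stride=8)
```

The same function applies to the [1,85,40,40] and [1,85,20,20] outputs with strides 16 and 32.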
## 4. Convert to RKNN

Usage:

```shell
cd python
python convert.py <onnx_model> <TARGET_PLATFORM> <dtype(optional)> <output_rknn_path(optional)>

# such as:
python convert.py ../model/yolox_s.onnx rk3588
# output model will be saved as ../model/yolox.rknn
```
Description:

- `<onnx_model>`: Specify the ONNX model path.
- `<TARGET_PLATFORM>`: Specify the NPU platform name. For supported platforms refer [here](#2-current-support-platform).
- `<dtype>` (optional): Specify as `i8`, `u8` or `fp`. `i8`/`u8` for doing quantization, `fp` for no quantization. Default is `i8`.
- `<output_rknn_path>` (optional): Specify the save path for the RKNN model; by default it is saved in the same directory as the ONNX model with the name `yolox.rknn`.
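For a rough intuition of what the `i8`/`u8` dtypes mean, here is a minimal sketch of asymmetric affine quantization in NumPy (illustrative only, not RKNN-Toolkit2's exact scheme or calibration procedure):

```python
import numpy as np

def quantize(x, signed=True):
    """Affine-quantize a float tensor to int8/uint8 range.

    A rough illustration of the i8/u8 options above, not RKNN's
    actual quantization algorithm.
    """
    qmin, qmax = (-128, 127) if signed else (0, 255)
    scale = (x.max() - x.min()) / (qmax - qmin)
    zero_point = int(round(qmin - x.min() / scale))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax)
    return q.astype(np.int64), scale, zero_point

def dequantize(q, scale, zero_point):
    return (q - zero_point) * scale

x = np.linspace(-1.0, 1.0, 11)
q, s, zp = quantize(x, signed=True)   # i8-style quantization
x_hat = dequantize(q, s, zp)          # reconstruction error <= one step
```

`fp` skips this step entirely and keeps floating-point weights, trading NPU speed for accuracy.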
## 5. Python Demo

Usage:

```shell
cd python
# Inference with PyTorch model or ONNX model
python yolox.py --model_path <pt_model/onnx_model> --img_show

# Inference with RKNN model
python yolox.py --model_path <rknn_model> --target <TARGET_PLATFORM> --img_show
```
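YOLOX models are normally fed a letterboxed input: an aspect-preserving resize padded to the network size. A hedged NumPy sketch of that preprocessing step (the function name and the pad value 114 are the usual YOLOX defaults, assumed rather than taken from `yolox.py`, which would typically use `cv2.resize`):

```python
import numpy as np

def letterbox(img, dst_size=(640, 640), pad_value=114):
    """Aspect-preserving resize plus padding, as commonly used for YOLOX.

    img: HxWx3 uint8 array. Returns the padded image and the scale
    factor needed to map detections back onto the original image.
    """
    dst_h, dst_w = dst_size
    h, w = img.shape[:2]
    scale = min(dst_h / h, dst_w / w)
    new_h, new_w = int(h * scale), int(w * scale)
    # Nearest-neighbour resize via index arrays (cv2.resize in real code).
    ys = (np.arange(new_h) / scale).astype(int).clip(0, h - 1)
    xs = (np.arange(new_w) / scale).astype(int).clip(0, w - 1)
    resized = img[ys][:, xs]
    # Pad the bottom/right region with the constant pad value.
    out = np.full((dst_h, dst_w, 3), pad_value, dtype=img.dtype)
    out[:new_h, :new_w] = resized
    return out, scale

padded, scale = letterbox(np.zeros((480, 640, 3), dtype=np.uint8))
```

Detected boxes are then divided by `scale` to map them back to original-image coordinates.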
Description:

- `<TARGET_PLATFORM>`: Specify the NPU platform name. For supported platforms refer [here](#2-current-support-platform).
- `<pt_model / onnx_model / rknn_model>`: Specify the model path.
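After decoding the raw outputs, post-processing of this kind typically filters boxes by score and applies non-maximum suppression. A minimal greedy NMS sketch in NumPy (illustrative, not the demo's actual implementation; the 0.45 IoU threshold is a common YOLOX default, assumed here):

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.45):
    """Greedy non-maximum suppression.

    boxes: [N, 4] as (x1, y1, x2, y2); scores: [N].
    Returns indices of kept boxes, highest score first.
    """
    order = scores.argsort()[::-1]   # descending by score
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        if order.size == 1:
            break
        rest = order[1:]
        # Intersection rectangle between box i and all remaining boxes.
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        order = rest[iou <= iou_thresh]  # drop heavily overlapping boxes
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [20, 20, 30, 30]], dtype=float)
kept = nms(boxes, np.array([0.9, 0.8, 0.7]))
```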
## 6. Android Demo

Note: RK1808, RV1109, RV1126 do not support Android.

Please refer to the Compilation_Environment_Setup_Guide document to set up a cross-compilation environment and complete the compilation of the C/C++ demo.

Note: Please replace the model name with `yolox`.
With the device connected via USB port, push demo files to the device:

```shell
adb root
adb remount
adb push install/<TARGET_PLATFORM>_android_<ARCH>/rknn_yolox_demo/ /data/
```

Then run the demo on the device:

```shell
adb shell
cd /data/rknn_yolox_demo
export LD_LIBRARY_PATH=./lib
./rknn_yolox_demo model/yolox.rknn model/bus.jpg
```
- After running, the result is saved as `result.png`. To check the result on the host PC, pull it back with the following command:

  ```shell
  adb pull /data/rknn_yolox_demo/result.png
  ```

- For the output result, refer to [Expected Results](#8-expected-results).
## 7. Linux Demo

Please refer to the Compilation_Environment_Setup_Guide document to set up a cross-compilation environment and complete the compilation of the C/C++ demo.

Note: Please replace the model name with `yolox`.
- If the device is connected via USB port, push demo files to the device:

  ```shell
  adb push install/<TARGET_PLATFORM>_linux_<ARCH>/rknn_yolox_demo/ /userdata/
  ```

- For other boards, use `scp` or other approaches to push all files under `install/<TARGET_PLATFORM>_linux_<ARCH>/rknn_yolox_demo/` to `userdata`.
Then run the demo on the device:

```shell
adb shell
cd /userdata/rknn_yolox_demo
export LD_LIBRARY_PATH=./lib
./rknn_yolox_demo model/yolox.rknn model/bus.jpg
```
- After running, the result is saved as `result.png`. To check the result on the host PC, pull it back with the following command:

  ```shell
  adb pull /userdata/rknn_yolox_demo/result.png
  ```

- For the output result, refer to [Expected Results](#8-expected-results).
## 8. Expected Results

This example will print the labels and corresponding scores of the test image detection results, as follows:

```
bus @ (91 138 549 427) 0.933
person @ (105 234 220 538) 0.901
person @ (210 241 284 506) 0.878
person @ (475 236 560 520) 0.824
person @ (79 327 118 518) 0.508
```

- Note: Different platforms and different versions of tools and drivers may produce slightly different results.
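If you want to consume these detections programmatically, lines in the `label @ (x1 y1 x2 y2) score` format shown above are easy to parse (a sketch; the format is inferred only from the sample output above):

```python
import re

# label @ (x1 y1 x2 y2) score  -- as printed by the demo above
LINE = re.compile(
    r"^(?P<label>.+?) @ \((?P<x1>\d+) (?P<y1>\d+) (?P<x2>\d+) (?P<y2>\d+)\) "
    r"(?P<score>[0-9.]+)$"
)

def parse_detection(line):
    """Return (label, (x1, y1, x2, y2), score), or None if no match."""
    m = LINE.match(line.strip())
    if m is None:
        return None
    box = tuple(int(m[k]) for k in ("x1", "y1", "x2", "y2"))
    return m["label"], box, float(m["score"])

label, box, score = parse_detection("bus @ (91 138 549 427) 0.933")
```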