Part 1: PT to ONNX
1. First, clone the RKNN-modified ultralytics project to your local machine:
https://github.com/airockchip/ultralytics_yolov8
cd ultralytics-main
pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple
pip install -e .
The fork mainly modifies two source files: ultralytics/nn/modules/head.py and ultralytics/engine/exporter.py.
2. Use the modified ultralytics to export the pt model. Note that format=rknn here means the export is prepared for the subsequent RKNN conversion; do not use format=onnx. Be careful!
yolo export model=/your_path/best.pt format=rknn
Part 2: ONNX to RKNN
1. Set up the Toolkit environment on a linux x86 machine. The detailed steps are at the following link:
https://docs.radxa.com/rock5/rock5b/app-development/rknn_install
Related links:
# Download the RKNN-Toolkit2 repository
https://github.com/airockchip/rknn-toolkit2.git
# Download the RKNN Model Zoo repository
https://github.com/airockchip/rknn_model_zoo.git
2. Model conversion can only be done with rknn-toolkit2, so the Linux system must be linux x86. Also pay attention to the Python version when setting up the environment:
- RKNN-Toolkit2 is not compatible with [RKNN-Toolkit](https://github.com/airockchip/rknn-toolkit)
- Currently only support on:
- Ubuntu 18.04 python 3.6/3.7
- Ubuntu 20.04 python 3.8/3.9
- Ubuntu 22.04 python 3.10/3.11
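As a quick sanity check against the compatibility list above, a small stdlib-only sketch can verify your interpreter before you install anything (the SUPPORTED table and is_supported helper are illustrative, not part of RKNN-Toolkit2):

```python
import sys

# Supported Python versions per Ubuntu release, taken from the list above
SUPPORTED = {
    "18.04": [(3, 6), (3, 7)],
    "20.04": [(3, 8), (3, 9)],
    "22.04": [(3, 10), (3, 11)],
}

def is_supported(ubuntu_version: str, py_version: tuple) -> bool:
    """Return True if the (major, minor) Python version is listed for that Ubuntu release."""
    return py_version[:2] in SUPPORTED.get(ubuntu_version, [])

# Example: check the interpreter you are currently running
print(is_supported("20.04", sys.version_info[:2]))
```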
After the environment is configured, you can check whether the installation succeeded by running:
python
import rknn
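The import check above can also be wrapped in a small helper that reports the result instead of crashing when the package is missing (rknn_installed is a hypothetical name, not part of the toolkit):

```python
def rknn_installed() -> bool:
    """Return True if rknn-toolkit2 is importable in the current environment."""
    try:
        from rknn.api import RKNN  # noqa: F401  # the class used for conversion below
        return True
    except ImportError:
        return False

if __name__ == "__main__":
    print("rknn-toolkit2 installed:", rknn_installed())
```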
3. Then convert the model using the YOLOv8 convert.py from rknn_model_zoo:
from rknn.api import RKNN

if __name__ == '__main__':
    # ONNX model path
    model_path = "/home/zhangh/RKNN_Docker/ultralytics_yolov8-main/ultralytics_yolov8-main/yolov8n.onnx"
    # RKNN export path
    output_path = "/home/zhangh/RKNN_Docker/ultralytics_yolov8-main/ultralytics_yolov8-main/weight/yolov8n.rknn"

    # Create RKNN object
    rknn = RKNN(verbose=False)

    # Pre-process config
    print('--> Config model')
    rknn.config(mean_values=[[0, 0, 0]], std_values=[[255, 255, 255]], target_platform='rk3588')
    print('done')

    # Load model
    print('--> Loading model')
    ret = rknn.load_onnx(model=model_path)
    if ret != 0:
        print('Load model failed!')
        exit(ret)
    print('done')

    # Build model
    print('--> Building model')
    ret = rknn.build(do_quantization=False)
    if ret != 0:
        print('Build model failed!')
        exit(ret)
    print('done')

    # Export rknn model
    print('--> Export rknn model')
    ret = rknn.export_rknn(output_path)
    if ret != 0:
        print('Export rknn model failed!')
        exit(ret)
    print('done')

    # Release
    rknn.release()
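The script above hardcodes both file paths and the target platform. A small command-line wrapper could make it reusable across models; parse_args and all of its flag names here are a hypothetical sketch, not the actual interface of rknn_model_zoo's convert.py:

```python
import argparse

def parse_args(argv=None):
    """Parse conversion options (hypothetical wrapper around the script above)."""
    p = argparse.ArgumentParser(description="Convert an ONNX model to RKNN")
    p.add_argument("model_path", help="input .onnx file")
    p.add_argument("output_path", help="output .rknn file")
    p.add_argument("--target", default="rk3588", help="target_platform for rknn.config")
    p.add_argument("--quantize", action="store_true",
                   help="pass do_quantization=True to rknn.build")
    return p.parse_args(argv)

if __name__ == "__main__":
    args = parse_args()
    print(args.model_path, "->", args.output_path, "for", args.target)
```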
Visualize the rknn model