This is a simple demo to run yolov3/yolov4 with SophonSDK.
- Initialize SophonSDK first: please refer to the SophonSDK Tutorial.
- Remember to use your own anchors, masks and number-of-classes config values in `cpp/yolov3.hpp` and `python/configs/*.yml`.
- When deploying on an arm SoC (SE/SM), remember to set the environment variables.
- For an INT8 bmodel, do not forget the scale factors for the input and output tensors (see the sketch below).
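As an illustration of the last point, here is a minimal NumPy sketch of applying the scale factors; it assumes a preprocessed float32 input, uses the scale values listed for the INT8 model in the table further below, and all tensor names and shapes here are placeholders rather than code from this repo:

```python
import numpy as np

# Placeholder: preprocessed image in float32, NCHW, normalized the same way
# as for the FP32 model.
preprocessed = np.random.rand(1, 3, 608, 608).astype(np.float32)

# Input scale from the INT8 bmodel info (e.g. 127.986 for
# yolov4_608_coco_int8_1b.bmodel in the table below): quantize before inference.
input_scale = 127.986
int8_input = np.clip(np.round(preprocessed * input_scale), -128, 127).astype(np.int8)

# Output scale (e.g. 0.0078125): if the raw output comes back in the quantized
# domain, multiply by the output scale before the usual YOLO post-processing.
output_scale = 0.0078125
raw_output = np.zeros((1, 255, 19, 19), dtype=np.float32)  # placeholder inference result
dequantized = raw_output * output_scale
```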
Use `scripts/0_prepare_test_data.sh` to download the test images and videos into `data/`:

```bash
cd scripts
bash ./0_prepare_test_data.sh
```
The bmodel must be compiled in the SophonSDK dev docker.

Use `scripts/download.sh` to download `yolov4.weights` and `yolov4.cfg`, then use `scripts/gen_fp32bmodel.sh` to generate an FP32 bmodel from the `.cfg` and `.weights` files; the resulting `.bmodel` will be saved in `../data/models/`:

```bash
cd scripts
bash ./download.sh
bash ./gen_fp32bmodel.sh
```
Use `bmnetd` in the SophonSDK dev docker to generate other FP32 bmodels.
Follow the instructions in the Quantization-Tools User Guide to generate an INT8 bmodel. The typical steps are:
- use `ufwio.io` to generate an LMDB from images
- use `bmnetd --mode=GenUmodel` to generate an FP32 umodel from the `.cfg` and `.weights` files
- use `calibration_use_pb quantize` to generate an INT8 umodel from the FP32 umodel
- use `bmnetu` to generate an INT8 bmodel from the INT8 umodel
Several bmodels converted from darknet yolov3/yolov4 models trained on MS COCO are provided. Download the bmodels from here and put them in the `data/models/` directory.
| Model file | Input | Output | Anchors and masks |
|---|---|---|---|
| yolov3_416_coco_fp32_1b.bmodel | input: data, [1, 3, 416, 416], float32, scale: 1 | output: Yolo0, [1, 255, 13, 13], float32, scale: 1<br>output: Yolo1, [1, 255, 26, 26], float32, scale: 1<br>output: Yolo2, [1, 255, 52, 52], float32, scale: 1 | YOLO_MASKS: [6, 7, 8, 3, 4, 5, 0, 1, 2]<br>YOLO_ANCHORS: [10, 13, 16, 30, 33, 23, 30, 61, 62, 45, 59, 119, 116, 90, 156, 198, 373, 326] |
| yolov3_608_coco_fp32_1b.bmodel | input: data, [1, 3, 608, 608], float32, scale: 1 | output: Yolo0, [1, 255, 19, 19], float32, scale: 1<br>output: Yolo1, [1, 255, 38, 38], float32, scale: 1<br>output: Yolo2, [1, 255, 76, 76], float32, scale: 1 | YOLO_MASKS: [6, 7, 8, 3, 4, 5, 0, 1, 2]<br>YOLO_ANCHORS: [10, 13, 16, 30, 33, 23, 30, 61, 62, 45, 59, 119, 116, 90, 156, 198, 373, 326] |
| yolov4_416_coco_fp32_1b.bmodel | input: data, [1, 3, 416, 416], float32, scale: 1 | output: Yolo0, [1, 255, 52, 52], float32, scale: 1<br>output: Yolo1, [1, 255, 26, 26], float32, scale: 1<br>output: Yolo2, [1, 255, 13, 13], float32, scale: 1 | YOLO_MASKS: [0, 1, 2, 3, 4, 5, 6, 7, 8]<br>YOLO_ANCHORS: [12, 16, 19, 36, 40, 28, 36, 75, 76, 55, 72, 146, 142, 110, 192, 243, 459, 401] |
| yolov4_608_coco_fp32_1b.bmodel | input: data, [1, 3, 608, 608], float32, scale: 1 | output: Yolo0, [1, 255, 76, 76], float32, scale: 1<br>output: Yolo1, [1, 255, 38, 38], float32, scale: 1<br>output: Yolo2, [1, 255, 19, 19], float32, scale: 1 | YOLO_MASKS: [0, 1, 2, 3, 4, 5, 6, 7, 8]<br>YOLO_ANCHORS: [12, 16, 19, 36, 40, 28, 36, 75, 76, 55, 72, 146, 142, 110, 192, 243, 459, 401] |
| yolov4_608_coco_int8_1b.bmodel | input: data, [1, 3, 608, 608], int8, scale: 127.986 | output: Yolo0, [1, 255, 76, 76], float32, scale: 0.0078125<br>output: Yolo1, [1, 255, 38, 38], float32, scale: 0.0078125<br>output: Yolo2, [1, 255, 19, 19], float32, scale: 0.0078125 | YOLO_MASKS: [0, 1, 2, 3, 4, 5, 6, 7, 8]<br>YOLO_ANCHORS: [12, 16, 19, 36, 40, 28, 36, 75, 76, 55, 72, 146, 142, 110, 192, 243, 459, 401] |
Notes:
`cpp/yolov3.hpp` uses `{12, 16, 19, 36, 40, 28, 36, 75, 76, 55, 72, 146, 142, 110, 192, 243, 459, 401}` as the default anchors and 80 as the default number of classes. Remember to modify these values for your own model.
For more detailed instructions, refer to yolov3.v4-HowTO.pdf.
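The 255 output channels in the table above follow the usual YOLO layout: 3 anchors per scale × (4 box coordinates + 1 objectness + 80 class scores). The NumPy sketch below shows how one output feature map can be decoded with the anchors and masks from the table; it is illustrative only, the helper and variable names are not part of this repo's code, and it assumes the mask list is ordered to match Yolo0, Yolo1, Yolo2:

```python
import numpy as np

def decode_yolo_output(feature_map, anchors, masks, num_classes=80, net_size=416):
    """Decode one YOLO output tensor of shape [1, 3*(5+num_classes), H, W]."""
    _, _, grid_h, grid_w = feature_map.shape
    stride = net_size / grid_w
    # [1, 3*(5+C), H, W] -> [3, H, W, 5+C]
    preds = feature_map.reshape(3, 5 + num_classes, grid_h, grid_w).transpose(0, 2, 3, 1)

    # Grid-cell offsets for the box centers
    grid_x, grid_y = np.meshgrid(np.arange(grid_w), np.arange(grid_h))
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

    boxes = []
    for i, mask in enumerate(masks):
        aw, ah = anchors[2 * mask], anchors[2 * mask + 1]
        cx = (sigmoid(preds[i, ..., 0]) + grid_x) * stride   # box center x in pixels
        cy = (sigmoid(preds[i, ..., 1]) + grid_y) * stride   # box center y in pixels
        w = np.exp(preds[i, ..., 2]) * aw                    # box width in pixels
        h = np.exp(preds[i, ..., 3]) * ah                    # box height in pixels
        obj = sigmoid(preds[i, ..., 4])                      # objectness score
        cls = sigmoid(preds[i, ..., 5:])                     # per-class scores
        boxes.append((cx, cy, w, h, obj, cls))
    return boxes

# Example with the yolov3 416 values from the table above; Yolo0 (13x13) uses the
# first three masks [6, 7, 8].
YOLO_MASKS = [6, 7, 8, 3, 4, 5, 0, 1, 2]
YOLO_ANCHORS = [10, 13, 16, 30, 33, 23, 30, 61, 62, 45,
                59, 119, 116, 90, 156, 198, 373, 326]
yolo0 = np.zeros((1, 255, 13, 13), dtype=np.float32)  # placeholder output tensor
decoded = decode_yolo_output(yolo0, YOLO_ANCHORS, YOLO_MASKS[0:3], net_size=416)
```

Confidence thresholding and NMS would then be applied to the decoded boxes as usual.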
- Compile the application in the SophonSDK dev docker:

```bash
$ cd cpp/cpp_cv_bmcv_bmrt_postprocess
$ make -f Makefile.pcie # will generate yolo_test.pcie
```

- Then put `yolo_test.pcie` and the `data` directory on a PCIe host with a BM1684 card:

```bash
$ realpath ../../data/images/* > imagelist.txt
$ ./yolo_test.pcie image imagelist.txt ../../data/models/yolov4_416_coco_fp32_1b.bmodel 4 0
# USAGE:
# ./yolo_test.pcie image <image list> <bmodel file> <test count> <device id>
# ./yolo_test.pcie video <video list> <bmodel file> <test count> <device id>
```
- Compile the application in the SophonSDK dev docker:

```bash
$ cd cpp/cpp_cv_bmcv_bmrt_postprocess
$ make -f Makefile.arm # will generate yolo_test.arm
```

- Then put `yolo_test.arm` and the `data` directory on the SoC:

```bash
$ realpath ../../data/images/* > imagelist.txt
$ ./yolo_test.arm image imagelist.txt ../../data/models/yolov4_416_coco_fp32_1b.bmodel 4 0
# USAGE:
# ./yolo_test.arm image <image list> <bmodel file> <test count> <device id>
# ./yolo_test.arm video <video list> <bmodel file> <test count> <device id>
```
Notes: For the Python code, create your own `*.yml` config file in `configs` and set `ENGINE_FILE`, `LABEL_FILE`, `YOLO_MASKS`, `YOLO_ANCHORS` and `OUTPUT_TENSOR_CHANNELS` to the values for your model.
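For example, such a config could be loaded like this; a minimal sketch assuming the `pyyaml` and `easydict` packages listed in the requirements, with a placeholder file name and top-level keys, so check the existing files in `configs` for the exact format:

```python
import yaml
from easydict import EasyDict

# Load a YAML config (placeholder file name) into an attribute-style dict.
with open("configs/my_model.yml") as f:
    cfg = EasyDict(yaml.safe_load(f))

# Keys the demo expects, filled with your own model's values:
print(cfg.ENGINE_FILE)             # path to the .bmodel
print(cfg.LABEL_FILE)              # path to the class-name file
print(cfg.YOLO_MASKS)              # e.g. [6, 7, 8, 3, 4, 5, 0, 1, 2]
print(cfg.YOLO_ANCHORS)            # 18 anchor values
print(cfg.OUTPUT_TENSOR_CHANNELS)  # e.g. 255 for 80 classes
```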
- For x86 with PCIe cards, the environment variables are set when you `source envsetup_pcie.sh`; you need to install SAIL:

```bash
# for example, python3.7 in docker
cd /workspace/lib/sail/python3/pcie/py37
pip3 install sophon-<x.y.z>-py3-none-any.whl
```

You also need to install the other requirements:

```bash
# for example, python3.7 in docker
pip3 install easydict
```
- For the arm SoC, you need to set the environment variables:

```bash
# set the environment variables
export PATH=$PATH:/system/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/system/lib/:/system/usr/lib/aarch64-linux-gnu
export PYTHONPATH=$PYTHONPATH:/system/lib
```

You probably need to install NumPy first, then you can use OpenCV and SAIL:

```bash
# for Debian 9, please specify numpy version 1.17.2
sudo apt update
sudo apt-get install python3-pip
sudo pip3 install numpy==1.17.2
```

You also need to install the other requirements:

```bash
sudo apt-get install -y libjpeg-dev zlib1g-dev
sudo pip3 install Pillow pyyaml easydict
```
Finally, you need to copy the prepared `data/images`, `data/videos` and `data/models` from the SophonSDK dev docker to the `${yolov34}/data` directory.
```bash
$ cd python
$ python3 main.py # default: --cfgfile=configs/yolov3_416.yml --input=../data/images/person.jpg
#$ python3 main.py --cfgfile=<config file> --input=<image file path>
#$ python3 main.py --help # show help info
```