The Hailo Model Zoo provides pre-trained models for high-performance deep learning applications. Using the Hailo Model Zoo you can measure the full-precision accuracy of each model, measure the quantized accuracy using the Hailo Emulator, and measure the accuracy on the Hailo-8 device. Finally, you can generate the Hailo Executable Format (HEF) binary file to speed up development and build high-quality applications accelerated by Hailo-8. The Hailo Model Zoo also provides retraining instructions for training the models on custom datasets, as well as models that were trained for specific use-cases on internal datasets.
Changelog
v2.0
- Updated to use Dataflow Compiler v3.16 (developer-zone) with TF version 2.5, which requires CUDA 11.2
- Updated to use HailoRT 4.6 (developer-zone)
- Retraining Dockers - each retraining docker has a corresponding README file alongside it. New retraining dockers:
- SSD
- YOLOX
- FCN
- New models:
- yolov5l
- Introducing Hailo Models, in-house pretrained networks with compatible Dockerfiles for retraining
- yolov5m_vehicles (vehicle detection)
- tiny_yolov4_license_plates (license plate detection)
- lprnet (license plate recognition)
- Added new documentation to the YAML structure
v1.5
- Remove HailoRT installation dependency.
- Retraining Dockers
- YOLOv3
- NanoDet
- CenterPose
- Yolact
- New models:
- unet_mobilenet_v2
- Support Oxford-IIIT Pet Dataset
- New multi-network example: detection_pose_estimation, which combines the following networks:
- yolov5m_wo_spp_60p
- centerpose_repvgg_a0
- Improvements:
- nanodet_repvgg mAP increased by 2%
- New Tasks:
- hand_landmark_lite from MediaPipe
- palm_detection_lite from MediaPipe
  Both tasks are provided without an evaluation module.
v1.4
- Update to use Dataflow Compiler v3.14.0 (developer-zone)
- Update to use HailoRT 4.3.0 (developer-zone)
- Introducing Hailo Models - in-house pretrained networks with compatible Dockerfiles for easy retraining:
- yolov5m_vehicles - vehicle detector based on yolov5m architecture
- tiny_yolov4_license_plates - license plate detector based on tiny_yolov4 architecture
- New Task: face landmarks detection
- tddfa_mobilenet_v1
- Support 300W-LP and AFLW2k3d datasets
- New features:
- Support compilation of several networks together - a.k.a. multinets
- CLI for printing network information
- Retraining Guide:
- New training guide for yolov4 with compatible Dockerfile
- Modifications for yolov5 retraining
v1.3
- Update to use Dataflow Compiler v3.12.0 (developer-zone)
- New task: indoor depth estimation
- fast_depth
- Support NYU Depth V2 Dataset
- New models:
- resmlp12 - new architecture support (paper)
- yolox_l_leaky
- Improvements:
- ssd_mobilenet_v1 - in-chip NMS optimization (de-fusing)
- Model Optimization API Changes
- Model Optimization parameters can be updated using the networks' model script files (*.alls)
- Deprecated: quantization params in YAMLs
- Training Guide: new training guide for yolov5 with compatible Dockerfile
v1.2
- New features:
- YUV to RGB on core can be added through YAML configuration.
- Resize on core can be added through YAML configuration.
- Support D2S Dataset
- New task: instance segmentation
- yolact_mobilenet_v1 (coco)
- yolact_regnetx_800mf_20classes (coco)
- yolact_regnetx_600mf_31classes (d2s)
- New models:
- nanodet_repvgg
- centernet_resnet_v1_50_postprocess
- yolov3 - darknet based
- yolox_s_wide_leaky
- deeplab_v3_mobilenet_v2_dilation
- centerpose_repvgg_a0
- yolov5s, yolov5m - original models from link
- yolov5m_yuv - contains resize and color conversion on HW
- Improvements:
- tiny_yolov4
- yolov4
- IBC and Equalization API change
- Bug fixes
v1.1
- Support VisDrone Dataset
- New task: pose estimation
- centerpose_regnetx_200mf_fpn
- centerpose_regnetx_800mf
- centerpose_regnetx_1.6gf_fpn
- New task: face detection
- lightfaceslim
- retinaface_mobilenet_v1
- New models:
- hardnet39ds
- hardnet68
- yolox_tiny_leaky
- yolox_s_leaky
- deeplab_v3_mobilenet_v2
- "Use your own network" manual for YOLOv3, YOLOv4_leaky and YOLOv5.
v1.0
- Initial release
- Support for object detection, semantic segmentation and classification networks
Hailo provides different pre-trained models in ONNX / TF formats and pre-compiled HEF (Hailo Executable Format) binary files to execute on the Hailo-8 device. The models are divided into:
- PUBLIC MODELS which were trained on publicly available datasets.
- HAILO MODELS which were trained in-house for specific use-cases on internal datasets. Each Hailo Model is accompanied by retraining instructions.
Hailo also provides RETRAINING INSTRUCTIONS for training a network from the Hailo Model Zoo on a custom dataset.
A list of Hailo's benchmarks can be found at hailo.ai. To reproduce the measurements, please refer to the following page.
- Install the Hailo Dataflow Compiler and enter the virtualenv. If you are not a Hailo customer, please contact hailo.ai
- Install HailoRT (optional). Required only if you want to run on the Hailo-8 device. If you are not a Hailo customer, please contact hailo.ai
- Clone the Hailo Model Zoo
git clone https://github.com/hailo-ai/hailo_model_zoo.git
- Run the setup script
cd hailo_model_zoo; pip install -e .
- Run the Hailo Model Zoo. For example, print the information of the MobileNet-v1 model:
python hailo_model_zoo/main.py info mobilenet_v1
NOTE: If you are using the Hailo Software Suite, please use the following path (<version> means 3.6/3.7/3.8):
<virtualenv_dir>/lib/python<version>/site-packages/hailo_model_zoo/main.py
For full functionality, please see the GETTING STARTED page (full install instructions and usage examples). The Hailo Model Zoo uses the Hailo Dataflow Compiler for parsing, model optimization, emulation and compilation of the deep learning models. Full functionality includes:
- Parse: model translation of the input model into Hailo's internal representation.
- Profiler: generate a profiler report of the model. The report contains information about your model and its expected performance on the Hailo hardware.
- Quantize: optimize the deep learning model for inference and generate a numeric translation of the input model into a compressed integer representation. For further information, please see our OPTIMIZATION page.
- Compile: run the Hailo compiler to generate the Hailo Executable Format (HEF) file, which can be executed on the Hailo hardware.
- Evaluate: infer the model using the Hailo Emulator or the Hailo hardware and report the model's accuracy.
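Each stage above corresponds to a Model Zoo subcommand invoked like the `info` example in the quickstart. A sketch of a typical end-to-end flow for one model (subcommand names are based on the documented CLI of this release; please verify them with `python hailo_model_zoo/main.py --help` on your installed version):

```shell
# Parse: translate the ONNX/TF model into Hailo's internal representation
python hailo_model_zoo/main.py parse mobilenet_v1

# Quantize/optimize: produce a compressed integer representation
python hailo_model_zoo/main.py optimize mobilenet_v1

# Compile: generate the HEF binary for execution on the Hailo-8
python hailo_model_zoo/main.py compile mobilenet_v1

# Evaluate: measure accuracy in the Hailo Emulator (or on device, with HailoRT installed)
python hailo_model_zoo/main.py eval mobilenet_v1
```

These commands require the Hailo Dataflow Compiler virtualenv to be active; `eval` on the Hailo-8 device additionally requires HailoRT.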
For further information about the Hailo Dataflow Compiler please contact hailo.ai.
The Hailo Model Zoo is released under the MIT license. Please see the LICENSE file for more information.
Please visit hailo.ai for support / requests / issues.