In this paper, we propose a multi-level similarity model under a Siamese framework for robust Thermal Infrared (TIR) object tracking. Specifically, we compute two pattern similarities using the proposed multi-level similarity network: one focuses on the global semantic similarity, while the other computes the local structural similarity of the TIR object. These two similarities complement each other and hence enhance the discriminative capacity of the network for handling distractors. In addition, we design a simple yet effective relative-entropy-based ensemble subnetwork to integrate the semantic and structural similarities. This subnetwork adaptively learns the weights of the semantic and structural similarities at the training stage.

Paper
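To make the ensemble idea concrete, below is a minimal MATLAB sketch of fusing the two similarity response maps with learned scalar weights. `fuse_similarities`, `w_sem`, and `w_str` are hypothetical names introduced for illustration; in the actual network, the relative-entropy-based subnetwork learns these weights end-to-end during training rather than applying them post hoc.

```matlab
% Illustrative sketch only (save as fuse_similarities.m): combine the
% global semantic and local structural similarity maps with learned weights.
function response = fuse_similarities(sem_map, str_map, w_sem, w_str)
% sem_map : global semantic similarity response map (H x W)
% str_map : local structural similarity response map (H x W)
% w_sem, w_str : scalar ensemble weights learned at the training stage

% Softmax-normalize the weights so they form a convex combination.
w = exp([w_sem, w_str]);
w = w / sum(w);

% The complementary similarities are fused into a single response map;
% its peak gives the predicted target position.
response = w(1) * sem_map + w(2) * str_map;
end
```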
- Download the proposed TIR training dataset from here.
- [News] We have extended this dataset into a larger TIR object tracking training dataset.
- Download the raw tracking results and several trained models from Baidu disk (password: b0w0) or Dubox (password: sxwq).
- Prerequisites: Ubuntu 14, MATLAB R2017a, GTX 1080, CUDA 8.0.
- Download our trained models and put them into the `src/tracking/pretrained` folder.
- Run `run_demo.m` in the `src/tracking` folder to test a TIR sequence using a default model (see the snippet after this list).
- To test other TIR sequences, please download the PTB-TIR dataset from here.
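For example, the demo can be launched from the MATLAB command window (a minimal usage sketch, assuming the repository root as the working directory):

```matlab
% Track a TIR sequence with a default pretrained model.
cd src/tracking
run_demo
```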
- Prepare your training data as described here. Note that the TIR training data uses the same format and preparation method as above.
- Configure the path of the training data in `src/training/env_path_training.m`.
- Run `src/training/run_experiment_MLSSNet.m` to train the proposed network (see the snippet after this list).
- The network architecture and trained models are saved in the `src/training/data-MLSSNet-TIR` folder.
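For example, once the data path is configured, training can be launched from the MATLAB command window (a minimal usage sketch, assuming the repository root as the working directory):

```matlab
% Train MLSSNet; the network architecture and trained models are
% written to src/training/data-MLSSNet-TIR.
cd src/training
run_experiment_MLSSNet
```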
If you use the code or dataset, please consider citing our paper.
```
@article{liu2020learning,
  title={Learning deep multi-level similarity for thermal infrared object tracking},
  author={Liu, Qiao and Li, Xin and He, Zhenyu and Fan, Nana and Yuan, Di and Wang, Hongpeng},
  journal={IEEE Transactions on Multimedia},
  year={2020}
}
```
Feedback and comments are welcome! Feel free to contact us via [email protected] or [email protected].