From e7c014d701416dd014c305cb0d94c1781d8f9490 Mon Sep 17 00:00:00 2001
From: joaqo
Date: Fri, 12 Feb 2021 00:30:01 -0300
Subject: [PATCH 01/14] Remove motmetrics (which adds pandas) and opencv as
 dependencies

Add them as extra dependencies, adjust code to return pretty error messages
when imports fail, and update documentation.
---
 README.md                          | 10 +++++++++-
 demos/alphapose/README.md          |  5 +++--
 demos/detectron2/README.md         |  5 +++--
 demos/motmetrics4norfair/README.md |  7 ++++---
 demos/openpose/README.md           |  5 +++--
 demos/yolov4/README.md             |  5 +++--
 norfair/drawing.py                 |  7 +++++--
 norfair/metrics.py                 |  6 +++++-
 norfair/utils.py                   | 11 +++++++++++
 norfair/video.py                   |  7 +++++--
 pyproject.toml                     | 12 ++++++++----
 11 files changed, 59 insertions(+), 21 deletions(-)

diff --git a/README.md b/README.md
index 52e213d5..28a8831a 100644
--- a/README.md
+++ b/README.md
@@ -20,12 +20,20 @@ Norfair is built, used and maintained by [Tryolabs](https://tryolabs.com).
 
 ## Installation
 
-Norfair currently supports Python 3.6+.
+Norfair currently supports Python 3.7+.
+For the pure Python version, which works anywhere, install with:
 
 ```bash
 pip install norfair
 ```
 
+For more features, install with:
+```bash
+pip install norfair[metrics]        # Supports running MOT metrics evaluation
+pip install norfair[video]          # Adds several video helper features running on OpenCV
+pip install norfair[metrics,video]  # Everything included
+```
+
 ## How it works
 
 Norfair works by estimating the future position of each point based on its past positions. It then tries to match these estimated positions with newly detected points provided by the detector. For this matching to occur, Norfair can rely on any distance function specified by the user of the library. Therefore, each object tracker can be made as simple or as complex as needed.
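The `pyproject.toml` hunk counted in the diffstat (12 lines changed) is not included in this chunk. Assuming the project uses Poetry, the `metrics` and `video` extras advertised in the README would plausibly be declared along these lines; the package names, version bounds, and layout below are illustrative assumptions, not taken from the patch:

```toml
# Hypothetical sketch of the extras declaration; versions are illustrative.
[tool.poetry.dependencies]
python = "^3.7"
opencv-python = { version = ">=3.2.0", optional = true }
motmetrics = { version = ">=1.2.0", optional = true }

[tool.poetry.extras]
video = ["opencv-python"]
metrics = ["motmetrics"]
```

With extras declared this way, `pip install norfair[video]` pulls in the optional OpenCV dependency while a plain `pip install norfair` stays dependency-light.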
diff --git a/demos/alphapose/README.md b/demos/alphapose/README.md
index fc2439a7..c8b51a0f 100644
--- a/demos/alphapose/README.md
+++ b/demos/alphapose/README.md
@@ -4,8 +4,9 @@ An example of how to integrate Norfair into the video inference loop of a pre ex
 
 ## Instructions
 
-1. [Follow the instructions](https://github.com/MVIG-SJTU/AlphaPose/tree/pytorch#installation) to install the Pytorch version of AlphaPose.
-2. Apply this diff to this [commit](https://github.com/MVIG-SJTU/AlphaPose/commit/ded84d450faf56227680f0527ff7e24ab7268754) on AlphaPose and use their [video_demo.py](https://github.com/MVIG-SJTU/AlphaPose/blob/ded84d450faf56227680f0527ff7e24ab7268754/video_demo.py) to process your video.
+1. Install Norfair with `pip install norfair[video]`.
+2. [Follow the instructions](https://github.com/MVIG-SJTU/AlphaPose/tree/pytorch#installation) to install the PyTorch version of AlphaPose.
+3. Apply this diff to this [commit](https://github.com/MVIG-SJTU/AlphaPose/commit/ded84d450faf56227680f0527ff7e24ab7268754) on AlphaPose and use their [video_demo.py](https://github.com/MVIG-SJTU/AlphaPose/blob/ded84d450faf56227680f0527ff7e24ab7268754/video_demo.py) to process your video.
 
 ```diff
 diff --git a/dataloader.py b/dataloader.py
diff --git a/demos/detectron2/README.md b/demos/detectron2/README.md
index 026bace4..20a8786d 100644
--- a/demos/detectron2/README.md
+++ b/demos/detectron2/README.md
@@ -6,8 +6,9 @@ Simplest possible example of tracking. Based on [Detectron2](https://github.com/
 
 Assuming Norfair is installed:
 
-1. [Follow the instructions](https://detectron2.readthedocs.io/tutorials/install.html) to install Detectron2.
-2. Run `python detectron2_cars.py`. For the demo, we are using [this traffic footage](https://www.youtube.com/watch?v=aio9g9_xVio).
+1. Install Norfair with `pip install norfair[video]`.
+2. [Follow the instructions](https://detectron2.readthedocs.io/tutorials/install.html) to install Detectron2.
+3. Run `python detectron2_cars.py`. For the demo, we are using [this traffic footage](https://www.youtube.com/watch?v=aio9g9_xVio).
 
 ## Explanation
 
diff --git a/demos/motmetrics4norfair/README.md b/demos/motmetrics4norfair/README.md
index 814defcc..93511e6b 100644
--- a/demos/motmetrics4norfair/README.md
+++ b/demos/motmetrics4norfair/README.md
@@ -3,7 +3,8 @@ Demo on how to evaluate a Norfair tracker on the [MOTChallenge](https://motchall
 
 ## Instructions
 
-1. Download the [MOT17](https://motchallenge.net/data/MOT17/) dataset running
+1. Install Norfair with `pip install norfair[metrics,video]`.
+2. Download the [MOT17](https://motchallenge.net/data/MOT17/) dataset by running
 
 ```bash
 curl -O https://motchallenge.net/data/MOT17.zip  # To download Detections + Ground Truth + Images (5.5GB)
@@ -18,7 +19,7 @@ unzip MOT17Labels.zip
 
 Given that the ground truth files for the testing set are not publicly available, you will only be able to use motmetrics4norfair with the training set.
 
-2. Display the motmetrics4norfair instructions:
+3. Display the motmetrics4norfair instructions:
 ```bash
 python motmetrics4norfair.py --help
-```
\ No newline at end of file
+```
diff --git a/demos/openpose/README.md b/demos/openpose/README.md
index 51ff7be7..ed332b86 100644
--- a/demos/openpose/README.md
+++ b/demos/openpose/README.md
@@ -4,8 +4,9 @@ Demo for extrapolating detections through skipped frames. Based on [OpenPose](ht
 
 ## Instructions
 
-1. Install [OpenPose version 1.4](https://github.com/CMU-Perceptual-Computing-Lab/openpose/releases/tag/v1.4.0).
-2. Run `python openpose_extrapolation.py`.
+1. Install Norfair with `pip install norfair[video]`.
+2. Install [OpenPose version 1.4](https://github.com/CMU-Perceptual-Computing-Lab/openpose/releases/tag/v1.4.0).
+3. Run `python openpose_extrapolation.py`.
 
 ## Explanation
 
diff --git a/demos/yolov4/README.md b/demos/yolov4/README.md
index cbcd7e61..a2960edf 100644
--- a/demos/yolov4/README.md
+++ b/demos/yolov4/README.md
@@ -4,8 +4,9 @@ Simplest possible example of tracking. Based on [pytorch YOLOv4](https://github.
 
 ## Instructions
 
-1. Clone [pytorch YOLOv4](https://github.com/Tianxiaomo/pytorch-YOLOv4/tree/master) and download the [weights](https://drive.google.com/open?id=1wv_LiFeCRYwtpkqREPeI13-gPELBDwuJ) published on the repo into your local clone of the repo.
-2. Copy `yolov4demo.py` into your local clone of the repo and run `python yolov4demo.py
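The diffstat shows `norfair/utils.py` gaining 11 lines to "return pretty error messages when imports fail", but that hunk is not part of this chunk. A minimal sketch of such a guard, assuming a helper of this shape exists (the name `require_extra` and the message wording are illustrative, not taken from the patch):

```python
import importlib


def require_extra(module_name: str, extra: str):
    """Import an optional dependency, or fail with an actionable message.

    Sketch of the kind of guard this patch describes for norfair/utils.py;
    the function name and error text here are assumptions.
    """
    try:
        return importlib.import_module(module_name)
    except ImportError as error:
        raise ImportError(
            f"{module_name!r} is required for this feature but is not installed. "
            f"Install it with: pip install norfair[{extra}]"
        ) from error
```

A module such as `norfair/video.py` could then call something like `cv2 = require_extra("cv2", "video")` at import time, so users who installed the bare package get a hint about the missing extra instead of a raw `ImportError`.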