# densenet-161-tf

## Use Case and High-Level Description

This is a TensorFlow* version of the `densenet-161` model, one of the DenseNet group of models designed to perform image classification. The weights were converted from the DenseNet-Keras models. For details, see the repository and paper.

## Specification

| Metric           | Value          |
|------------------|----------------|
| Type             | Classification |
| GFlops           | 14.128         |
| MParams          | 28.666         |
| Source framework | TensorFlow*    |

## Accuracy

| Metric | Value   |
|--------|---------|
| Top 1  | 76.446% |
| Top 5  | 93.228% |

## Input

### Original Model

Image, name: `Placeholder`, shape: [1x224x224x3], format: [BxHxWxC], where:

- B - batch size
- H - image height
- W - image width
- C - number of channels

Expected color order: RGB. Mean values: [123.68, 116.78, 103.94], scale factor for each channel: 58.8235294.
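
As an illustrative, minimal sketch of this preprocessing (assuming Pillow and NumPy are available; the file name `example.jpg` is a placeholder, not part of this model card):

```python
import numpy as np
from PIL import Image

MEAN = np.array([123.68, 116.78, 103.94], dtype=np.float32)  # per-channel RGB mean values
SCALE = 58.8235294                                           # scale factor for each channel

# "example.jpg" is a placeholder file name used only for illustration.
img = Image.open("example.jpg").convert("RGB").resize((224, 224))
x = np.asarray(img, dtype=np.float32)   # HxWxC, RGB
x = (x - MEAN) / SCALE                  # subtract mean values, divide by scale factor
x = x[np.newaxis, ...]                  # add batch dimension: 1x224x224x3 (BxHxWxC)
```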

Converted Model

Image, name: Placeholder, shape: [1x3x224x224], format: [BxCxHxW], where:

  • B - batch size
  • C - number of channels
  • H - image height
  • W - image width

Expected color order: BGR.
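
A minimal sketch of arranging an image into this layout, again assuming Pillow and NumPy; whether mean/scale normalization is already folded into the converted model is an assumption to verify against the conversion settings for this model:

```python
import numpy as np
from PIL import Image

# "example.jpg" is a placeholder file name used only for illustration.
img = Image.open("example.jpg").convert("RGB").resize((224, 224))
x = np.asarray(img, dtype=np.float32)[..., ::-1]   # RGB -> BGR
x = x.transpose(2, 0, 1)[np.newaxis, ...]          # HxWxC -> 1x3x224x224 (BxCxHxW)
```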

## Output

### Original Model

Floating point values in the range [0, 1], which represent probabilities for classes in a dataset. Name: `densenet161/predictions/Reshape_1`.

### Converted Model

Floating point values in the range [0, 1], which represent probabilities for classes in a dataset. Name: `densenet161/predictions/Reshape_1/Transpose`, shape: [1, 1, 1, 1000].
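
A minimal sketch of interpreting this output; the `probs` array below is a random placeholder standing in for whatever the actual inference call returns:

```python
import numpy as np

# Placeholder for the real inference result of shape 1x1x1x1000.
probs = np.random.rand(1, 1, 1, 1000).astype(np.float32)

probs = probs.reshape(-1)               # flatten to 1000 class probabilities
top5 = np.argsort(probs)[-5:][::-1]     # indices of the five most probable classes
for i in top5:
    print(f"class {i}: probability {probs[i]:.4f}")
```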

## Download a Model and Convert it into Inference Engine Format

You can download models and, if necessary, convert them into the Inference Engine format using the Model Downloader and other automation tools, as shown in the examples below.

An example of using the Model Downloader:

```sh
python3 <omz_dir>/tools/downloader/downloader.py --name <model_name>
```

An example of using the Model Converter:

```sh
python3 <omz_dir>/tools/downloader/converter.py --name <model_name>
```

## Legal Information

The original model is distributed under the Apache License, Version 2.0. A copy of the license is provided in `APACHE-2.0-TF-DenseNet.txt`.