This repository has been archived by the owner on Jul 18, 2024. It is now read-only.

Remove extra Transpose layers from the generated graph #152

Open
wants to merge 1 commit into base: A11_V1.3_dev

Conversation

Rjasuja (Contributor) commented Feb 17, 2022

Transpose layers are added to every operator's input and output if the layout is NHWC.
Instead, they should be added only at the beginning and end of the graph.

Signed-off-by: Ritul Jasuja [email protected]
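
For context, here is a minimal, illustrative C++ sketch of the behavior the description asks for: a single layout conversion at each graph boundary rather than one around every operator. This is not code from the PR; the Layout enum, Node struct, and buildGraph function are hypothetical stand-ins.

#include <iostream>
#include <string>
#include <vector>

enum class Layout { NHWC, NCHW };

struct Node { std::string op; };

// Build the backend graph from a list of operator names. The operators
// themselves always run in NCHW; when the model layout is NHWC, a single
// NHWC->NCHW transpose is added at the input and a single NCHW->NHWC
// transpose at the output, instead of one pair per operator.
std::vector<Node> buildGraph(const std::vector<std::string>& ops, Layout modelLayout) {
    std::vector<Node> graph;
    const bool convert = (modelLayout == Layout::NHWC);
    if (convert) graph.push_back({"Transpose NHWC->NCHW"});
    for (const auto& op : ops) graph.push_back({op});  // operators run in NCHW
    if (convert) graph.push_back({"Transpose NCHW->NHWC"});
    return graph;
}

int main() {
    for (const auto& n : buildGraph({"Conv2D", "Relu", "Conv2D"}, Layout::NHWC))
        std::cout << n.op << '\n';
}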

Nagamani71 (Contributor) commented Feb 17, 2022

@Rjasuja could you please rebase the PR? It currently includes the "001-Add-Float16-support-for-Resize-Bilinear-and-Resize-N.patch" patch.
Also, could you please check the VTS test cases below:
TestGenerated/GeneratedTest#Test
mobilenet_quantized, mobilenet_224_gender_basic_fixed
Or verify the changes with any object detection application.

I think your changes will fail to reshape the tensor when the model has multiple convolution nodes. (I could be wrong; I am unable to apply your patch and verify the changes.)

if (!useNchw) {  // No conversion needed if useNchw set
    inputNode = transpose(NHWC_NCHW, inputNode);
    transposed_nchw = true;
}
Review comment from a contributor:

Can this code be shared by multiple operations? It seems like it's needed in all operations; maybe we can have a function in the parent class?
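
A rough sketch of what that suggestion could look like, assuming a hypothetical OperationsBase parent class; transpose(), NHWC_NCHW, and NodePtr are stand-ins modelled on the snippet quoted above, not the repository's actual API.

// Hypothetical base-class helper so each operation does not repeat the
// same layout-conversion block.
class OperationsBase {
protected:
    struct NodePtr { /* backend-specific node handle (assumed) */ };
    enum ConversionType { NHWC_NCHW, NCHW_NHWC };

    bool useNchw = false;

    // Existing per-backend transpose helper (assumed), provided by subclasses.
    virtual NodePtr transpose(ConversionType type, NodePtr node) = 0;

    // Shared helper: convert the input to NCHW only when the model is NHWC.
    NodePtr maybeTransposeInput(NodePtr inputNode, bool& transposed_nchw) {
        transposed_nchw = false;
        if (!useNchw) {  // No conversion needed if useNchw set
            inputNode = transpose(NHWC_NCHW, inputNode);
            transposed_nchw = true;
        }
        return inputNode;
    }

public:
    virtual ~OperationsBase() = default;
};

Each operation could then call maybeTransposeInput(inputNode, transposed_nchw) instead of open-coding the check.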
