Hello,

I’m deploying an ONNX model (opset 17) and got a large number of warnings about unsupported operations. One of them is an unsupported Transpose:

WARNING : Configuration of node '/backbone/3/0/Transpose' may not be supported. 'Transpose' parameters:
WARNING : perm: [0, 3, 1, 2]
WARNING : data: Metadata(shape=(1, 48, 64, 160), is_constant=False)
WARNING : transposed: Metadata(shape=(1, 160, 48, 64), is_constant=False)
WARNING : Unsatisfied constraint: perm == [0, 1, 2, 3]

I guess the rule should have been perm != [0, 1, 2, 3], because here the node requests a non no-op permutation, perm = [0, 3, 1, 2].
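For reference, here is a minimal sketch (assuming the standard onnx Python package; "model.onnx" is just a placeholder for the exported file) that lists every Transpose node in the graph and the perm it requests:

import onnx

model = onnx.load("model.onnx")  # placeholder path
for node in model.graph.node:
    if node.op_type == "Transpose":
        for attr in node.attribute:
            if attr.name == "perm":
                # e.g. "/backbone/3/0/Transpose: perm=[0, 3, 1, 2]"
                print(f"{node.name}: perm={list(attr.ints)}")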

I think this is also causing the later error

ERROR   : RuntimeError: The expanded size of the tensor (192) must match the existing size (160) at non-singleton dimension 1.  Target sizes: [1, 192, 256, 160].  Tensor sizes: [1, 160, 1, 1]

as it looks like the transpose has indeed not been applied.
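To illustrate my reading with a toy NumPy sketch (the shapes are taken from the warning above, not from the failing layer itself, so they differ from the sizes in the error): a per-channel tensor only broadcasts if the Transpose has actually been applied.

import numpy as np

per_channel = np.zeros((1, 160, 1, 1))          # e.g. a scale/bias tensor
with_transpose = np.zeros((1, 160, 48, 64))     # layout after Transpose
without_transpose = np.zeros((1, 48, 64, 160))  # layout if Transpose is skipped

_ = with_transpose + per_channel         # fine: dimension 1 holds the 160 channels
try:
    _ = without_transpose + per_channel  # dimension 1 is 48, not 160
except ValueError as err:
    print(err)  # non-singleton dimension mismatch, as in the error above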

Could someone shed some light on this behavior?

Thanks in advance

 

In your documentation it is written:

So if my reading is correct, you indeed don’t support any Transpose other than the no-op permutation?


Hi @npi!

Yeah, as far as I know, any other permutation is unsupported at the moment, so it needs to be [0, 1, 2, 3]. Hence the mismatch error.

Could it possibly be done in pre-processing, at the start of the model?
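Something like this, as a rough host-side sketch (assuming a NumPy pre-processing pipeline; preprocess is just a hypothetical name). It would only help if the Transpose sits right at the graph input, though:

import numpy as np

def preprocess(frame: np.ndarray) -> np.ndarray:
    # Apply the NHWC -> NCHW permutation on the host, before inference,
    # so the compiled graph no longer needs its own Transpose node.
    return np.ascontiguousarray(np.transpose(frame, (0, 3, 1, 2)))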


Hi @Spanner,

The model (by default) uses LayerNorm, which needs NHWC, while Conv2d needs NCHW.
So I guess I’m stuck, or I need to retrain the model with BatchNorm2d, which may be less efficient.
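To illustrate where the Transpose comes from, here is a minimal PyTorch sketch of the pattern (the actual backbone may differ): Conv2d works on NCHW, but LayerNorm normalizes the trailing dimension, so the channel axis has to be permuted to the end and back around every normalization.

import torch
import torch.nn as nn

class ConvLayerNormBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.conv(x)              # NCHW, as Conv2d requires
        x = x.permute(0, 2, 3, 1)     # NCHW -> NHWC, exported as a Transpose
        x = self.norm(x)              # normalizes the trailing channel dim
        return x.permute(0, 3, 1, 2)  # NHWC -> NCHW, the perm=[0, 3, 1, 2] from the warning

Swapping self.norm for nn.BatchNorm2d(channels) would keep the whole block in NCHW and remove both permutes, which is why retraining looks like the fallback.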

Thanks anyway for your suggestion


Hi @npi, thanks for your question. LayerNorm is currently not supported, but the compiler team is working on it.
We’ll keep you posted about the progress.
 


Ah, okay. This would make a good feature request in the Launchpad, @npi!


Thanks @Bram Verhoef

As the compiler team is working on LayerNorm, I would guess they are also working on an unconstrained Transpose, because the NHWC format would probably make LayerNorm much faster and likely much easier to implement.

Looking forward to hearing from you.

Best Regards

