Question

Transforming the Attention Mechanism into Convolutions

  • April 18, 2025
  • 2 replies
  • 83 views

Hello,
I know that Axelera does not support the attention mechanism. For example, could the attention mechanism in the YOLOv11 architecture be converted to convolutions? Have you done any research or reading on this subject?

Does the Softmax structure work on Axelera? (As far as I know, it does not.) Is there a way to turn Softmax into a structure that does work?
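For illustration, this is the kind of substitution I have in mind: replacing Softmax with a normalization built only from simpler ops. The ReLU-based normalization below is just a sketch of one softmax alternative discussed in the efficient-attention literature, not something I know Axelera supports:

```python
import math

def softmax(xs):
    # Standard softmax: exponentiate and normalize to sum to 1.
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def relu_norm(xs):
    # Softmax-free alternative: clamp negatives to zero, then
    # normalize by the sum (plus eps to avoid division by zero).
    eps = 1e-9
    relus = [max(x, 0.0) for x in xs]
    s = sum(relus) + eps
    return [r / s for r in relus]

scores = [2.0, 1.0, -1.0]
print(softmax(scores))    # peaked distribution, all entries > 0
print(relu_norm(scores))  # negative score mapped to exactly 0
```

Both produce a non-negative distribution that sums to (approximately) 1, but the second uses only max, add, and divide.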

2 replies

Hi Oguz

Interesting idea! This could be something really worthwhile to experiment with. Indeed, previous YOLO models have used pure-convolution architectures to achieve great results.

If you want to experiment with building models like this, you can find our list of supported operators here: https://github.com/axelera-ai-hub/voyager-sdk/blob/59c4894b1bb4e7b17a6da92223bec5fdd2421fee/docs/reference/onnx-opset14-support.md#L4

With this, you can make sure that any customizations you make are compatible.
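As a quick sanity check, you could scan your exported ONNX graph's op types against that list before compiling. A minimal sketch, assuming the supported-op names below are placeholders (fill the set from the linked document); in practice you would gather op types with the `onnx` package, e.g. `[n.op_type for n in onnx.load("model.onnx").graph.node]`:

```python
# Sketch: flag ONNX operators that are not on the supported list.
# SUPPORTED is a placeholder set; populate it from the linked
# onnx-opset14-support.md document.
SUPPORTED = {"Conv", "Relu", "Add", "MaxPool", "Concat", "Resize"}

def unsupported_ops(op_types):
    # op_types: op_type strings collected from the model's graph nodes.
    return sorted({op for op in op_types if op not in SUPPORTED})

# Example: a hypothetical graph that still contains a Softmax node.
ops = ["Conv", "Relu", "Softmax", "Conv", "Concat"]
print(unsupported_ops(ops))  # ['Softmax']
```

Anything this reports would need to be rewritten (for example, as convolutions) before the model can be deployed.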

 


  • Axelera Team
  • April 29, 2025

Hi @oguz,

Good news: the team is working on YOLOv11 support, including the attention mechanism.

Stay tuned!