I am trying to run some filtering models for my workflow on Axelera Metis. However, the `axcompile` command doesn't allow inputs with a batch size (dim 0) greater than 1, so during inference I have to feed inputs one at a time, which introduces latency. The model is a custom one (not in the model zoo) and not a vision model.
I have tried creating four instances of the same model, one per core of the Metis AIPU, but that isn't possible due to SRAM memory limitations.
Is there a way to use Batch Size > 1 for Metis?
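For now I'm hiding some of the per-call latency on the host side by submitting single-sample requests concurrently so the device queue stays full. A minimal sketch of that pattern, where `run_inference` is a hypothetical stand-in for the actual single-sample call into the Metis runtime (not a real SDK API):

```python
from concurrent.futures import ThreadPoolExecutor

def run_inference(sample):
    # Hypothetical stand-in for one batch-size-1 call into the compiled
    # model on Metis; replace with the real runtime invocation.
    return sample * 2  # dummy compute for illustration

def infer_batch(samples, workers=4):
    # Overlap submission of single-sample requests so the device is not
    # idle between sequential calls, even though each compiled model
    # instance only accepts batch size 1.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_inference, samples))

print(infer_batch([1, 2, 3, 4]))  # order of results matches input order
```

This only overlaps host-side submission latency; it doesn't give true batched execution on the AIPU, which is why I'm asking whether the compiler itself can support batch size > 1.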
Question
How to use Batch Size > 1 for input during model compilation
