Hello!
I am running into problems trying to run network inference on the Metis M.2 with a Raspberry Pi 5 via Docker. I followed the steps in the guide (https://support.axelera.ai/hc/en-us/articles/26362016484114-Bring-up-Voyager-SDK-in-Raspberry-Pi-5), but when I run inference.py I get the following errors:
(venv) root@iak:/home/voyager-sdk# ./inference.py yolov8s-coco-onnx ./media/traffic1_480p.mp4
INFO : Using device metis-1:1:0
WARNING : Failed to get OpenCL platforms : clGetPlatformIDs failed: PLATFORM_NOT_FOUND_KHR
WARNING : Please check the documentation for installation instructions
INFO : Default OpenGL backend gl,3,3 overridden, using gles,3,1
INFO : Network type: NetworkType.SINGLE_MODEL
INFO : Input
INFO : └─detections
Stream Paused: 57%|██████████████████████████████████████████████████████▊ | 4/7 [00:00<00:00, 5.56/s][ERROR][axeWaitForCommandList]: Uio wait kernel failed with return code -1406.
[ERROR][axeCommandQueueExecuteCommandListsAsync]: Waiting for command lists failed: 0x70010001.
terminate called after throwing an instance of 'std::runtime_error'
what(): axr_run_model failed with Error at zeCommandQueueExecuteCommandLists(cmdqueue, n_cmdlists, cmdlists, nullptr): cmdqueue_run_cmdlists: 309: Exit with error code: 0x70010001 : ZE_RESULT_ERROR_NOT_AVAILABLE
Aborted (core dumped)
(venv) root@iak:/home/voyager-sdk#
Before this, I had tried the accelerator on my laptop, also via Docker, and it ended with the same result. What does this error mean, and how can I troubleshoot it? Could it be an accelerator firmware issue?
I’ve also tried axdevice --refresh, but that didn’t help either.
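For completeness, here is a sketch of the generic Linux checks I can run to gather more context before assuming a firmware problem. These are standard tools (lspci, dmesg), not Axelera-specific commands, and the grep keywords are my own guesses at what might be relevant:

```shell
#!/bin/sh
# Generic Linux diagnostics to gather context about the failure.
# The search patterns ('axelera', 'metis', 'uio', 'pcie') are guesses
# at relevant keywords, not documented identifiers.

# Does the Metis M.2 card still enumerate on the PCIe bus?
lspci 2>/dev/null | grep -i -E 'axelera|metis' \
    || echo "no matching PCIe device listed"

# Any kernel messages from the UIO driver or the PCIe layer around the crash?
dmesg 2>/dev/null | grep -i -E 'uio|pcie' | tail -n 20 || true
```

If the card disappears from lspci after the crash, or dmesg shows PCIe link errors, that would point at a hardware/driver issue rather than the SDK.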