
Hi Team,

I’m working with the Voyager SDK (using the Docker container) and trying to build the axstreamer and gstaxstreamer components. The example source files reference:

#include <axruntime/axruntime.hpp> (examples/axruntime/axruntime_example.cpp)

But I can’t find axruntime.hpp anywhere in the SDK.

I’m planning to run inference using a webcam as the input on my Mac, with a pipeline that includes ffmpeg and mediamtx for streaming.
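For context, the streaming side would look roughly like this — just a sketch, assuming mediamtx is running with its default RTSP port (8554) and that avfoundation device index 0 is the built-in webcam (the device list can be checked with ffmpeg -f avfoundation -list_devices true -i ""):

# start mediamtx with its default config (RTSP on :8554)
./mediamtx &

# capture the Mac webcam via avfoundation and publish it to mediamtx over RTSP
ffmpeg -f avfoundation -framerate 30 -video_size 1280x720 -i "0" \
  -c:v libx264 -preset ultrafast -tune zerolatency \
  -f rtsp rtsp://localhost:8554/webcam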

Is axruntime supposed to be part of the SDK repo, a separate dependency, or is there a script I missed?

Thanks!

Val.

Hi Val,

If you run make examples, it will build some example C++ applications. I suggest you look at using the higher-level interface AxInferenceNet, which you will find much easier to integrate and which also gives much more performant inference, because it utilises OpenCL for preprocessing (if available) and pipelines all of the operations to improve utilisation. AxInferenceNet is implemented using axruntime, so if make examples works, it has found the axruntime headers via pkg-config.
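As a rough sketch (assuming the pkg-config package is named axruntime — it may differ in your SDK version, and pkg-config --list-all inside the container will show the real name; my_app.cpp is just a placeholder), you can check that the headers resolve and build against them like this:

# build the bundled C++ examples from the SDK container
make examples

# confirm pkg-config can resolve the axruntime headers and libs
pkg-config --cflags --libs axruntime

# compile a standalone file against it
g++ -std=c++17 my_app.cpp $(pkg-config --cflags --libs axruntime) -o my_app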

https://github.com/axelera-ai-hub/voyager-sdk/blob/release/v1.3/examples/axinferencenet/axinferencenet_example.cpp is a good start. There are two other examples as well: one for a cascaded pipeline, and one for accessing raw tensor output.

A tutorial introducing AxInferenceNet is here: https://github.com/axelera-ai-hub/voyager-sdk/blob/release/v1.3/docs/tutorials/axinferencenet.md

And the reference docs are here: https://github.com/axelera-ai-hub/voyager-sdk/blob/release/v1.3/docs/reference/axinferencenet.md

Sam


Thanks Sam, I will give that a go!

