Question

Trying to build the axstreamer and gstaxstreamer components

  • August 6, 2025
  • 2 replies
  • 49 views

Hi Team,

I’m working with the Voyager SDK (in the Docker container) and trying to build the axstreamer and gstaxstreamer components. The example source files reference:

#include <axruntime/axruntime.hpp> (examples/axruntime/axruntime_example.cpp)

But I can’t find axruntime.hpp anywhere in the SDK.

I’m planning to run inference using a webcam as the input on my Mac, with a pipeline that includes ffmpeg and mediamtx for streaming.
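For context, a webcam-to-mediamtx pipeline on macOS might be sketched as below. This is a rough outline, not part of the SDK: the device index, the RTSP port (8554 is mediamtx's default), and the path name "cam" are all assumptions to adjust for your setup.

```shell
# First list the capture devices macOS exposes via avfoundation,
# so you know which index your webcam has:
ffmpeg -f avfoundation -list_devices true -i ""

# Then capture that device and publish it to a running mediamtx
# instance over RTSP (device index "0" and path "cam" are assumed):
ffmpeg -f avfoundation -framerate 30 -i "0" \
  -c:v libx264 -preset ultrafast -tune zerolatency \
  -f rtsp rtsp://localhost:8554/cam
```

Anything that can consume RTSP can then read the stream back from rtsp://localhost:8554/cam.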

Is axruntime supposed to be part of the SDK repo, a separate dependency, or is there a script I missed?

Thanks!

Val.


  • Axelera Team
  • August 7, 2025

Hi Val

If you run make examples, it will build some example C++ applications. I suggest looking at the higher-level interface, AxInferenceNet, which you will find much easier to integrate and also more performant, because it uses OpenCL for preprocessing (if available) and pipelines all of the operations to improve utilisation. AxInferenceNet is implemented on top of axruntime, so if make examples succeeds, it has found the axruntime headers via pkg-config.
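As an aside on the pkg-config step: the sketch below shows how a compiler's include flags get resolved from a .pc file. The axruntime.pc contents here are fabricated purely for illustration; the real file ships with the SDK and its name, paths, and version will differ.

```shell
# Create a minimal, hypothetical axruntime.pc in a scratch directory
# to demonstrate the pkg-config lookup mechanism:
mkdir -p /tmp/pc-demo
cat > /tmp/pc-demo/axruntime.pc <<'EOF'
prefix=/opt/axelera
includedir=${prefix}/include

Name: axruntime
Description: Axelera runtime (hypothetical .pc for illustration)
Version: 1.3
Cflags: -I${includedir}
EOF

# Query it: pkg-config expands the Cflags line into the -I path
# a build system would pass to the compiler.
PKG_CONFIG_PATH=/tmp/pc-demo pkg-config --cflags axruntime
# Prints: -I/opt/axelera/include
```

If make examples builds cleanly, the SDK's real axruntime.pc is already on the pkg-config search path and this lookup has happened for you.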

https://github.com/axelera-ai-hub/voyager-sdk/blob/release/v1.3/examples/axinferencenet/axinferencenet_example.cpp is a good starting point. There are two other examples as well: one for a cascaded pipeline, and one for accessing raw tensor output.

A tutorial introducing AxInferenceNet is here: https://github.com/axelera-ai-hub/voyager-sdk/blob/release/v1.3/docs/tutorials/axinferencenet.md

And reference docs are here: https://github.com/axelera-ai-hub/voyager-sdk/blob/release/v1.3/docs/reference/axinferencenet.md

Sam


  • Author
  • Cadet
  • August 7, 2025

Thanks Sam, I will give that a go!