
Annoyingly I’ve not yet used the new hardware Axelera shipped to us, but I will move over soon…

If you have followed my other posts, you will have seen that I got a basic webapp running using OpenCV. At the time it was just a very simple HTML5 app.

The whole purpose of this is to check that the workout form is correct and to count the reps.

Generally, all I need to do is work out the angle between three chosen key points and track the range for a “down rep” and the range for an “up rep”. That lets me count reps and track the range of motion.
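
For anyone curious, the joint angle calculation is only a couple of lines. This is a rough sketch rather than the actual app code, and the keypoint shape is an assumption:

```ts
// Assumed 2D keypoint shape; the real pose model output may differ.
interface Keypoint {
  x: number;
  y: number;
}

// Angle at point b (in degrees) formed by the segments b->a and b->c,
// e.g. the elbow angle from shoulder (a), elbow (b) and wrist (c).
function jointAngle(a: Keypoint, b: Keypoint, c: Keypoint): number {
  const rad =
    Math.atan2(c.y - b.y, c.x - b.x) - Math.atan2(a.y - b.y, a.x - b.x);
  let deg = Math.abs((rad * 180) / Math.PI);
  if (deg > 180) deg = 360 - deg; // keep the inner joint angle
  return deg;
}
```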

Then I can compare the left and right sides: are they nearly equal? If yes, the form is probably solid. If they start to drift apart, that could show that one side is getting tired - time to stop.
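
A very rough way to express that drift check (the 10° tolerance is a placeholder, not a tuned value):

```ts
// Compare the range of motion (max - min angle) seen on each side over the
// last few reps; a growing gap suggests one side is fatiguing.
function sidesDrifting(
  leftAngles: number[],
  rightAngles: number[],
  toleranceDeg = 10 // placeholder threshold
): boolean {
  const range = (angles: number[]) =>
    Math.max(...angles) - Math.min(...angles);
  return Math.abs(range(leftAngles) - range(rightAngles)) > toleranceDeg;
}
```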

To demystify the magic: it is only simple Pythagoras and a state machine. However, I have still decided to port the app to React so everything is processed client side; the server then just worries about inference and supplying an API endpoint for the data.
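
The rep counter boils down to something like the sketch below; the thresholds are illustrative and would be tuned per exercise:

```ts
type RepPhase = "up" | "down";

// Minimal rep-counting state machine: a rep is counted each time the joint
// angle passes through the "down" range and back into the "up" range.
class RepCounter {
  private phase: RepPhase = "up";
  public reps = 0;

  constructor(
    private readonly downThreshold = 90, // below this angle -> down rep
    private readonly upThreshold = 160   // above this angle -> up rep
  ) {}

  update(angleDeg: number): void {
    if (this.phase === "up" && angleDeg < this.downThreshold) {
      this.phase = "down";
    } else if (this.phase === "down" && angleDeg > this.upThreshold) {
      this.phase = "up";
      this.reps += 1; // one full down/up cycle completed
    }
  }
}
```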

This means the Rockchip board is just an inference server, which takes a little pressure off it. Additionally, I hope it means my React frontend will stay the same and I can swap the OpenCV backend for the Voyager implementation. The benefit of that is that I can still tweak the app even when I cannot be with the Axelera hardware.
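
In practice the React side only depends on a small contract like the one below. The endpoint and JSON shape here are made up for illustration, but as long as the OpenCV and Voyager backends return the same shape, nothing client side needs to change:

```ts
// Hypothetical response shape from the inference server.
interface PoseResponse {
  keypoints: { name: string; x: number; y: number; score: number }[];
}

async function fetchKeypoints(serverUrl: string): Promise<PoseResponse> {
  // e.g. serverUrl = "http://rockchip.local:8000/pose" (illustrative only)
  const res = await fetch(serverUrl);
  if (!res.ok) {
    throw new Error(`Inference server error: ${res.status}`);
  }
  return (await res.json()) as PoseResponse;
}
```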

Question

Is it feasible to install the Voyager SDK on my Mac to help with development? I’m assuming no because there is no Metis hardware there!

Love this approach for its modularity and flexibility!

Hmm, I think you could install the SDK on the Mac for things like browsing code and editing pipelines - I can't see why not, anyway. But as you say, it wouldn't be able to run any inference without the Metis hardware.

Be interested to hear how that works out if you try it!

