hi, pretty excited to be selected among the 10 community projects for the challenge.
since i'm generally pretty edge-AI pilled, i want to toy around and build a proof of concept for a small desk appliance upgrade. i'll try to get a hand-pose-detection model running on the Metis so that certain hand movements/poses trigger actions, e.g. taking a screenshot (which later on could be processed fully offline on the Metis with OCR as well). the assumption i want to test: i imagine this could be an intuitive interface for certain 'super commands'.
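just to sketch what i mean by pose → super command: here's a minimal toy in plain Python, assuming the detector outputs 21 (x, y) hand keypoints (MediaPipe-style indexing); the pose rule and the command mapping are hypothetical placeholders, not the actual pipeline.

```python
# Toy sketch: classify a simple hand pose from 21 (x, y) keypoints and
# map it to a 'super command'. Assumes MediaPipe-style landmark indices;
# the command names below are hypothetical placeholders.

FINGERTIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]  # the corresponding PIP joints

def classify_pose(landmarks):
    """Return 'open_palm', 'fist', or 'unknown'.

    landmarks: list of 21 (x, y) tuples in image coordinates (y grows
    downward); the hand is assumed upright, so an extended fingertip
    sits above (smaller y than) its PIP joint.
    """
    extended = [landmarks[tip][1] < landmarks[pip][1]
                for tip, pip in zip(FINGERTIPS, FINGER_PIPS)]
    if all(extended):
        return "open_palm"
    if not any(extended):
        return "fist"
    return "unknown"

# hypothetical pose → command table
COMMANDS = {"open_palm": "take_screenshot", "fist": "noop"}

def dispatch(landmarks):
    """Look up which command a detected pose should trigger."""
    return COMMANDS.get(classify_pose(landmarks), "noop")
```

in practice the keypoints would come from whatever hand model ends up running on the Metis, and the rule-based classifier would probably need smoothing over a few frames to avoid false triggers.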
since there is already a similar project for home automation, i want to focus on improving a work desk environment. i don't know whether i'll cover more than one or two triggers within the challenge's time horizon, but if the assumption holds true it seems like a fun project to pursue further, since voice-to-text and other modalities could be incorporated as well. let's see
but the first steps now are to familiarise myself with the SDK and try to get a hand keypoint detection model running on the Metis.
open for recommendations and ideas of course(:

