This is the final submission post for the Summer Sidekick project.
I’m really glad I had the opportunity to work on it, have fun during the process, and interact with the community along the way.

All details about each component, the system architecture, lessons learned, challenges, and future improvements of the project are available here:
👉 GitHub Repository

Below, I’m sharing a short video (still without audio — I’m looking for an easy-to-use app for adding narration). The demo is fairly self-explanatory, but here’s a quick breakdown:

  1. Automatic food dispenser

  • At the start of the video, you can see that the bowl is empty.

    • When the bowl is empty, the algorithm waits about one minute before triggering the food dispenser.

    • Take a look at Live Camera 0.

    • You can also observe the monitoring status updating as the bowl becomes full.

  2. Pet activity detection

    • In the second part of the video, detection is triggered when the cat walks in front of the camera.

    • (Now I just need to motivate my daughter’s cat to cooperate 😅.)

  • When the cat leaves the frame, the activity period is recorded, so the time of the last activity is available.
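For anyone curious how the two behaviours above might be wired up, here is a minimal Python sketch. The class names, method signatures, and the one-minute threshold are my assumptions for illustration, not the project's actual code:

```python
import time

EMPTY_DELAY_S = 60  # assumed: bowl must stay empty ~1 minute before dispensing


class FoodDispenser:
    """Trigger a refill only after the bowl has been empty for a while."""

    def __init__(self, delay_s=EMPTY_DELAY_S, clock=time.monotonic):
        self.delay_s = delay_s
        self.clock = clock  # injectable clock makes the logic testable
        self.empty_since = None

    def update(self, bowl_empty):
        """Feed one detection result; return True when a refill should fire."""
        if not bowl_empty:
            self.empty_since = None  # bowl refilled (or detection flickered)
            return False
        if self.empty_since is None:
            self.empty_since = self.clock()  # start the countdown
            return False
        if self.clock() - self.empty_since >= self.delay_s:
            self.empty_since = None  # reset so we don't re-trigger at once
            return True
        return False


class ActivityTracker:
    """Record an activity period from when the cat enters the frame to when it leaves."""

    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.start = None
        self.last_period = None  # (start, end) of the most recent activity

    def update(self, cat_detected):
        if cat_detected and self.start is None:
            self.start = self.clock()  # cat walked in front of the camera
        elif not cat_detected and self.start is not None:
            # cat left; the completed period is now available for monitoring
            self.last_period = (self.start, self.clock())
            self.start = None
```

In a real pipeline, `update()` would be called once per inference frame with the model's detection result; the injectable `clock` parameter is just there to keep the timing logic unit-testable.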

You can also see labeled output images from all the models running on Metis.

For the full process, you can revisit my previous posts:

Awesome work on this @mcunha! In all honesty, I wasn’t sure you’d get every aspect of this great project up and running, but everything’s in there! 👍

And this is one of those kinds of projects I really need at home… my cats would really put it to the test 😆 


Thanks @Spanner! The last two weeks were intense—I learned a lot and had a lot of fun during this challenge. The high level of the other projects was really motivating and pushed me to do my best!


Hi @mcunha,

Congratulations on your awesome submission! I’m really impressed with the customizations you’ve done and the possibilities are endless. On the vision pipeline front you’ve got your hands dirty with training a model from scratch with your own labelled data, extending a big model like YOLO11 with custom classes and running multiple models in parallel on Metis. On the hardware setup you’ve also left no stone unturned with custom boards and neat integrations into physical objects to “make things happen”. 

Thanks for the fantastic engagement, and I wish you luck with future projects!

Best,

Radhika