This is the final submission post for the Summer Sidekick project.
I’m really glad I had the opportunity to work on it, have fun during the process, and interact with the community along the way.
All details about each component, the system architecture, lessons learned, challenges, and future improvements are available here:
👉 GitHub Repository
Below, I’m sharing a short video (still without audio — I’m looking for an easy-to-use app for adding narration). The demo is fairly self-explanatory, but here’s a quick breakdown:
- **Automatic food dispenser**
  - At the start of the video, you can see the bowl empty.
  - When the bowl is empty, the algorithm waits about one minute before triggering the food dispenser.
  - Take a look at Live Camera 0.
  - You can also observe the monitoring status updating as the bowl becomes full.
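The one-minute wait before dispensing is essentially a debounce on the bowl-state classifier, so a single noisy "empty" frame doesn't trigger a refill. The real implementation lives in the repository; here is a minimal sketch of that idea, where the class name, the `wait_s` parameter, and the per-frame `update()` interface are my own assumptions:

```python
import time

EMPTY_WAIT_S = 60  # assumed threshold: ~1 minute of sustained "empty"

class DispenserDebounce:
    """Fire the dispenser only after the bowl has stayed empty for wait_s seconds."""

    def __init__(self, wait_s=EMPTY_WAIT_S):
        self.wait_s = wait_s
        self.empty_since = None  # timestamp of the first "empty" reading, or None

    def update(self, bowl_empty, now=None):
        """Feed one classifier reading; return True when the dispenser should fire."""
        now = time.monotonic() if now is None else now
        if not bowl_empty:
            self.empty_since = None   # bowl has food again: reset the timer
            return False
        if self.empty_since is None:
            self.empty_since = now    # first "empty" frame: start timing
            return False
        if now - self.empty_since >= self.wait_s:
            self.empty_since = None   # fire once, then re-arm
            return True
        return False
```

Resetting the timer whenever the bowl is seen non-empty means one good frame cancels the countdown, which keeps a flickering detection from causing spurious dispenses.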
- **Pet activity detection**
  - In the second part of the video, detection is triggered when the cat walks in front of the camera.
  - (Now I just need to motivate my daughter’s cat to cooperate 😅.)
  - When the cat leaves the frame, the activity period is recorded, so the last activity is always available.
  - You can also see the labeled images produced by all the models deployed on Metis.
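Recording the activity period amounts to opening an interval when the cat first appears and closing it when the cat is no longer detected. A minimal sketch of that bookkeeping, with the class and method names being my own assumptions (the actual logic is in the repository):

```python
import time

class ActivityTracker:
    """Turn per-frame cat detections into (start, end) activity periods."""

    def __init__(self):
        self.current_start = None  # when the ongoing activity began, or None
        self.last_period = None    # (start, end) of the most recent activity

    def update(self, cat_detected, now=None):
        """Feed one detection result; return the last completed activity period."""
        now = time.monotonic() if now is None else now
        if cat_detected and self.current_start is None:
            self.current_start = now                       # cat entered the frame
        elif not cat_detected and self.current_start is not None:
            self.last_period = (self.current_start, now)   # cat left: close the period
            self.current_start = None
        return self.last_period
```

Keeping only the most recent period is enough to answer "when was the cat last active?"; persisting every period would be a straightforward extension.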
For the full process, you can revisit my previous posts: