July 2022 - August 2022
Interactive body tracking app to encourage movement in children. Created in collaboration with a summer kids group in Coventry, UK.
Unity, machine learning, body tracking
This project took place alongside a series of kids' workshops, with the aim of developing iteratively each week so the kids could test the app and give feedback. This unique approach to development was invaluable and ended up changing the course of the app for the better, leaning more into explorative, sandbox-style gameplay.
This was also my first time working with children. The sessions consisted of a couple of hours of dance and movement, where I acted as support staff, followed by a couple of hours of digital and game design work. I led workshops using Bloxels, a fun game-making kit that lets kids design their own characters and worlds, then scan them with a phone camera for use in a game. I also supervised VR sessions where they played the rhythm game Beat Saber.
The aim of this project was to create a Just Dance-like body tracking system that was as accessible as possible, which meant avoiding external hardware such as an Xbox Kinect. I settled on a machine learning model from NatML called MoveNet, which predicts 17 distinct keypoints on the user's body from nothing more than a video feed. I then used these tracked points to let the user interact with the game entirely physically, without pressing any buttons or tapping any screens.
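To give a feel for how keypoint output drives interaction, here is a minimal Python sketch (the real project used C# in Unity). It assumes MoveNet's usual 17-keypoint output of normalized (x, y) positions plus a confidence score; the keypoint order and the confidence threshold are illustrative, not taken from the project's code.

```python
# Hypothetical sketch: map MoveNet-style normalized keypoints to
# screen coordinates so hands, feet, and head can drive the game.
KEYPOINT_NAMES = [
    "nose", "left_eye", "right_eye", "left_ear", "right_ear",
    "left_shoulder", "right_shoulder", "left_elbow", "right_elbow",
    "left_wrist", "right_wrist", "left_hip", "right_hip",
    "left_knee", "right_knee", "left_ankle", "right_ankle",
]

def to_screen(keypoints, width, height, min_confidence=0.3):
    """Convert normalized (x, y, confidence) keypoints to pixel
    coordinates, dropping low-confidence joints so tracking jitter
    doesn't leak into the gameplay."""
    screen = {}
    for name, (x, y, conf) in zip(KEYPOINT_NAMES, keypoints):
        if conf >= min_confidence:
            screen[name] = (x * width, y * height)
    return screen
```

In-engine, each surviving joint position would then be fed to a particle emitter or collider rather than returned as a dictionary.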
I developed two distinct parts. The first was a pose-matching game designed to run users through a series of pre-defined movements. This worked reasonably well, but the model's limited precision caused issues and could sometimes frustrate the user. The second was a 2D avatar overlay with simple particle and trail effects on the hands, feet, and head. The kids found this system fun, and all of them naturally started to move and dance upon seeing the effects, trying to throw the particles around and draw shapes with the trails.