A pair of AI developers turned a $10 webcam into a motion-tracking system. It’s like a DIY Kinect with a Google brain. And best of all, they named it Skeletron.
Motion-tracking usually requires expensive cameras, high-end computers, and someone willing to wear a skin-tight bodysuit with those silly plastic balls Velcroed all over it. With this project, though, all you need is that webcam you’ve had buried in your junk drawer for half a decade:
Using AI for motion-tracking isn’t exactly new; Microsoft’s Kinect was an early example of a consumer product that used machine learning. However, it required several sensors, cost $150, and has since been discontinued.
Fleischer and Ayalon’s AI is experimental, but it’s still exciting to see real-time motion tracking happen at all with cheap hardware, especially considering it outputs to a video-game engine already popular with AR and VR developers.
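The developers haven’t published their pipeline in detail, but the general idea behind this kind of system is straightforward: run a pose-estimation model (such as one built on TensorFlow) on each webcam frame, then convert the detected 2D keypoints into skeleton data a game engine can animate. As a minimal, purely hypothetical sketch of that second step, the helper below computes the angle at a joint from three keypoints of the kind pose models typically emit:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by 2D keypoints a-b-c.

    Keypoints are (x, y) pairs, e.g. shoulder-elbow-wrist as output
    by a pose-estimation model on each webcam frame. (Hypothetical
    helper for illustration; not from the Skeletron source.)
    """
    # Vectors pointing from the joint out to its two neighbors.
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# A straight arm (shoulder, elbow, wrist in a line) reads ~180 degrees;
# a right-angle bend at the elbow reads 90.
print(round(joint_angle((0, 0), (1, 0), (2, 0))))  # 180
print(round(joint_angle((0, 0), (1, 0), (1, 1))))  # 90
```

Per-joint angles like these are one simple way to drive a rigged character in an engine like Unity without any specialized capture hardware.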
A software solution for motion-tracking based on open-source AI and dirt-cheap webcams could revolutionize a myriad of industries, not just gaming. It could enable medical professionals to analyze movement and gait without highly specialized hardware, which could improve the study of neurological disorders and help with orthopedic rehabilitation.
And, if you watch the video all the way to the end, you’ll see it’s also a cool way to visualize dance moves in real time. It’s always a party when you bring TensorFlow and a webcam.