Google lets you test body tracking AI with "Move Mirror"
The body-tracking part of the "mirror" is powered by PoseNet, a machine learning model that detects and tracks 17 keypoints on a person's body: 12 limb joints and 5 points on the head. Your detected pose is then matched against the site's image database, and the closest match is displayed next to your webcam feed.
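To give a rough idea of how that matching step can work, here is a minimal sketch in plain JavaScript: each pose becomes a flat vector of keypoint coordinates, scaled to a unit bounding box and L2-normalized so that poses can be compared by cosine similarity regardless of where the person stands in the frame. The function names, the data shapes, and the similarity measure are all assumptions for illustration, not Move Mirror's actual code.

```javascript
// Normalize a pose: shift/scale keypoints into a unit bounding box,
// flatten to [x0, y0, x1, y1, ...], then L2-normalize the vector.
// NOTE: illustrative sketch only; PoseNet's real output also carries
// per-keypoint confidence scores, which are ignored here.
function normalizePose(keypoints) {
  const xs = keypoints.map(k => k.x);
  const ys = keypoints.map(k => k.y);
  const minX = Math.min(...xs), minY = Math.min(...ys);
  const w = Math.max(...xs) - minX || 1; // avoid division by zero
  const h = Math.max(...ys) - minY || 1;
  const v = [];
  for (const k of keypoints) {
    v.push((k.x - minX) / w, (k.y - minY) / h);
  }
  const norm = Math.hypot(...v) || 1;
  return v.map(x => x / norm);
}

// With both vectors L2-normalized, cosine similarity is a dot product.
function cosineSimilarity(a, b) {
  return a.reduce((sum, ai, i) => sum + ai * b[i], 0);
}

// Linear scan over a (hypothetical) database of pre-recorded poses,
// returning the entry most similar to the query pose.
function bestMatch(queryKeypoints, database) {
  const q = normalizePose(queryKeypoints);
  let best = null, bestScore = -Infinity;
  for (const entry of database) {
    const score = cosineSimilarity(q, normalizePose(entry.keypoints));
    if (score > bestScore) { bestScore = score; best = entry; }
  }
  return { best, score: bestScore };
}
```

Because the vectors are normalized for position and scale, the same pose struck closer to or farther from the camera still maps to (nearly) the same vector, which is what makes a simple similarity search workable.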
Now, before you say it's all part of Google's master plan to learn everything about us: Move Mirror uses TensorFlow.js, and all of its operations run entirely in your device's browser. No data leaves your computer or smartphone, and nothing is sent to Google's servers. The model is open source, so if you're interested, you can peek inside to see how it's done.
We tested it ourselves, and despite it occasionally losing track of a wrist, it worked as advertised and provided several minutes of entertainment. The site has a handy "Make a GIF" button, so you can share the goofiness with your friends.