p5.js link

This project tracks hand gestures through a webcam and controls a color panel on screen.

The system combines a trained regression model for openness and a heuristic geometric calculation for hue.
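The exact hue formula isn't spelled out here, but as a sketch of the kind of direct geometric calculation involved, one option is to map the angle from the wrist to the middle fingertip onto the hue circle. The keypoint shape (`{x, y}` objects, as in ml5.js handPose output) and the specific angle choice are assumptions, not the project's confirmed formula:

```javascript
// Hypothetical hue heuristic: map the wrist-to-middle-fingertip
// angle onto the 0-360 hue circle. Keypoints are assumed to be
// {x, y} objects, as in ml5.js handPose output.
function hueFromHand(wrist, middleTip) {
  // atan2 gives the angle in radians, in (-PI, PI]
  const angle = Math.atan2(middleTip.y - wrist.y, middleTip.x - wrist.x);
  // Normalize to [0, 360) degrees for use as a hue value
  return (angle * 180 / Math.PI + 360) % 360;
}
```

Because this reads only two keypoints directly, it has no fallback when either one is lost, which is exactly the fragility discussed below.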

It runs in real time: drawing keypoints, predicting openness continuously, and updating the color block as you move your hand.
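To keep the color block stable while predictions arrive every frame, the displayed value can be eased toward each new prediction rather than jumped. A minimal sketch; the 0.2 easing factor and the brightness mapping are illustrative values, not the project's actual ones:

```javascript
// Exponential smoothing: move the displayed value a fraction of the
// way toward each new prediction, damping per-frame noise.
function smooth(current, predicted, easing = 0.2) {
  return current + (predicted - current) * easing;
}

// Map openness (0 = closed, 1 = open) to a 0-100 brightness value,
// clamped, matching the open -> bright, closed -> dark behavior.
function opennessToBrightness(openness) {
  return Math.min(100, Math.max(0, openness * 100));
}
```

In a p5.js `draw()` loop, `smooth` would be called once per frame with the latest model prediction before painting the panel.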

[demo video: ef303b000635ffce9d0fc2825fa043da.mp4]

At first, I wondered why I needed a model for openness at all, when I could just calculate distances between keypoints. But after testing, I realized that direct geometric formulas break easily, especially when part of the hand leaves the frame. The hue, which is calculated directly from a few keypoints, often stops updating once the wrist or fingertips disappear. In contrast, the trained model learns a kind of “pattern memory”: even when only part of the hand is visible, it can still infer how open the hand probably is. That continuity makes the interaction feel much smoother.
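The fragility of the direct approach is easy to see in code. Below is a sketch of the kind of geometric openness formula described above: average fingertip-to-wrist distance normalized by palm size. The keypoint indices follow the common 21-point hand layout (wrist = 0, middle-finger base = 9, fingertips at 4, 8, 12, 16, 20), which is an assumption about the tracker's output:

```javascript
// Fingertip indices in the common 21-point hand layout (an assumption)
const TIPS = [4, 8, 12, 16, 20];

function dist(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Direct geometric openness: average fingertip-to-wrist distance,
// normalized by palm size (wrist to middle-finger base).
function geometricOpenness(keypoints) {
  const wrist = keypoints[0];
  const palm = keypoints[9];
  // The weakness described above: if any required keypoint is
  // missing (e.g. out of frame), there is simply no value to report.
  if (!wrist || !palm || TIPS.some(i => !keypoints[i])) return null;
  const avgTip = TIPS.reduce((sum, i) => sum + dist(wrist, keypoints[i]), 0) / TIPS.length;
  return avgTip / dist(wrist, palm);
}
```

A trained regression model, by contrast, can still emit a plausible openness from the keypoints that remain visible, which is the continuity advantage noted above.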

However, during training I noticed the loss stayed quite high, even with a large number of samples. The predictions still followed the general trend (open → bright, closed → dark), but weren’t numerically precise. I’m not entirely sure why; the features may have been too noisy, or the labeling not perfectly consistent. Still, the model performs well enough for interaction and feels natural to use.
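One plausible source of that noise is that raw pixel coordinates change with hand position and distance from the camera, so the same gesture can produce very different feature vectors. A common remedy (an assumption about what might help here, not something the project confirms) is to normalize keypoints relative to the wrist and palm scale before training:

```javascript
// Normalize keypoints so features are translation- and scale-invariant:
// subtract the wrist position, then divide by palm size, so the same
// gesture yields similar features wherever the hand is in the frame.
// Indices assume the common 21-point layout: 0 = wrist, 9 = middle MCP.
function normalizeKeypoints(keypoints) {
  const wrist = keypoints[0];
  const palm = keypoints[9];
  const scale = Math.hypot(palm.x - wrist.x, palm.y - wrist.y) || 1;
  // Flatten to [x0, y0, x1, y1, ...] as a regression feature vector
  return keypoints.flatMap(p => [
    (p.x - wrist.x) / scale,
    (p.y - wrist.y) / scale,
  ]);
}
```

With features like these, two samples of the same gesture taken at different spots on screen map to nearly identical inputs, which should make the openness labels easier to fit consistently.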
