Realtime color tracking
People's imagination, inspired by science fiction, has historically led to all sorts of technological advancements.
The movie Minority Report (2002) amazed us all with its touch-free interaction.
At a time when OpenCV was in its early years, and before the Kinect (released in 2010), we challenged ourselves to see if we could create interfaces that one could use to click or drag items around a screen without touching anything, and to build them with web technology.
This started with a control mechanism that had multiple tracking points, one to recognize each finger, each driven by a control that sensed color in the camera image.
The image sampling control was a circle of color sampling points, with both inner and outer samples. In the image, the dot on the left is sampling for the yellow of the tape.
If the sensors on the right side of the circle no longer saw the sampled color, the control would push the circle left.
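The push behavior described above can be sketched as a small pure function. This is an illustrative reconstruction, not the original code: the function names, the 8-point ring, the RGB distance tolerance, and the gain value are all assumptions. The idea is that every ring sensor that no longer matches the target color contributes a push away from itself, so the circle drifts back over the colored blob.

```javascript
// Illustrative sketch of the color-sampling circle (names and
// parameters are assumptions, not the original implementation).

function colorDistance(a, b) {
  // Simple Euclidean distance between two [r, g, b] colors.
  const dr = a[0] - b[0], dg = a[1] - b[1], db = a[2] - b[2];
  return Math.sqrt(dr * dr + dg * dg + db * db);
}

function makeSampleRing(cx, cy, radius, count) {
  // Evenly spaced sampling points on a circle around (cx, cy).
  const points = [];
  for (let i = 0; i < count; i++) {
    const angle = (2 * Math.PI * i) / count;
    points.push({
      x: cx + radius * Math.cos(angle),
      y: cy + radius * Math.sin(angle),
    });
  }
  return points;
}

function trackStep(circle, targetColor, sampleColorAt, tolerance = 60, gain = 4) {
  // circle: {x, y, r}. sampleColorAt(x, y) returns the [r, g, b]
  // pixel color at that position (e.g. read from a canvas).
  // Each ring sensor that lost the target color pushes the circle
  // away from itself, i.e. toward where the color still is.
  const ring = makeSampleRing(circle.x, circle.y, circle.r, 8);
  let dx = 0, dy = 0;
  for (const p of ring) {
    if (colorDistance(sampleColorAt(p.x, p.y), targetColor) > tolerance) {
      dx -= (p.x - circle.x) / circle.r; // unit push away from sensor
      dy -= (p.y - circle.y) / circle.r;
    }
  }
  return { x: circle.x + gain * dx, y: circle.y + gain * dy, r: circle.r };
}
```

In a real page this would run once per video frame, with `sampleColorAt` reading pixels from a canvas the webcam frame was drawn into.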
By creating a group of these control points we could create a relationship between fingers and track distances.
We could then detect if they were close enough to each other within a threshold and use that to trigger a click or a start drag event (and later a stop drag event when the fingers were further apart).
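The click and drag logic above can be sketched as a tiny state machine over the distance between two tracked points. Again this is an assumption-laden sketch, not the original code: the function name and the two threshold values are invented. Using a separate, larger release threshold (hysteresis) keeps the gesture from flickering when the fingers hover near the boundary.

```javascript
// Illustrative pinch detector over two tracked finger points
// (names and threshold values are assumptions).

function makePinchDetector(startDist = 30, stopDist = 45) {
  let dragging = false;
  return function update(a, b) {
    // a and b are {x, y} centers of the two tracked color circles.
    const dist = Math.hypot(a.x - b.x, a.y - b.y);
    if (!dragging && dist < startDist) {
      dragging = true;
      return "dragstart"; // fingers pinched together
    }
    if (dragging && dist > stopDist) {
      dragging = false;
      return "dragstop"; // fingers moved apart again
    }
    return null; // no state change this frame
  };
}
```

A frame loop would call the returned `update` with the latest circle centers and dispatch the returned event name, treating a quick dragstart/dragstop pair as a click.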
Ultimately this exploration wasn't feasible for the wider community because of:
the dependency on a webcam, which, when available at all, was low resolution at the time
the need for high contrast within the space
the need for proper and consistent lighting
Legacy lab experimentation (older than 2020)
Lab descriptions and details are not currently available online; please reach out to discuss.