I'm trying to figure out the best way to track hands as though a person in
front of a kiosk were using a mouse (or two).

My strategy so far: when the Quartz composition launches, grab a reference
image from the web cam. Using a trick I saw in Photo Booth, you can then
compare the current frame of captured video against that background and take
the difference -- that gives you a quick mask of anything entering the scene.
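To make the background-subtraction idea concrete, here is the per-pixel arithmetic sketched in plain Python (grayscale frames as lists of rows; the threshold value is my own guess, not anything taken from Photo Booth):

```python
# Background subtraction: threshold the per-pixel absolute difference
# between a reference frame (grabbed at launch) and the current frame.
def difference_mask(background, current, threshold=30):
    """Return a binary mask: 1 where the scene changed, 0 where it matches."""
    return [
        [1 if abs(c - b) > threshold else 0 for b, c in zip(bg_row, cur_row)]
        for bg_row, cur_row in zip(background, current)
    ]

background = [[10, 10, 10],
              [10, 10, 10]]
current    = [[10, 200, 10],   # a bright "hand" pixel entered the frame
              [10, 10, 12]]    # sensor noise stays below the threshold

mask = difference_mask(background, current)
# mask == [[0, 1, 0], [0, 0, 0]]
```

The threshold is what keeps camera noise out of the mask; in a CIKernel you would do the same comparison per fragment.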

Next, I would increase contrast and saturation to exaggerate the
differences. By comparing a frame from a moment before with the current
frame, I could then detect motion.
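The temporal-difference step looks almost the same; here the gain factor is a crude stand-in (my own simplification) for the contrast/saturation boost described above:

```python
# Motion estimate: absolute difference between the previous and current
# frame, multiplied by a gain to exaggerate small changes, clamped to 0..255.
def motion_image(previous, current, gain=4):
    return [
        [min(255, abs(c - p) * gain) for p, c in zip(prev_row, cur_row)]
        for prev_row, cur_row in zip(previous, current)
    ]

prev_frame = [[100, 100], [100, 100]]
cur_frame  = [[100, 110], [160, 100]]
print(motion_image(prev_frame, cur_frame))
# [[0, 40], [240, 0]]
```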

The trick is to find the "most moved" area, and perhaps the smallest area --
assuming that a finger will be smaller than a head and will move more (not
always true -- but the user should get the hang of it in short order).
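One way to encode "smaller and more moved wins" (the scoring heuristic here is my own reading of the assumption above, not an established technique): label the connected blobs in the change mask, then score each blob by its average motion per pixel, which naturally favors a small fast finger over a large slow head or jacket.

```python
def find_finger_blob(mask, motion):
    """Return (average motion, pixel list) for the highest-scoring blob."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = None
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # Flood-fill one connected blob.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Average motion per pixel: small, fast blobs score highest.
                score = sum(motion[py][px] for py, px in pixels) / len(pixels)
                if best is None or score > best[0]:
                    best = (score, pixels)
    return best

mask   = [[1, 1, 0, 0],
          [1, 1, 0, 1],    # big slow blob on the left, lone fast pixel right
          [0, 0, 0, 0]]
motion = [[10, 10, 0, 0],
          [10, 10, 0, 200],
          [0, 0, 0, 0]]
score, pixels = find_finger_blob(mask, motion)
# score == 200.0, pixels == [(1, 3)]
```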

So I need to figure out not only the difference between the prior image and
the current image (what has moved) -- I also need to use false color to tell
that the tan blob over there (a hand) has moved more than the blue jacket.
I'm not completely sure how to do that. It overlaps some of the other
color-lookup discussions: you don't want to check the color of each strand
of hair, because lighting conditions can change quickly -- you want to
"blob" the median values of a region. AFTER you discover the motion of a
common region, you can track a highlight of the most-moved point (like a
fingertip). Luminosity would be valuable as input to the Outline/Sketch
effects to define regions, and again for highlighting that gives dimension.
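The "blob the median values of a region" idea is cheap to sketch: summarize a rectangle by its median rather than inspecting every pixel, since the median shrugs off lighting flicker and stray highlights.

```python
import statistics

def region_median(image, x0, y0, x1, y1):
    """Median pixel value over the rectangle [x0, x1) x [y0, y1)."""
    values = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    return statistics.median(values)

image = [[200, 210, 205],
         [198, 255, 202],   # one specular highlight in the middle
         [203, 201, 199]]
print(region_median(image, 0, 0, 3, 3))
# 202 -- the outlier barely shifts the summary
```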

I'm thinking out loud -- maybe someone has come up with a much easier
strategy for turning a camera into a mouse. I've seen these with video
displays at malls, and I'm pretty sure they use infrared on the camera to
ignore their own projection. I may need to get a real video camera and hack
it to capture only infrared -- I'd love advice on that.

I know there is a patch that finds the pixel in a row or column with the
greatest or least value. With so many new patches I've lost track of which
one -- but once I can somehow create an image of the "most moved" regions,
I can find an x and y position and use that as though it were a mouse.
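That last step is just an argmax over the image; here it is spelled out, with the result normalized to 0..1 so it is resolution-independent (the normalization is my own choice, matching how Quartz Composer tends to express coordinates):

```python
def brightest_point(image):
    """Return the (x, y) of the maximum pixel, normalized to 0..1."""
    h, w = len(image), len(image[0])
    best_x, best_y, best_v = 0, 0, image[0][0]
    for y in range(h):
        for x in range(w):
            if image[y][x] > best_v:
                best_x, best_y, best_v = x, y, image[y][x]
    return best_x / (w - 1), best_y / (h - 1)

motion = [[0,  0,   0],
          [0, 40,   0],
          [0,  0, 250]]   # most movement in the bottom-right corner
print(brightest_point(motion))
# (1.0, 1.0)
```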

I'm assuming CIKernel functions that do these operations would be faster --
so if anyone can point me to prior work or a way to do this, I'd love it.

 _______________________________________________
Quartzcomposer-dev mailing list      ([email protected])