On Tue, April 13, 2010 06:07, John wrote:
> An example of motion detection is in the following youtube clip.
>
> http://www.youtube.com/watch?v=3jYFJw1dT18
>

Wow! Cool clip!

The way I see this being done in LiVES is to implement data effects -
realtime effects in LiVES would gain the ability to produce data output
instead of changing the frame. So for example an edge-detect data effect
could produce an array of pixel values representing edges. You would then
chain this with another effect which takes an array as input.
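The idea above can be sketched as follows. This is a hypothetical illustration, not the actual LiVES plugin API: `edge_detect`, `highlight` and `chain` are made-up names, and the "frame" is reduced to a single row of grey values for brevity. The point is just that the first stage returns data (edge positions) rather than a modified frame, and the second stage consumes both the frame and that data.

```python
# Hypothetical sketch of a "data effect" chain (not the LiVES API).

def edge_detect(frame):
    """Data effect: return indices where the grey value jumps sharply,
    leaving the frame itself untouched."""
    return [i for i in range(1, len(frame))
            if abs(frame[i] - frame[i - 1]) > 50]

def highlight(frame, edges):
    """Frame effect taking the data array as extra input:
    paint the detected edge pixels white."""
    out = list(frame)
    for i in edges:
        out[i] = 255
    return out

def chain(frame, data_fx, render_fx):
    """Two-stage chain: the data effect's output feeds the render effect."""
    return render_fx(frame, data_fx(frame))

row = [10, 12, 11, 200, 201, 199, 15, 14]
print(chain(row, edge_detect, highlight))
```

A real implementation would of course work on full 2-D frames and pass the data through whatever channel the host defines, but the wiring is the same.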

OK so I have added a new feature request for this:
https://sourceforge.net/tracker/?func=detail&aid=2986359&group_id=64341&atid=507142


An effect chain enhancement request is here:
https://sourceforge.net/tracker/?func=detail&aid=2800758&group_id=64341&atid=507142
(the basic idea would be to create effect chains which could then be
assigned to an effect key/mode, or applied as a single effect in the clip
editor or multitrack mode).

So you could imagine an effect chain like:
motion track -> data_light

where motion track produces concentric rectangles and data_light takes
rectangle corners and creates a glow effect.
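To make the motion track -> data_light chain concrete, here is a rough sketch. The names `motion_track` and `data_light` follow the example above, but the data shapes are my assumptions: this version produces a single bounding rectangle of changed pixels (a real motion tracker could emit the concentric rectangles mentioned), and the "glow" is simplified to brightening the rectangle's corner pixels.

```python
# Rough sketch of a "motion track -> data_light" chain; names and
# data shapes are assumptions, not the LiVES API.

def motion_track(prev, cur):
    """Data effect: return the bounding rectangle (x0, y0, x1, y1) of
    pixels that changed between two greyscale frames, or None."""
    changed = [(x, y) for y, (pr, cr) in enumerate(zip(prev, cur))
               for x, (p, c) in enumerate(zip(pr, cr)) if p != c]
    if not changed:
        return None
    xs = [x for x, _ in changed]
    ys = [y for _, y in changed]
    return (min(xs), min(ys), max(xs), max(ys))

def data_light(frame, rect):
    """Render effect: take the rectangle corners and add a "glow"
    (here just brightening the corner pixels to full white)."""
    if rect is None:
        return frame
    out = [row[:] for row in frame]
    x0, y0, x1, y1 = rect
    for x, y in ((x0, y0), (x1, y0), (x0, y1), (x1, y1)):
        out[y][x] = 255
    return out

prev = [[0] * 4 for _ in range(4)]
cur = [row[:] for row in prev]
cur[1][1] = 80   # something moved here
cur[2][2] = 90   # ...and here
print(data_light(cur, motion_track(prev, cur)))
```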


Cheers,
Gabriel.



_______________________________________________
Lives-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/lives-users
