On Mon, Mar 14, 2016 at 8:36 AM, Sergey Sharybin <[email protected]> wrote:
> Hi,
>
> There is no difference between an image sequence and a video file from
> Blender's tracker point of view; it's all handled at a different level.
> The tracker always works with frames.
>
> Currently Blender doesn't deal with real-time video streams.
>
> On Sun, Mar 13, 2016 at 10:36 PM, Miroslav Krajíček <
> [email protected]> wrote:
>
> > Hi,
> >
> > my name is Miroslav Krajicek, I am currently studying computer graphics
> > at Masaryk University (Brno, Czech Republic).
> >
> > I am interested in motion capture and image processing. As a bachelor
> > thesis I created a motion capture tool for webcams, WebCamCap
> > <https://github.com/kaajo/WebCamCap/>. For lens distortion correction I
> > used OpenCV. Currently I am working on an autonomous car (part-time job)
> > where calibration is really important for camera SLAM.
> >
> > I compiled the source code and found where the lens settings are. I have
> > a question regarding the calibration process: do you think users would
> > like to calibrate the camera from a sequence of photos (or a recorded
> > video)? Until now I have been used to "live" video calibration, but not
> > all cameras used by users have that option.

Wasn't there a real-time video project a few years back? I think it had
to do with motion tracking in real time with camera inputs.

--
Douglas E Knapp, MSAOM, LAc.
_______________________________________________
Bf-committers mailing list
[email protected]
http://lists.blender.org/mailman/listinfo/bf-committers
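
For readers unfamiliar with the workflow Miroslav describes, the sketch
below shows roughly how calibration from a sequence of still frames
(rather than a live stream) is typically done with OpenCV's Python
bindings. It is only an illustration, not Blender or WebCamCap code; the
chessboard pattern size and the calib_frames/ directory are assumptions
made for the example.

import glob
import cv2
import numpy as np

# Inner-corner count of the chessboard pattern (cols, rows) -- assumed for this example.
PATTERN_SIZE = (9, 6)

# Object points: the pattern's corner coordinates in its own plane (z = 0).
objp = np.zeros((PATTERN_SIZE[0] * PATTERN_SIZE[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN_SIZE[0], 0:PATTERN_SIZE[1]].T.reshape(-1, 2)

obj_points = []   # 3D pattern points, one set per usable frame
img_points = []   # 2D corner detections, one set per usable frame
image_size = None

# The frames can come from photos or from a recorded video split into images beforehand.
for path in sorted(glob.glob("calib_frames/*.jpg")):  # hypothetical input directory
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, PATTERN_SIZE)
    if not found:
        continue
    # Refine corner locations to sub-pixel accuracy before calibration.
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
    obj_points.append(objp)
    img_points.append(corners)

# Solve for the intrinsic matrix and distortion coefficients from all detections.
rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("RMS reprojection error:", rms)
print("focal lengths:", camera_matrix[0, 0], camera_matrix[1, 1])
print("distortion (k1, k2, p1, p2, k3):", dist_coeffs.ravel())

Because the solver only needs a set of frames with detected pattern
corners, it makes no difference whether those frames come from separate
photos, a recorded video, or a live capture, which is in line with
Sergey's point that the tracker always works with frames.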
