Dear Carlos,
sorry for the late answer. Logging of the tracking data is currently not
implemented. We have already thought about attaching such a log to the
MediaPackage that the capture agent produces, preferably in the form of
an MPEG-7 document. Such a feature shouldn't be hard to implement.
Apart from that, shifting the whole analysis to post-processing offers
nice advantages. The first worth mentioning is that in post-processing
the tracking algorithm can look ahead in the video, so it doesn't need
to produce and refine hypotheses based on the live stream. Instead it
can simply grab the 'truth from the future'. In fact, cropping an HD
video in post-processing is a valuable idea that we have always wanted
to take up.
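To illustrate the point about looking ahead (this is just a sketch, not LectureSight code): an offline tracker can smooth a noisy per-frame position track with a window centered on each frame, i.e. it uses frames from the 'future', which a live tracker by definition cannot see.

```python
# Illustrative sketch (not LectureSight code): offline tracking can smooth a
# noisy position track with a centered window that also averages over frames
# *after* the current one -- impossible when refining hypotheses on a live
# stream, where only past frames are available.

def smooth_track(xs, radius=2):
    """Centered moving average over a list of per-frame x positions."""
    smoothed = []
    for i in range(len(xs)):
        lo = max(0, i - radius)
        hi = min(len(xs), i + radius + 1)  # frames ahead of i: the lookahead
        window = xs[lo:hi]
        smoothed.append(sum(window) / len(window))
    return smoothed
```

A single-frame detection spike is spread out and damped this way, because neighbouring (including future) frames vote it down.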
Cheers,
Benjamin
On 08/30/2012 10:33 PM, Carlos Turro Ribalta wrote:
Dear Olaf
What you describe, and the demo for LectureSight, is great. Of course this works
best when the classroom has a VISCA camera, but maybe it could also be used with
fixed cameras by doing a crop & zoom on a part of the frame. In fact I am thinking
that this could also be done after capturing, so different Virtual Director
strategies could be used. To do that it would be mandatory to have some kind
of log of the tracking session, and maybe this could be added to the media package
itself.
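The crop & zoom idea above could be sketched like this (a hypothetical illustration; the function name and parameters are mine, not part of any LectureSight or Matterhorn API): given a logged lecturer position per frame, derive a fixed-size crop window, clamped so it never leaves the full frame.

```python
# Hypothetical sketch of the crop & zoom idea: from a tracking log entry
# (lecturer center cx, cy), compute the top-left corner of a crop window of
# size crop_w x crop_h inside a frame_w x frame_h frame, clamped to the frame.

def crop_window(cx, cy, crop_w, crop_h, frame_w, frame_h):
    """Top-left corner of a crop_w x crop_h window centered on (cx, cy)."""
    x = min(max(cx - crop_w // 2, 0), frame_w - crop_w)
    y = min(max(cy - crop_h // 2, 0), frame_h - crop_h)
    return x, y
```

Applied per frame to an HD recording, this yields a virtual pan/tilt from a fixed camera, and different Virtual Director strategies would simply produce different sequences of windows.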
Is any log currently being recorded?
Regards
Carlos Turro
Head of Media Services
Universitat Politècnica de Valencia
-----Original Message-----
From: [email protected]
[mailto:[email protected]] On Behalf Of Benjamin Wulff
Sent: Thursday, August 30, 2012 9:42 PM
To: [email protected]
Subject: Re: [Opencast] Automated camera tracking to be integrated with Opencast
Matterhorn
Dear Jack,
what we have put together in the last two months is a prototype technology
stack for an automated camera control system in a lecture room. For this sprint
we have a fairly narrow (though still challenging) set of aims we want to
achieve, namely to provide stable tracking of the lecturer throughout the whole
presentation. This is what we will mainly test and tune the system against in
the next semester.
However, there is no doubt that more complex camera operating strategies will
be needed in practice. From the beginning, LectureSight was developed with
this in mind. The part of the system that controls the PTZ camera was split
into two separate modules for this reason: the first module, the so-called
'camera steering worker', takes care of moving the camera, while the second
module decides on the targets for the camera movement based on data provided
by the scene analysis. The actual strategy this 'virtual director' uses is
defined via a JavaScript API. The idea is to provide maximum flexibility when
it comes to customizing LectureSight for an individual classroom.
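The division of labour described above can be sketched roughly as follows. This is only an illustration of the idea (the real virtual director is scripted through LectureSight's JavaScript API, which is not shown in this thread; the class name, dead-zone heuristic, and method are my assumptions): the director decides *where* the camera should point, while a separate steering worker handles *how* to move it there.

```python
# Illustrative only -- not the LectureSight JavaScript API. A minimal
# 'virtual director' that picks a camera target from scene-analysis
# positions, ignoring small jitter via a dead zone.

class VirtualDirector:
    """Decides camera targets; a separate steering worker would move the camera."""

    def __init__(self, dead_zone=50):
        self.dead_zone = dead_zone  # pixels of movement to ignore
        self.target = None

    def update(self, lecturer_x):
        # Only issue a new target when the lecturer leaves the dead zone,
        # so the camera is not constantly chasing detection noise.
        if self.target is None or abs(lecturer_x - self.target) > self.dead_zone:
            self.target = lecturer_x
        return self.target
```

Because the strategy lives in its own module, swapping in a different policy (e.g. one that holds the shot on the blackboard) would not touch the steering code.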
Looking at the scenario at TAU you described, I can tell you that here in
Osnabrueck we will test the system next semester in a course in which the
lecturer makes heavy use of the blackboards. So, beyond optimizing the
tracking, we will have the chance to play with different camera operating
strategies in a scenario similar to yours.
Cheers,
Benjamin
On 08/30/2012 10:41 AM, Jack Barokas wrote:
Dear Olaf,
This project is very interesting for us, since most of the lecture recordings at
TAU are made by students sitting behind the camera and tracking the lecturer.
Though the staff cost is not very high here, we have quite a problem finding
students who will keep working for a whole semester :(.
The most difficult problem in a lecturer tracking system, in my opinion, is not
tracking the lecturer or distinguishing him from others crossing the camera
frame, but making the intelligent decision whether to keep tracking the
lecturer while he moves away after writing something on the board, or to stop
tracking and stay on the content the lecturer previously wrote on the
blackboard.
I would like to know if there is a strategy for dealing with this problem in
Benjamin Wulff's work?
Anyway, the demo clip is very impressive and I am sure it is a big step towards
the automation of lecture recordings.
Cheers to Benjamin Wulff and others working with him on this project.
Best
Jack
-----Original Message-----
From: [email protected]
[mailto:[email protected]] On Behalf Of Schulte Olaf A.
Sent: Thursday, August 30, 2012 11:06 AM
To: Opencast Matterhorn; Opencast Community; Opencast Matterhorn
Subject: [Opencast] Automated camera tracking to be integrated with Opencast
Matterhorn
Dear All
ELAN e.V., together with Osnabrück University and ETH Zurich, would like to
announce that they are working towards "LectureSight", an automated camera
tracking system fully integrated with Opencast Matterhorn.
The module will provide a solution to automatically track speakers in a
classroom, thus providing a more convenient shot of the instructor than a
fixed overview camera. The goal is to reach a subjective quality close to
that of a manned camera in a standard setting, thus reducing the staffing
cost of lecture recording significantly.
Technically, the software consists of a set of OSGi bundles that can be added
to an existing Matterhorn system or run as a stand-alone application. The
video analysis portions of the system utilize the GPU through the OpenCL
standard. The software breaks down into six major components:
1. a scene analysis that discovers and tracks the positions of moving objects in the scene
2. a plug-in mechanism for object analysis modules
3. a facility for communicating with PTZ cameras (currently supports the Sony VISCA protocol)
4. a module that controls the movement of the PTZ camera
5. a virtual camera operator that executes camera control strategies defined in JavaScript
6. a GUI for tuning the system for individual classrooms
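How these components could fit together per frame might be sketched as follows (a minimal assumption-laden sketch, not Matterhorn or LectureSight code; the function names are placeholders): scene analysis produces object positions, the virtual director chooses a target, and the steering module turns that target into a camera command.

```python
# Minimal sketch of the component wiring described above (placeholder names,
# not real LectureSight APIs): analysis -> virtual director -> camera steering.

def run_pipeline(frames, analyze, direct, steer):
    """Feed each frame through the three pluggable stages, collecting commands."""
    commands = []
    for frame in frames:
        positions = analyze(frame)      # component 1: scene analysis
        target = direct(positions)      # component 5: virtual director
        commands.append(steer(target))  # components 3+4: PTZ control
    return commands
```

The plug-in boundaries between the stages are what allow, for example, a different director strategy or camera protocol to be swapped in without changing the rest.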
The project is based on a bachelor thesis by Benjamin Wulff (described at
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=6123405&abstractAccess=no&userType=inst
under the original project name "OpenTrack"); Benjamin is leading this effort, with
contributions from Alexander Fecke. LectureSight will be piloted in situ at UOS and ETH in the
fall semester of 2012. Expect to be informed about first results by the end of the year.
For a sneak preview, take a look at the current development snapshot
(http://video2.virtuos.uni-osnabrueck.de:8080/engage/ui/watch.html?id=6dba66d9-0a53-45ed-bc67-7ab3bf50b31c)
or go to the project website http://lecturesight.org/.
Feedback and questions are welcome.
Regards
Rüdiger, Olaf A.
_______________________________________________
Community mailing list
[email protected]
http://lists.opencastproject.org/mailman/listinfo/community
To unsubscribe please email
[email protected]
_______________________________________________