Hi Frans,
sorry it took us so long to reply; for several reasons we were not able
to respond to your mail sooner.
We here in Osnabrück have been working on a lecturer tracking system
for a while too. Unfortunately it is not a high-priority project, so
progress has not been as fast as we hoped.
As we had a deadline for the Sakai conference in L.A., we decided that
we needed some visual results of our work so far. So here is a small
demo video [1]. For a better comparison, we used the same video that
you used in your examples and that was in your repository. I hope that
is okay?
We currently use the working title OpenTrack for our project [2]. It is
based on the same technology as Matterhorn (Java, OSGi) and is intended
to run on the same box as the Matterhorn capture agent. To keep the
budget for the capture agent moderate, we decided to use OpenCL for the
video analysis, so that it can be processed on the graphics card.
For OpenTrack you need two cameras: a cheap webcam and a Sony
VISCA-compatible pan-tilt camera. They should be mounted close to each
other.
The OpenTrack software can be separated into three parts:
1. Video Analysis: we pre-process the image, look for movement in the
image, and build a background model so that we know which parts of the
image are not important. As a result we get coordinates for the
detected objects (hopefully people). We are currently trying to give
the boxes an identity, so that we can really track an object when it
moves.
2. Virtual Director: this component is responsible for mapping the
analysis results to camera movement; it decides what the pan-tilt
camera will show. There is a prototype from a bachelor thesis at ETH,
but this component is still at a very early stage.
3. Camera Control: this already finished component controls the VISCA
cameras connected to the computer. We are currently implementing a web
interface that will also allow manual camera control.
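To illustrate the background-model idea from part 1: a minimal sketch in
plain Java, assuming grayscale frames as flat double arrays. The real
OpenTrack analysis runs on the GPU via OpenCL, and all class and
parameter names here are invented for illustration.

```java
// Hypothetical sketch of a running-average background model -- not the
// actual OpenTrack code (which does this on the graphics card via OpenCL).
class BackgroundModel {
    private final double[] background; // running average per pixel
    private final double alpha;        // learning rate of the model
    private final double threshold;    // min difference to count as foreground

    BackgroundModel(int pixels, double alpha, double threshold) {
        this.background = new double[pixels];
        this.alpha = alpha;
        this.threshold = threshold;
    }

    /** Updates the model with a grayscale frame and returns a foreground mask. */
    boolean[] update(double[] frame) {
        boolean[] foreground = new boolean[frame.length];
        for (int i = 0; i < frame.length; i++) {
            // a pixel that differs strongly from the background is "important"
            foreground[i] = Math.abs(frame[i] - background[i]) > threshold;
            // blend the new frame into the background (running average)
            background[i] = (1 - alpha) * background[i] + alpha * frame[i];
        }
        return foreground;
    }
}
```

Connected foreground regions would then be grouped into the bounding
boxes mentioned above.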
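Part 2 can be illustrated with a toy rule: pan towards the detected box
until it sits near the image centre. This is only a sketch of the
mapping idea, not the ETH prototype; all names and thresholds are made
up.

```java
// Hypothetical minimal "virtual director": keep the tracked person near
// the centre of the frame, with a dead zone to avoid constant jitter.
class SimpleDirector {
    enum Pan { LEFT, RIGHT, STOP }

    private final int imageWidth;
    private final int deadZone; // pixels around the centre where we do not move

    SimpleDirector(int imageWidth, int deadZone) {
        this.imageWidth = imageWidth;
        this.deadZone = deadZone;
    }

    /** Decides a pan direction from the centre x of the detected bounding box. */
    Pan decide(int boxCenterX) {
        int offset = boxCenterX - imageWidth / 2;
        if (Math.abs(offset) <= deadZone) return Pan.STOP;
        return offset < 0 ? Pan.LEFT : Pan.RIGHT;
    }
}
```

A real director would also handle shot selection, zoom and smoothing,
but the core mapping from analysis coordinates to camera commands looks
like this.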
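For part 3, Sony's VISCA protocol defines the Pan-tiltDrive command as
the byte sequence 8x 01 06 02 VV WW PP TT FF. A sketch of how such a
packet could be assembled follows; the byte values should be checked
against your camera's manual, and this is not OpenTrack's actual code.

```java
// Sketch of building a VISCA Pan-tiltDrive packet (8x 01 06 02 VV WW PP TT FF).
// Illustration only -- verify speeds and direction codes in the camera manual.
class Visca {
    static final byte PAN_LEFT = 0x01, PAN_RIGHT = 0x02, PAN_STOP = 0x03;
    static final byte TILT_UP = 0x01, TILT_DOWN = 0x02, TILT_STOP = 0x03;

    /** Builds the 9-byte Pan-tiltDrive command for the given camera address. */
    static byte[] panTiltDrive(int address, int panSpeed, int tiltSpeed,
                               byte panDir, byte tiltDir) {
        return new byte[] {
            (byte) (0x80 | address), // header: controller to camera `address`
            0x01, 0x06, 0x02,        // command category: Pan-tiltDrive
            (byte) panSpeed,         // pan speed
            (byte) tiltSpeed,        // tilt speed
            panDir, tiltDir,         // direction bytes (left/right, up/down, stop)
            (byte) 0xFF              // packet terminator
        };
    }
}
```

The resulting bytes would then be written to the serial port the camera
is connected to.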
All of these parts are separate OSGi modules that can be replaced by
other implementations. So if you want to use different cameras, you can
create your own module and still use the other two components. Or you
may want to implement your own virtual director.
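As a sketch of what such a replaceable module boundary could look like
(the actual OpenTrack interfaces may differ; these names are invented),
any bundle that registers its own implementation of a shared service
interface could stand in for the VISCA module:

```java
// Hypothetical service interface a camera-control bundle would implement
// and register as an OSGi service; other modules depend only on this.
interface CameraControl {
    void moveTo(double pan, double tilt); // target position, e.g. in degrees
    void stop();                          // halt any ongoing movement
}

// A replacement module just provides its own implementation, e.g. this
// dummy that only records the last commanded position.
class LoggingCameraControl implements CameraControl {
    double lastPan, lastTilt;

    @Override public void moveTo(double pan, double tilt) {
        lastPan = pan;
        lastTilt = tilt;
    }
    @Override public void stop() { /* nothing to stop in this dummy */ }
}
```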
We will release OpenTrack under an open-source licence, but we haven't
decided exactly which one yet. I guess it will be the GPL or LGPL. We
would be happy if there were ways to cooperate with you, and I hope
that this will not fail again because of the wrong software license.
Rüdiger
[1]
http://video.virtuos.uni-osnabrueck.de:8080/engage/ui/watch.html?id=d0ac61d7-095c-4606-bd76-0eed5a9ffdd7&quality=high
[2] http://opencast.jira.com/browse/OPENTRACK
On 03.05.2011 16:47, Frans Ward wrote:
Dear Opencast Community,
SURFnet is working on the project 'Lecturerecorder', which includes a
technology exploration of video tracking systems.
The goal of this project is to find ways to aid the automatic capture
of lectures by automatically operating the video editing software or
the camera. As a proof of concept, we will write pieces of software for
the three major stages. In this proof of concept we will limit
ourselves to bare-bones functionality and will only implement the
scenario where we operate on pre-recorded video. The actions performed
on the video can later be converted to camera control relatively
easily.
More info, the code and some demo videos can be found on the project
page. A good start would be:
http://code.google.com/p/lecturerecorder/wiki/Introduction
This is still work in progress, but I'm curious if there are others in
the community working on similar projects.
Cheers, Frans
--
________________________________________________
Rüdiger Rolf, M.A.
Universität Osnabrück - Zentrum virtUOS
Heger-Tor-Wall 12, 49069 Osnabrück
Telefon: (0541) 969-6511 - Fax: (0541) 969-16511
E-Mail: [email protected]
Internet: www.virtuos.uni-osnabrueck.de