Re: [Linux-uvc-devel] Multiple camera framerate.

2010-05-07 Thread Dennis Muhlestein

On 05/07/2010 09:10 AM, Laurent Pinchart wrote:



I'm attaching the patch in case anyone wants to play with this or
suggest ways to improve it.


Interesting approach, but definitely a hack. I'm not sure it has much chance
of making it into the driver.


Most definitely a hack.  I just included it because someone had 
expressed interest in the solution.  If there were a faster way to do 
it, it might be worth pursuing, but I can't think of a good way to 
synchronize the frames so that we don't lose a large part of each 
frame after resuming URB submission (like you mentioned in your other 
response) and waste a lot of transfer bandwidth.





One thing I had a problem with is that some of the captured images
seem to be corrupted.  This patch was against 2.6.27.2.  A little old,
I know, but I'm working on an embedded device.

2) I modified ehci-sched.c to not raise the ENOSPC error.  That solution
actually lets me capture on all 8 cameras at 15 FPS.  It has other
implications though, and I'm not sure it is a very good solution.  It
does tell me that the Linux isochronous scheduler could perhaps use a
closer look.  One major discrepancy is that bandwidth is allocated based
on the max packet size, I think, but for my compressed (MJPG) images
each frame is only a fraction of the maximum allocated bandwidth.


That might be the real culprit. Maybe the camera is simply requesting too much
bandwidth compared to its real requirements.

Can you check whether the video streaming interface has multiple alternate
settings? If so, which one does the uvcvideo driver choose? You can get the
information from the kernel log if you set the UVC_TRACE_VIDEO flag.
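For example, assuming the driver is built as a module, something like
"rmmod uvcvideo; modprobe uvcvideo trace=0x400" should do it, or you can
write the value to /sys/module/uvcvideo/parameters/trace at runtime.
0x400 is UVC_TRACE_VIDEO ((1 << 10) in the uvcvideo.h I have here; check
the value in your tree).  The selected alternate setting is then printed
to the kernel log when streaming starts.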



The camera is definitely requesting more bandwidth than it needs.  The 
size of each image does vary when I'm streaming MJPG; it is usually 
somewhere in the range of 110-150K at maximum resolution (1280x1024 
for these cameras).

Perhaps this is the problem with these cameras.  Here is the relevant 
field from the uncompressed video format descriptor for the 1280x1024 
YUYV data:

dwMaxVideoFrameBufferSize 2621440

And here is the same field returned by the MJPG descriptor for the 
same image size:

dwMaxVideoFrameBufferSize 2621440
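
For what it's worth, 2621440 is exactly 1280 x 1024 x 2, the worst-case 
size of an uncompressed YUYV frame, so the MJPG descriptor advertises 
the same worst case as the uncompressed format even though my actual 
MJPG frames are 110-150K, only about 5% of that.  If the negotiated 
payload bandwidth is derived from that worst case, the over-allocation 
follows directly: 2621440 bytes x 15 FPS is roughly 37.5 MB/s reserved, 
versus roughly 150K x 15 = 2.2 MB/s of real data.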

Logitech C500 by the way.


If the camera requests an alternate setting that isn't the smallest, you
might try experimenting with hardcoding a lower bandwidth in uvc_video_init.
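The kind of change I mean, as a rough sketch only
(dwMaxPayloadTransferSize is the field in struct uvc_streaming_control
from uvcvideo.h; 1024 bytes per microframe is an arbitrary test value,
not a recommendation):

#include <linux/types.h>
#include "uvcvideo.h"	/* struct uvc_streaming_control */

/* Sketch: cap the payload size the camera negotiated so that
 * uvc_init_video() later selects a smaller alternate setting.
 * Call this from uvc_video_init() after the probe/commit
 * negotiation. */
static void uvc_cap_bandwidth(struct uvc_streaming_control *ctrl,
			      u32 max_payload)
{
	if (ctrl->dwMaxPayloadTransferSize > max_payload)
		ctrl->dwMaxPayloadTransferSize = max_payload;
}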



I think I actually tried that a while back, when I first started 
playing with this issue.  If I recall correctly, selecting a 
lower-bandwidth alternate setting caused a mismatch between the 
information in each packet and the placement of the frame headers in 
the UVC protocol; the result was that a buffer never got dequeued. 
I'm not sure that isn't worth investigating further though.  I may 
have done something wrong.


Thanks for your analysis of this.  It helps to have more than just my 
eyes looking over the issue.


In the meantime, I'm having quite good luck with my EHCI hack.  I just 
let all the ISO frames through, and I get all the images from all 8 
cameras back at 15 FPS.  I may come back to this problem before I'm 
all the way finished and see if there is a way to button this up more 
cleanly, but for now the solution is working for me.


-Dennis


___
Linux-uvc-devel mailing list
Linux-uvc-devel@lists.berlios.de
https://lists.berlios.de/mailman/listinfo/linux-uvc-devel


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-05-04 Thread Dennis Muhlestein

On 04/29/2010 12:40 PM, Dennis Muhlestein wrote:


One idea I've been toying with is to add a semaphore around submitting
the URBs.

In uvc_video.c, where the URBs are submitted, I'd acquire a semaphore
for each device currently submitting URBs. The semaphore would limit the
number of devices to whatever number I decide can safely submit URBs
simultaneously on the bus without throwing out-of-space errors.

I did an initial test of this and it looks promising. I can configure
all the cameras. As long as I don't submit URBs for more devices than
can safely stream at the same time, the other cameras simply drop
their data.

I'm not sure what the best places are to lock and unlock the
semaphore. Right now, I lock it before submitting URBs in
uvc_init_video. In uvc_video_complete, I unlock it and relock it if the
buffer is complete (allowing another camera to acquire it and capture a
frame).

Anyway, it isn't working perfectly yet, but I think I can debug it and
at least get to a point where I know whether it's worth pursuing. I'm
curious whether anyone can provide thoughts or alternatives.



I have two solutions that I've come up with so far.
1) Modify uvc_video.c to queue URBs in a list in the URB completion 
handler.  A driver-level semaphore controls the number of cameras 
currently submitting.  You can adjust the initial sem_count in 
uvc_driver.c.  Ideally that would be a module parameter, but I'm just 
getting things to work.  The gating logic is sketched below.
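
Boiled down, the gate amounts to something like this.  It is a 
simplification, not the attached patch itself: the patch keeps the 
count and a list of deferred URBs in uvc_driver and requeues the 
parked URBs on release, which the sketch leaves out.

#include <linux/spinlock.h>

/* Driver-wide gate: how many cameras may have URBs in flight.  A
 * spinlock rather than a mutex, since the release side runs from the
 * URB completion handler (interrupt context). */
static DEFINE_SPINLOCK(urb_gate_lock);
static int urb_gate_count = 3;	/* cameras allowed to stream at once */

/* Returns nonzero if this camera may submit its URBs now. */
static int urb_gate_try_acquire(void)
{
	unsigned long flags;
	int granted = 0;

	spin_lock_irqsave(&urb_gate_lock, flags);
	if (urb_gate_count > 0) {
		urb_gate_count--;
		granted = 1;
	}
	spin_unlock_irqrestore(&urb_gate_lock, flags);
	return granted;
}

/* Called from the completion path once a camera finishes a frame, so
 * the next waiting camera can start submitting. */
static void urb_gate_release(void)
{
	unsigned long flags;

	spin_lock_irqsave(&urb_gate_lock, flags);
	urb_gate_count++;
	spin_unlock_irqrestore(&urb_gate_lock, flags);
}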


I found capture speeds quite a bit slower than I'd like with this 
method, though.  I can capture with 8 cameras at 10 FPS without 
overwhelming the ISO bus, but if I change to 15 FPS I can only capture 
with 3 cameras at a time.  With the patch, I can configure and capture 
from all 8 cameras running at 15 FPS while only submitting URBs for 3 
at a time.  Depending on how many frames I let those cameras capture, 
I got effective frame rates of around 4 to 6 FPS.


I'm attaching the patch in case anyone wants to play with this or 
suggest ways to improve it.  One thing I had a problem with is that 
some of the captured images seem to be corrupted.  This patch was 
against 2.6.27.2.  A little old, I know, but I'm working on an 
embedded device.


2) I modified ehci-sched.c to not raise the ENOSPC error.  That 
solution actually lets me capture on all 8 cameras at 15 FPS.  It has 
other implications though, and I'm not sure it is a very good 
solution.  It does tell me that the Linux isochronous scheduler could 
perhaps use a closer look.  One major discrepancy is that bandwidth is 
allocated based on the max packet size, I think, but for my compressed 
(MJPG) images each frame is only a fraction of the maximum allocated 
bandwidth.
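
For reference, the check that ends up producing the ENOSPC appears to 
be the 80% periodic-bandwidth rule in ehci-sched.c.  From memory it 
looks roughly like this in the 2.6.27-era sources (verify against your 
tree); note that usecs is computed from the endpoint's max packet 
size, which is exactly why a camera that never fills its packets still 
reserves the worst case:

/* Roughly the 2.6.27-era ehci-sched.c check, quoted from memory. */
static inline int itd_slot_ok(struct ehci_hcd *ehci, u32 mod, u32 uframe,
			      u8 usecs, u32 period)
{
	uframe %= period;
	do {
		/* can't commit more than 80% periodic == 100 usec */
		if (periodic_usecs(ehci, uframe >> 3, uframe & 0x7)
				> (100 - usecs))
			return 0;

		/* we know urb->interval is 2^N uframes */
		uframe += period;
	} while (uframe < mod);
	return 1;
}

When no microframe passes this test, iso_stream_schedule() fails the 
submission with -ENOSPC.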


-Dennis

diff --git a/uvc_driver.c b/uvc_driver.c
index 7e10203..6445ec1 100644
--- a/uvc_driver.c
+++ b/uvc_driver.c
@@ -1963,8 +1963,12 @@ static int __init uvc_init(void)
 
 	INIT_LIST_HEAD(&uvc_driver.devices);
 	INIT_LIST_HEAD(&uvc_driver.controls);
+	INIT_LIST_HEAD(&uvc_driver.wait_urbs);
 	mutex_init(&uvc_driver.open_mutex);
 	mutex_init(&uvc_driver.ctrl_mutex);
+	//mutex_init(&uvc_driver.urb_mutex);
+	uvc_driver.urb_lock = SPIN_LOCK_UNLOCKED;
+	uvc_driver.sem_count = 3;
 
 	uvc_ctrl_init();
 
diff --git a/uvc_v4l2.c b/uvc_v4l2.c
index b27cf5c..6376759 100644
--- a/uvc_v4l2.c
+++ b/uvc_v4l2.c
@@ -427,6 +427,10 @@ static int uvc_v4l2_open(struct inode *inode, struct file *file)
 		goto done;
 	}
 
+	video->has_sem = 0;
+	video->sent_urbs = 0;
+	video->release_on_no_urbs = 0;
+	video->sem_frame = 0;
 	handle->device = video;
 	handle->state = UVC_HANDLE_PASSIVE;
 	file->private_data = handle;
diff --git a/uvc_video.c b/uvc_video.c
index 6854ac7..f1b9673 100644
--- a/uvc_video.c
+++ b/uvc_video.c
@@ -26,6 +26,9 @@
 
 #include "uvcvideo.h"
 
+// predef
+static void uvc_video_frame_complete(struct uvc_video_device *);
+
 /* ------------------------------------------------------------------------
  * UVC Controls
  */
@@ -415,6 +418,7 @@ static void uvc_video_decode_end(struct uvc_video_device *video,
 		buf->state = UVC_BUF_STATE_DONE;
 		if (video->dev->quirks & UVC_QUIRK_STREAM_NO_FID)
 			video->last_fid ^= UVC_STREAM_FID;
+		uvc_video_frame_complete(video);
 	}
 }
 
@@ -524,13 +528,161 @@ static void uvc_video_decode_bulk(struct urb *urb,
 	}
 }
 
+
+/**
+ * Increments the resource count, allowing other video devices to
+ * submit URBs, if the specified video device currently has a
+ * resource locked.
+ **/
+static void uvc_video_frame_complete(struct uvc_video_device *video)
+{
+	// mutex_lock(&uvc_driver.urb_mutex);
+	unsigned long flags;
+	spin_lock_irqsave(&uvc_driver.urb_lock, flags);
+	//uvc_printk(KERN_WARNING, "Frame Complete: %s\n", video->dev->udev->serial);
+	video->sem_frame += 1;
+	if (video->has_sem && video->sem_frame > 2) {
+		if (!video->sent_urbs) {
+			video->has_sem = 0

Re: [Linux-uvc-devel] Multiple camera framerate.

2010-04-27 Thread Dennis Muhlestein

On 04/27/2010 04:52 AM, Paulo Assis wrote:

Dennis,


Can anyone suggest a way to configure the cameras at a higher framerate
without overloading the USB bus?  Suppose I can read at 10 fps right now
without overloading the USB bus.  I'd like to set the framerate to 15, but
still just capture around 10.


You can grab frames at whatever speed you want, but it's the camera
framerate that makes the difference in USB bandwidth, not the number
of frames you actually fetch in your application.


This is correct.  No matter what speed I dequeue buffers at, if the 
cameras are configured to capture at 15 fps I can't start more than 
about 3 at a time.  The rest fail with "No space left on device", 
which is passed up from the USB layer when the UVC driver tries to 
submit the URBs.  (Note that the error happens when the URBs are 
submitted, not when the interface is claimed, which tells me that 
perhaps there is a way to modify something in the kernel to not submit 
as many URBs but still capture the images from all the cameras, since 
it's at the point the interface is claimed that the camera is ready to 
send images at 15 fps.)



Is the 10 fps limit achieved with compressed (MJPG) or uncompressed
(YUV) frames?


MJPG.  I can read a lot more frames/sec with MJPG than with YUYV.


Using compressed frames should allow for much higher framerates when
using multiple cameras. Resolution will also have an impact on the
bandwidth.


True again.  I can read up to 20 fps with 8 cameras in VGA mode.  I'm 
interested in capturing at as high a resolution as possible.  Above 
960x720, I can only do 10 fps before hitting the bandwidth limit.
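
The raw numbers make that ceiling plausible: high-speed USB moves at 
most 7500 bytes per 125 us microframe, and the periodic schedule may 
only claim 80% of that, about 6000 bytes.  If each camera's alternate 
setting reserves on the order of 2 KB per microframe at these 
resolutions (an illustrative figure, not one I've measured on these 
cameras), three streams already saturate the budget, which matches 
what I'm seeing.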



Another alternative is to add another USB controller to your system (a
USB PCI card if you are using a standard PC).
As an example, using MJPG I have no problem using 3 cameras
simultaneously (800x...@20 fps).



It's actually an embedded device.  I am using an external hub to 
connect the cameras, but there is only one USB bus on the base unit. 
I'm not sure exactly where the "No space left on device" error 
originates.  Does anyone know if this is an error propagated back from 
the physical USB hub, or if somewhere in the kernel there is code 
attempting to follow the USB spec and not allocate more than a certain 
percentage of the bandwidth to ISO endpoints?


-Dennis


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-04-27 Thread Dennis Muhlestein

On 04/27/2010 04:08 AM, Dr. Alexander K. Seewald wrote:

Hi Dennis,

I don't think it is possible to decouple the recording frame rate
from the output framerate, at least not with the UVC firmware out
there. But you could try to switch off auto exposure and set it
manually if your cameras expose this control. A smaller exposure
time should IMHO also solve your problem at the cost of getting
slightly darker images (you can fix this by increasing lighting ;-)



Exposure would help with motion blur, but the shearing problem is 
specifically due to the framerate.  We're actually not having a problem 
with motion blur when just leaving the cameras in their auto exposure mode.


Thanks
Dennis