Re: [Linux-uvc-devel] Multiple camera framerate.

2010-05-07 Thread Laurent Pinchart
Hi Dennis,

On Tuesday 27 April 2010 18:19:01 Dennis Muhlestein wrote:
 On 04/27/2010 05:48 AM, Paulo Assis wrote:
  2010/4/27 Ian Latter ian.lat...@midnightcode.org:
  You can grab frames at whatever speed you want, but it's the camera
  framerate that will make a difference in the usb bandwidth and not the
  amount of frames you get with your application.
  
  So, if you don't take a frame from the UVC driver, it will simply
  continue to refresh an internal buffer with new frames from the camera
  device?
  
  Think of it like this: the necessary bandwidth is requested by the device
  and it will depend on the compression, resolution and framerate. In fact I
  think some (buggy) devices will always request the maximum bandwidth,
  causing problems on initialization; for these devices I think uvcvideo
  uses its own algorithm to calculate the necessary bandwidth.

That's right. The uvcvideo driver tries to compute an estimate of the required 
bandwidth. That only works for uncompressed formats.

 Does the number of URBs have any effect at all?  I made an initial attempt
 to check this a while back by recompiling the driver with UVC_URBS as 1
 or 2 instead of 5.

Not on the bandwidth, no. The number of URBs will influence the memory used by 
the driver, and the real-time requirements. The fewer URBs there are, the 
faster the driver needs to process completed URBs and resubmit them to avoid 
running out of them.

 I made MAX_ISO_PACKETS a lot smaller too.

The number of packets influences the URB size for isochronous transfers. URBs 
complete when the maximum number of packets has been transmitted. The larger 
the number, the less frequently the driver gets interrupted to process URBs 
(but it will also have to process more packets in one go, making the 
processing take a bit longer).

 I wanted to see if perhaps submitting fewer URBs would somehow lessen the
 bandwidth requirements.  It didn't fix the problem though.

The driver explicitly selects the bandwidth by switching to an alternate 
setting. The USB core allocates the bandwidth when the URBs are submitted, so 
the alternate setting selection call might succeed and the URB submission fail 
later. Please note that, with USB 3.0, the xHCI driver will perform the 
allocation when selecting the alternate setting.

The driver chooses the alternate setting with the lowest bandwidth compatible 
with the bandwidth requirements reported by the camera. The number of URBs or 
packets has no influence there.

 I suspect that somewhere in an underlying layer, whether kernel or physical,
 something is checking that bandwidth requirement for the configured
 endpoints and then denying new URBs being submitted.  I'm not opposed to
 modifying something at a lower level but I haven't found the spot at this
 point. I guess if the error comes from the physical hub there isn't much to
 be done then.

The USB core checks the bandwidth. There's an allocation process defined by 
the USB standard, and there's pretty much no way around that. You could modify 
the EHCI host controller driver to allow more than 80% of the total bandwidth 
to be allocated to periodic transfers. Another solution, as you pointed out 
below, is not to submit the URBs. That's very similar to performing the 
STREAMON/QBUF/DQBUF/STREAMOFF sequences in userspace on all devices one after 
the other, except that you might be able to get slightly better performance 
by avoiding context switches.

  Also the device will always try to dispatch frames at the requested
  framerate, if you don't grab them they will simply be dropped by the
  driver.
 
 I wonder if there is some way to restrict URB submitting to around 10
 fps? Perhaps a semaphore on the number of cameras that can be submitting
 URBs at all.  If I go ahead and configure all the cameras to run at 15 fps
 but only submit URBs for say 3 of the cameras at a time it seems it would
 work. I'm not worried about dropping 1/3 of the frames.

That might work, but it would be quite hackish. You wouldn't be able to 
synchronize to the frame start, so you will lose half a frame on average 
every time you start a camera.

You would basically need to keep a list of cameras and handle them in a round-
robin way. When one of the cameras completes a frame, you will have to stop 
submitting URBs for it, wait until all URBs have completed, and submit the 
URBs for the next camera in the list.

 I don't think I can write them all out to storage fast enough anyway.
 (I can come close to keeping up with 10 fps on 8 cameras though.)

-- 
Regards,

Laurent Pinchart
___
Linux-uvc-devel mailing list
Linux-uvc-devel@lists.berlios.de
https://lists.berlios.de/mailman/listinfo/linux-uvc-devel


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-05-07 Thread Laurent Pinchart
Hi Dennis,

On Tuesday 04 May 2010 17:37:13 Dennis Muhlestein wrote:
 On 04/29/2010 12:40 PM, Dennis Muhlestein wrote:
  One idea I've been toying with is to add a semaphore around submitting
  the URBs.
  
  In uvc_video.c, where the URBs are submitted, I'd acquire a semaphore
  for each device currently submitting URBs. The semaphore would limit the
  number of devices to whatever number I decide can safely submit URBs
  simultaneously on the bus without throwing out-of-space errors.
  
  I did an initial test of this and it looks promising. I can configure
  all the cameras. As long as I don't submit URBs for more devices than
  will work at the same time, the other cameras simply drop the data.
  
  I'm not sure the best places to control the locking and unlocking of the
  semaphore are. Right now, I lock it before submitting URBs in
  uvc_init_video. In uvc_video_complete, I unlock it and relock it if the
  buffer is complete (allowing another camera to acquire it and capture a
  frame).
  
  Anyway, it isn't working perfectly yet but I think I can debug it and at
  least get to a point where I know if it's worth pursuing. I'm curious if
  anyone can provide thoughts or alternatives.
 
 I have two solutions that I've come up with so far.
 1) Modify uvc_video.c to queue URBs in a list in the URB completion
 handler.  A driver-level semaphore controls the number of currently
 submitting cameras.  You can adjust the initial sem_count in
 uvc_driver.c.  Ideally, that would be a module parameter but I'm just
 getting things to work.
 
 I found capture speeds quite a bit slower than I'd like with this method
 though.  I can capture with 8 cameras at 10 FPS without overwhelming the
 ISO bus, but if I change to 15 FPS I can only capture with 3 cameras at a
 time.  With the patch, I can then configure and capture from all 8
 cameras running at 15 FPS while only submitting URBs for 3 at a time.
 Depending on how many frames I let those cameras capture, I got captured
 frame rates of around 4 to 6 fps.
 
 I'm attaching the patch in case anyone wants to play with this or
 suggest ways to improve it.

Interesting approach, but definitely a hack. I'm not sure if it has a chance 
to make it to the driver.

 One thing I had a problem with is that it seems some of the captured images
 are corrupted. This patch was against 2.6.27.2. A little old, I know, but I'm
 working on an embedded device.
 
 2) I modified ehci-sched.c to not raise the ENOSPC error.  That solution
 actually lets me capture on all 8 cameras at 15 FPS.  This has other
 implications though and I'm not sure it is a very good solution.  It
 does tell me that perhaps the Linux ISO scheduler could use a look
 through.  One major discrepancy is that bandwidth is allocated based on
 the max packet size, I think, but for my compressed images (MJPG) each
 frame is only a fraction of the max allocated bandwidth.

That might be the real culprit. Maybe the camera is simply requesting too much 
bandwidth compared to its real requirements.

Can you check whether the video streaming interface has multiple alternate 
settings? If so, which one does the uvcvideo driver choose? You can get the 
information from the kernel log if you set the UVC_TRACE_VIDEO flag.

If the camera requests an alternate setting that isn't the smallest, you might 
try to experiment with hardcoding a lower bandwidth in uvc_video_init.
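
For reference, the trace flags can also be changed at runtime through sysfs.
Setting all bits avoids having to know the exact UVC_TRACE_VIDEO bit value
for a given driver version:

```shell
# Enable verbose uvcvideo tracing, then start a capture and look for
# the selected alternate setting in the kernel log.
echo 0xffff > /sys/module/uvcvideo/parameters/trace
dmesg | grep uvcvideo
```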

-- 
Regards,

Laurent Pinchart


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-05-07 Thread Dennis Muhlestein

On 05/07/2010 09:10 AM, Laurent Pinchart wrote:



I'm attaching the patch in case anyone wants to play with this or
suggest ways to improve it.


Interesting approach, but definitely a hack. I'm not sure if it has a chance
to make it to the driver.


Most definitely a hack.  I just included it because someone had 
expressed interest in the solution.  If there were a way to do it faster, 
it might be worth pursuing, but I can't think of a good way to 
synchronize the frames so as not to lose a large part of each frame 
(like you mentioned in your other response) when URB submission resumes, 
which also wastes a lot of transfer bandwidth.





One thing I had a problem with is that it seems some of the captured images
are corrupted. This patch was against 2.6.27.2. A little old, I know, but I'm
working on an embedded device.

2) I modified ehci-sched.c to not raise the ENOSPC error.  That solution
actually lets me capture on all 8 cameras at 15 FPS.  This has other
implications though and I'm not sure it is a very good solution.  It
does tell me that perhaps the Linux ISO scheduler could use a look
through.  One major discrepancy is that bandwidth is allocated based on
the max packet size, I think, but for my compressed images (MJPG) each
frame is only a fraction of the max allocated bandwidth.


That might be the real culprit. Maybe the camera is simply requesting too much
bandwidth compared to its real requirements.

Can you check whether the video streaming interface has multiple alternate
settings? If so, which one does the uvcvideo driver choose? You can get the
information from the kernel log if you set the UVC_TRACE_VIDEO flag.



The camera is definitely selecting more bandwidth than it needs.  The 
size of each image does vary when I'm streaming MJPG images.  It is 
usually somewhere along the lines of 110-150K at maximum resolution 
(1280x1024 for these cameras).

Perhaps this is the problem with these cameras:
Here is the information from the Uncompressed video format descriptor 
for the 1280x1024 YUYV data:

dwMaxVideoFrameBufferSize 2621440
And here is the information returned from the MJPG descriptor for the 
same sized compressed image:

dwMaxVideoFrameBufferSize 2621440

Logitech C500 by the way.


If the camera requests an alternate setting that isn't the smallest, you might
try to experiment with hardcoding a lower bandwidth in uvc_video_init.



I think I actually tried that a while back when I first started playing 
with this issue.  If I recall, selecting a lower-bandwidth interface 
caused a mismatch between the information in each packet and the 
placement of the frame headers in the UVC protocol.  The result was that 
it never dequeued a buffer.  I'm not sure that isn't worth investigating 
further though.  I may have done something wrong.


Thanks for your analysis of this.  It helps to have more than just my 
own eyes look over the issue.


In the mean time, I'm having quite good luck with my EHCI hack.  I just 
let all the ISO frames through and I get all the images from all 8 
cameras back at 15 FPS.  I may come back to this problem before I'm all 
the way finished and see if there is a way to button this up cleaner but 
temporarily the solution is working for me.


-Dennis




Re: [Linux-uvc-devel] Multiple camera framerate.

2010-05-04 Thread Dennis Muhlestein

On 04/29/2010 12:40 PM, Dennis Muhlestein wrote:


One idea I've been toying with is to add a semaphore around submitting
the URBs.

In uvc_video.c, where the URBs are submitted, I'd acquire a semaphore
for each device currently submitting URBs. The semaphore would limit the
number of devices to whatever number I decide can safely submit URBs
simultaneously on the bus without throwing out-of-space errors.

I did an initial test of this and it looks promising. I can configure
all the cameras. As long as I don't submit URBs for more devices than
will work at the same time, the other cameras simply drop the data.

I'm not sure the best places to control the locking and unlocking of the
semaphore are. Right now, I lock it before submitting URBs in
uvc_init_video. In uvc_video_complete, I unlock it and relock it if the
buffer is complete (allowing another camera to acquire it and capture a
frame).

Anyway, it isn't working perfectly yet but I think I can debug it and at
least get to a point where I know if it's worth pursuing. I'm curious if
anyone can provide thoughts or alternatives.



I have two solutions that I've come up with so far.
1) Modify uvc_video.c to queue URBs in a list in the URB completion 
handler.  A driver-level semaphore controls the number of currently 
submitting cameras.  You can adjust the initial sem_count in 
uvc_driver.c.  Ideally, that would be a module parameter but I'm just 
getting things to work.


I found capture speeds quite a bit slower than I'd like with this method 
though.  I can capture with 8 cameras at 10 FPS without overwhelming the 
ISO bus, but if I change to 15 FPS I can only capture with 3 cameras at a 
time.  With the patch, I can then configure and capture from all 8 
cameras running at 15 FPS while only submitting URBs for 3 at a time. 
Depending on how many frames I let those cameras capture, I got captured 
frame rates of around 4 to 6 fps.


I'm attaching the patch in case anyone wants to play with this or 
suggest ways to improve it.  One thing I had a problem with is that it 
seems some of the captured images are corrupted.  This patch was against 
2.6.27.2.  A little old, I know, but I'm working on an embedded device.


2) I modified ehci-sched.c to not raise the ENOSPC error.  That solution 
actually lets me capture on all 8 cameras at 15 FPS.  This has other 
implications though and I'm not sure it is a very good solution.  It 
does tell me that perhaps the Linux ISO scheduler could perhaps use look 
through.  One major discrepancy is that bandwidth is allocated based on 
the max packet size I think but for my compressed images (MJPG) each 
frame is only a fraction of the max allocated bandwidth.


-Dennis

diff --git a/uvc_driver.c b/uvc_driver.c
index 7e10203..6445ec1 100644
--- a/uvc_driver.c
+++ b/uvc_driver.c
@@ -1963,8 +1963,12 @@ static int __init uvc_init(void)
 
INIT_LIST_HEAD(&uvc_driver.devices);
INIT_LIST_HEAD(&uvc_driver.controls);
+INIT_LIST_HEAD(&uvc_driver.wait_urbs);
mutex_init(&uvc_driver.open_mutex);
mutex_init(&uvc_driver.ctrl_mutex);
+//mutex_init(&uvc_driver.urb_mutex);
+uvc_driver.urb_lock = SPIN_LOCK_UNLOCKED;
+uvc_driver.sem_count = 3;
 
uvc_ctrl_init();
 
diff --git a/uvc_v4l2.c b/uvc_v4l2.c
index b27cf5c..6376759 100644
--- a/uvc_v4l2.c
+++ b/uvc_v4l2.c
@@ -427,6 +427,10 @@ static int uvc_v4l2_open(struct inode *inode, struct file *file)
goto done;
}
 
+video->has_sem = 0;
+video->sent_urbs = 0;
+video->release_on_no_urbs = 0;
+video->sem_frame = 0;
handle->device = video;
handle->state = UVC_HANDLE_PASSIVE;
file->private_data = handle;
diff --git a/uvc_video.c b/uvc_video.c
index 6854ac7..f1b9673 100644
--- a/uvc_video.c
+++ b/uvc_video.c
@@ -26,6 +26,9 @@
 
#include "uvcvideo.h"
 
+// predef
+static void uvc_video_frame_complete(struct uvc_video_device *);
+
 /* 
  * UVC Controls
  */
@@ -415,6 +418,7 @@ static void uvc_video_decode_end(struct uvc_video_device *video,
buf->state = UVC_BUF_STATE_DONE;
if (video->dev->quirks & UVC_QUIRK_STREAM_NO_FID)
video->last_fid ^= UVC_STREAM_FID;
+uvc_video_frame_complete(video);
}
 }
 
@@ -524,13 +528,161 @@ static void uvc_video_decode_bulk(struct urb *urb,
}
 }
 
+
+/**
+ * Increments the resource count, allowing another video device to
+ * submit URBs, if the specified video device currently holds the
+ * semaphore.
+ **/
+static void uvc_video_frame_complete(struct uvc_video_device *video) {
+
+ // mutex_lock ( &uvc_driver.urb_mutex );
+ unsigned long flags;
+ spin_lock_irqsave ( &uvc_driver.urb_lock, flags );
+ //uvc_printk ( KERN_WARNING, "Frame Complete: %s\n", video->dev->udev->serial );
+ video->sem_frame += 1;
+ if ( video->has_sem && video->sem_frame > 2 ) {
+   if (!video->sent_urbs) {
+     video->has_sem = 0;
+

Re: [Linux-uvc-devel] Multiple camera framerate.

2010-04-27 Thread Ian Latter
Hello,

  Do they need to be simultaneous?  I.e. from your
description, ideally they would be, but then the higher
the frame-rate, the smaller the delta between frames
(assuming your solution moves slowly or observes
something that moves slowly).

  Implementation could be interesting.  I don't know what 
UVC does when you don't de-queue the capture buffer 
(VIDIOC_DQBUF), whether the camera captures 
frames but doesn't send them to the host over the wire,
or whether the camera captures frames and sends 
them to the host to have the host UVC driver dump 
them.

  *However* 

  Assuming the former; you could try setting the 
frame rate that you want on all of the cameras in
your solution, then running a capture thread that
walks the cameras sequentially in a 10-cameras-per-
second walk to DQ them.
  If you think you can sustain multiple cameras 
simultaneously, then you could kick off (concurrency) 
capture threads of 10-cameras-per-second for 
(total/concurrency) cameras each. 

  This is silly - I don't remember the timing models - 
you should be able to kick off a thread per camera 
and use a mutex to prevent unwanted simultaneity, and
then use signals with select(?) in each thread to 
manage its own timing .. or did that result in a signal
for the process, rather than the thread .. hmm .. maybe
you should be mixing both ideas.

  Either way do any further image handling in a 
processing thread to keep your image handling out
of your capture timing ...


Chalk it up as one thought ;-)



- Original Message -
From: Dennis Muhlestein djmuhlest...@gmail.com
To: linux-uvc-devel@lists.berlios.de
Subject:  [Linux-uvc-devel] Multiple camera framerate.
Date: Mon, 26 Apr 2010 14:43:39 -0600

 I have a project I've been playing with that uses a number of cameras
 to construct video from 360 degrees.  All cameras can be configured as
 long as I don't overload the underlying ISO bandwidth limits.  ("No
 space left on device" has been discussed a few times on this list.)
 
 The application works well, but I'd like to experiment with making the
 video quality better.  There is always some shearing when the cameras
 are moving, but the higher the framerate I can capture at, the less the
 shearing effect will degrade the video quality.
 
 Can anyone suggest a way to configure the cameras at a higher framerate
 without overloading the USB bus?  Suppose I can read at 10 fps right
 now without overloading the USB bus.  I'd like to set the framerate to
 15, but still just capture around 10.
 
 Any thoughts?
 
 Thanks
 Dennis
 


--
Ian Latter
Late night coder ..
http://midnightcode.org/


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-04-27 Thread Paulo Assis
Dennis,

 Can anyone suggest a way to configure the cameras at a higher framerate
 without overloading the USB bus?  Suppose I can read at 10 fps right now
 without overloading the USB bus.  I'd like to set the framerate to 15, but
 still just capture around 10.

You can grab frames at whatever speed you want, but it's the camera
framerate that will make a difference in the usb bandwidth and not the
amount of frames you get with your application.
Is the 10 fps limit achieved with compressed (MJPG) or uncompressed
(YUV) frames ?
Using compressed frames should allow for much higher framerates when
using multiple cameras. Resolution will also have an impact on the
bandwidth.
Another alternative is to add another USB controller to your system (a
USB PCI card if you are using a standard PC).
As an example, using MJPG I have no problem using 3 cameras
simultaneously (800x...@20 fps).


Best Regards,
Paulo


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-04-27 Thread Ian Latter
 You can grab frames at whatever speed you want, but it's the camera
 framerate that will make a difference in the usb bandwidth and not the
 amount of frames you get with your application.

So, if you don't take a frame from the UVC driver, it will simply
continue to refresh an internal buffer with new frames from the camera
device?



--
Ian Latter
Late night coder ..
http://midnightcode.org/


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-04-27 Thread Paulo Assis
Ian

2010/4/27 Ian Latter ian.lat...@midnightcode.org:
 You can grab frames at whatever speed you want, but it's
 the camera
 framerate that will make a difference in the usb bandwidth
 and not the
 amount of frames you get with your application.

 So, if you don't take a frame from the UVC driver, it will
 simply
 continue to refresh an internal buffer with new frames from the
 camera device?

Think of it like this: the necessary bandwidth is requested by the
device and it will depend on the compression, resolution and
framerate. In fact I think some (buggy) devices will always request
the maximum bandwidth, causing problems on initialization; for these
devices I think uvcvideo uses its own algorithm to calculate the
necessary bandwidth.

Also the device will always try to dispatch frames at the requested
framerate, if you don't grab them they will simply be dropped by the
driver.

Best regards,
Paulo


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-04-27 Thread Dennis Muhlestein

On 04/27/2010 04:52 AM, Paulo Assis wrote:

Dennis,


Can anyone suggest a way to configure the cameras at a higher framerate
without overloading the USB bus?  Suppose I can read at 10 fps right now
without overloading the USB bus.  I'd like to set the framerate to 15, but
still just capture around 10.


You can grab frames at whatever speed you want, but it's the camera
framerate that will make a difference in the usb bandwidth and not the
amount of frames you get with your application.


This is correct.  No matter what speed I dequeue buffers at, if the 
cameras are configured to capture at 15 fps I can't start more than 
about 3 at a time.  The rest will error with "No space left on device", 
which is passed up from the USB layer when the UVC driver tries to 
queue the USB URBs.  (Note that the error happens when URBs are 
submitted, not when the interface is claimed, which tells me that 
perhaps there is a way to modify something in the kernel to not submit 
as many URBs but still capture the images from all the cameras, since 
it's at the point the interface is claimed that the camera is ready to 
send images at 15 fps.)



Is the 10 fps limit achieved with compressed (MJPG) or uncompressed
(YUV) frames ?


MJPG.  I can read a lot more frames/sec with MJPG than with YUYV.


Using compressed frames should allow for much higher framerates when
using multiple cameras. Resolution will also have an impact on the
bandwidth.


True again.  I can read up to 20 fps with 8 cameras in VGA mode.  I'm 
interested in capturing as high a resolution as possible.  Above 960x720 
and I can only do 10 fps before the bandwidth limit.



Another alternative is to add another USB controller to your system (a
USB PCI card if you are using a standard PC).
As an example, using MJPG I have no problem using 3 cameras
simultaneously (800x...@20 fps)



It's actually an embedded device.  I am using an external hub to connect 
the cameras but there is only one USB bus on the base unit.  I'm not 
sure exactly where the "No space left on device" error is originating. 
Does anyone know if this is an error propagated back from the physical 
USB hub, or if somewhere in the kernel there is code attempting to 
follow the USB spec and not allocate over a certain percentage of the 
bandwidth to ISO endpoints?


-Dennis


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-04-27 Thread Dennis Muhlestein

On 04/27/2010 04:08 AM, Dr. Alexander K. Seewald wrote:

Hi Dennis,

I don't think it is possible to decouple the recording frame rate
from the output framerate, at least not with the UVC firmware out
there. But you could try to switch off auto exposure and set it
manually if your cameras expose this control. A smaller exposure
time should IMHO also solve your problem at the cost of getting
slightly darker images (you can fix this by increasing lighting ;-)



Exposure would help with motion blur, but the shearing problem is 
specifically due to the framerate.  We're actually not having a problem 
with motion blur when just leaving the cameras in their auto exposure mode.


Thanks
Dennis