Re: [Linux-uvc-devel] Framerate of Quickcam Pro 9000

2010-05-07 Thread Laurent Pinchart
Hi Paulo,

On Tuesday 04 May 2010 10:36:19 Paulo Assis wrote:
  What happens to the camera settings when the computer is rebooted? Are
  they stored in some non-volatile memory in the camera? That would be
  awesome.
 
 If I'm not mistaken USB keeps the power on, even during a reboot, so
 the camera should maintain the control settings.

I'm not sure that's guaranteed by the USB standard. I believe the camera 
should be reset when the computer is restarted. Whether it will lose its 
settings might depend on the camera firmware.

In any case, there's no non-volatile memory to store the controls in any of 
the consumer webcams I know of.

 You would have to unplug the camera to lose the settings.
 This is not always a good thing; sometimes the camera may crash,
 meaning that you must physically unplug it to make it work again.

-- 
Regards,

Laurent Pinchart
___
Linux-uvc-devel mailing list
Linux-uvc-devel@lists.berlios.de
https://lists.berlios.de/mailman/listinfo/linux-uvc-devel


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-05-07 Thread Laurent Pinchart
Hi Dennis,

On Tuesday 27 April 2010 18:19:01 Dennis Muhlestein wrote:
 On 04/27/2010 05:48 AM, Paulo Assis wrote:
  2010/4/27 Ian Latter ian.lat...@midnightcode.org:
  You can grab frames at whatever speed you want, but it's the camera
  framerate that will make a difference in the usb bandwidth and not the
  amount of frames you get with your application.
  
  So, if you don't take a frame from the UVC driver, it will simply
  continue to refresh an internal buffer with new frames from the camera
  device?
  
  Think of it like this: the necessary bandwidth is requested by the device
  and it will depend on the compression, resolution and framerate. In fact I
  think some (buggy) devices will always request the maximum bandwidth,
  causing problems on initialization. For these devices I think uvcvideo
  uses its own algorithm to calculate the necessary bandwidth.

That's right. The uvcvideo driver tries to compute an estimate of the required 
bandwidth. That only works for uncompressed formats.

 Does the number of URBs have any effect at all?  I made an initial attempt
 to check this a while back by recompiling the driver with UVC_URBS as 1
 or 2 instead of 5.

Not on the bandwidth, no. The number of URBs will influence the memory used by 
the driver, and the real-time requirements. The fewer URBs there are, the 
faster the driver needs to process completed URBs and resubmit them to avoid 
running out of them.

 I also made MAX_ISO_PACKETS a lot smaller.

The number of packets influences the URB size for isochronous transfers. URBs 
complete when the maximum number of packets has been transmitted. The larger 
the number, the less frequently the driver gets interrupted to process URBs 
(but it will also have to process more packets in one go, making the 
processing a bit longer).

 I wanted to see if perhaps submitting less URBs would somehow lessen the
 bandwidth requirements.  It didn't fix the problem though.

The driver explicitly selects the bandwidth by switching to an alternate 
setting. The USB core allocates the bandwidth when the URBs are submitted, so 
the alternate setting selection call might succeed and the URB submission fail 
later. Please note that, with USB 3.0, the xHCI driver will perform the 
allocation when selecting the alternate setting.

The driver chooses the alternate setting with the lowest bandwidth compatible 
with the bandwidth requirements reported by the camera. The number of URBs or 
packets will have no influence there.

 I suspect that somewhere in an underlying layer, whether kernel or physical,
 something is checking that bandwidth requirement for the configured
 endpoints and then denying new URBs being submitted.  I'm not opposed to
 modifying something at a lower level but I haven't found the spot at this
 point. I guess if the error comes from the physical hub there isn't much to
 be done then.

The USB core checks the bandwidth. There's an allocation process defined by 
the USB standard, and there's pretty much no way around that. You could modify 
the EHCI host controller driver to allow more than 80% of the total bandwidth 
to be allocated to periodic transfers. Another solution, as you pointed out 
below, is not to submit the URBs. That's very similar to performing the 
STREAMON/QBUF/DQBUF/STREAMOFF sequences in userspace on all devices one after 
the other, except that you might be able to get slightly better performance 
by avoiding context switches.

  Also the device will always try to dispatch frames at the requested
  framerate, if you don't grab them they will simply be dropped by the
  driver.
 
 I wonder if there is some way to restrict URB submitting to around 10
 fps? Perhaps a semaphore on the number of cameras that can be submitting
 URBs at all.  If I go ahead and configure all the cameras to run at 15 fps
 but only submit URBs for say 3 of the cameras at a time it seems it would
 work. I'm not worried about dropping 1/3 of the frames.

That might work, but it would be quite hackish. You wouldn't be able to 
synchronize to the frame start, so you will lose half a frame on average 
every time you start a camera.

You would basically need to keep a list of cameras and handle them in a round-
robin way. When one of the cameras completes a frame, you will have to stop 
submitting URBs for it, wait until all URBs have completed, and submit the 
URBs for the next camera in the list.

 I don't think I can write them all out to storage fast enough anyway.
 (I can come close to keeping up with 10 fps on 8 cameras though.)

-- 
Regards,

Laurent Pinchart


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-05-07 Thread Laurent Pinchart
Hi Dennis,

On Tuesday 04 May 2010 17:37:13 Dennis Muhlestein wrote:
 On 04/29/2010 12:40 PM, Dennis Muhlestein wrote:
  One idea I've been toying with is to add a semaphore around submitting
  the URBs.
  
  In uvc_video.c, where the URBs are submitted, I'd acquire a semaphore
  for each device currently submitting URBs. The semaphore would limit the
  number of devices to whatever number I decide can safely submit URBs
  simultaneously on the bus without throwing out of space errors.
  
  I did an initial test of this and it looks promising. I can configure
  all the cameras. As long as I don't submit the URBs for the number of
  devices beyond that which will work at the same time, the other cameras
  simply drop the data.
  
  I'm not sure where the best places to control the locking and unlocking
  of the semaphore are. Right now, I lock it before submitting URBs in
  uvc_init_video. In uvc_video_complete, I unlock it and relock it if the
  buffer is complete (allowing another camera to acquire it and capture a
  frame).
  
  Anyway, it isn't working perfectly yet but I think I can debug it and at
  least get to a point where I know if it's worth pursuing. I'm curious if
  anyone can provide thoughts or alternatives.
 
 I have come up with two solutions so far.
 1) Modify the uvc_video.c to queue urbs in a list in the urb completion
 handler.  A driver level semaphore controls the number of currently
 submitting cameras.  You can adjust the initial sem_count in
 uvc_driver.c.  Ideally, that would be a module parameter but I'm just
 getting things to work.
 
 I found capture speeds quite a bit slower than I'd like with this method
 though.  I can capture with 8 cameras at 10 FPS without overwhelming the
 ISO bus but if I change to 15 FPS I can only capture with 3 cameras at a
 time.  Adding the patch, I then can configure and capture from all 8
 cameras running at 15 FPS but only submitting URBs for 3 at a time.
 Depending on how many frames I let those cameras capture I got captured
 frame rates from around 4 to 6.
 
 I'm attaching the patch in case anyone wants to play with this or
 suggest ways to improve it.

Interesting approach, but definitely a hack. I'm not sure it has a chance 
to make it into the driver.

 One thing I had a problem with is that it seems some of the capture images
 are corrupted. This patch was against 2.6.27.2. A little old I know but I'm
 working on an embedded device.
 
 2) I modified ehci-sched.c to not raise the ENOSPC error.  That solution
 actually lets me capture on all 8 cameras at 15 FPS.  This has other
 implications though and I'm not sure it is a very good solution.  It
 does tell me that perhaps the Linux ISO scheduler could use a closer
 look.  One major discrepancy is that bandwidth is allocated based on
 the max packet size, I think, but for my compressed images (MJPG) each
 frame is only a fraction of the max allocated bandwidth.

That might be the real culprit. Maybe the camera is simply requesting too much 
bandwidth compared to its real requirements.

Can you check whether the video streaming interface has multiple alternate 
settings? If so, which one does the uvcvideo driver choose? You can get the 
information from the kernel log if you set the UVC_TRACE_VIDEO flag.

If the camera requests an alternate setting that isn't the smallest, you 
might try to experiment with hardcoding a lower bandwidth in uvc_video_init.

-- 
Regards,

Laurent Pinchart


Re: [Linux-uvc-devel] Quickcam Pro 9000 LED control

2010-05-07 Thread Laurent Pinchart
Hi Alan,

On Thursday 29 April 2010 19:23:19 Alan wrote:
 Thanks a lot, I ran udevadm and now the led control is working :)

Great. Glad to hear it now works.

 By the way, I just had a thought that it would be cool to export the
 led control of the webcam using leds-class:
 
 http://www.mjmwired.net/kernel/Documentation/leds-class.txt

I've thought about it, but that would be difficult. The uvcvideo driver has 
no idea that the control actually controls a LED. Having access to that 
information isn't straightforward, as it's not reported by the device.

-- 
Regards,

Laurent Pinchart


Re: [Linux-uvc-devel] Multiple camera framerate.

2010-05-07 Thread Dennis Muhlestein

On 05/07/2010 09:10 AM, Laurent Pinchart wrote:



I'm attaching the patch in case anyone wants to play with this or
suggest ways to improve it.


Interesting approach, but definitely a hack. I'm not sure if it has a chance
to make it to the driver.


Most definitely a hack.  I just included it because someone had 
expressed interest in the solution.  If there were a way to do it faster, 
it might be worth pursuing, but I can't think of a good way to 
synchronize the frames so as not to lose a large part of each frame 
(as you mentioned in your other response) after resuming URB 
submission, and not to waste a lot of transfer bandwidth.





One thing I had a problem with is that it seems some of the capture images
are corrupted. This patch was against 2.6.27.2. A little old I know but I'm
working on an embedded device.

2) I modified ehci-sched.c to not raise the ENOSPC error.  That solution
actually lets me capture on all 8 cameras at 15 FPS.  This has other
implications though and I'm not sure it is a very good solution.  It
does tell me that perhaps the Linux ISO scheduler could use a closer
look.  One major discrepancy is that bandwidth is allocated based on
the max packet size, I think, but for my compressed images (MJPG) each
frame is only a fraction of the max allocated bandwidth.


That might be the real culprit. Maybe the camera is simply requesting too much
bandwidth compared to its real requirements.

Can you check whether the video streaming interface has multiple alternate
settings? If so, which one does the uvcvideo driver choose? You can get the
information from the kernel log if you set the UVC_TRACE_VIDEO flag.



The camera is definitely selecting more bandwidth than it needs.  The 
size of each image does vary when I'm streaming MJPG images.  It is 
usually somewhere along the lines of 110-150K at maximum resolution 
(1280x1024 for these cameras).

Perhaps this is the problem with these cameras:
Here is the information from the Uncompressed video format descriptor 
for the 1280x1024 YUYV data:

dwMaxVideoFrameBufferSize 2621440
And here is the information returned from the MJPG descriptor for the 
same sized compressed image:

dwMaxVideoFrameBufferSize 2621440

Logitech C500 by the way.


If the camera requests an alternate setting that isn't the smallest, you might
try to experiment with hardcoding a lower bandwidth in uvc_video_init.



I think I actually tried that a while back when I first started playing 
with this issue.  If I recall, selecting a lower bandwidth interface 
caused a mismatch between the information in each packet and the 
placement of the frame headers in the UVC protocol.  The result was that 
it never dequeued a buffer.  It may still be worth investigating further, 
though.  I may have done something wrong.


Thanks for your analysis of this.  It helps to have more than just my 
eyes looking over the issue.


In the meantime, I'm having quite good luck with my EHCI hack.  I just 
let all the ISO frames through and I get all the images from all 8 
cameras back at 15 FPS.  I may come back to this problem before I'm 
finished and see if there is a way to button this up more cleanly, but 
for now the solution is working for me.


-Dennis

