Laurent Pinchart wrote:
> Hi Tim,
>
> On Thursday 13 August 2009 01:32:22 Tim Harvey wrote:
>   
>> I'm looking at using multiple UVC cameras which likely will not support
>> MJPEG.  I'm curious how to calculate my max framerate for a given
>> resolution and number of cameras based on a single USB 2.0 bus and thus
>> am trying to understand the calculation and the architecture of UVC and
>> the uvcvideo driver.
>>
>> I recall reading somewhere (perhaps in the uvccapture source) that the
>> uvcvideo driver (or perhaps UVC itself, or perhaps a limitation of
>> uvccapture) was not able to capture still images from cameras, and instead
>> would put the camera in streaming mode and grab frames as directed.
>>     
>
> UVC specifies 3 different methods to support still capture. The most simple 
> one (method 1) is to just take the next frame from a video stream. There's 
> nothing required on the camera side for this, making it a very simple and 
> popular method among camera developers.
>
> Two other methods are also possible. One of them (method 2) involves
> temporarily stopping the video stream, negotiating still image capture
> parameters, getting a still image from the camera through the video stream,
> and then resuming video streaming. The other (method 3) uses a separate bulk
> endpoint to stream the still image.
>
> I've never seen any camera implementing the last method. Method 2 is supported
> by several webcams, but not by the UVC driver. One main obstacle to getting
> that supported by the driver is the lack of a still image capture API in v4l2.
>
>   
Great explanation - thanks!

What would the risks be in using method 1 but turning streaming on and off 
on a device whose minimum frame rate is 30fps? Could I start the stream, 
grab a single frame, and stop it again quickly enough to effectively capture 
1fps and thus conserve USB bus bandwidth?  Is there a rather long latency in 
starting and stopping streaming on current UVC devices?
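
For context, here is roughly the start/grab/stop cycle I have in mind, using
the V4L2 mmap interface. This is only a minimal sketch: the device node,
resolution, pixel format and buffer count are assumptions on my part, and
most error handling is omitted.

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    struct v4l2_format fmt;
    struct v4l2_requestbuffers req;
    struct v4l2_buffer buf;
    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    void *mem[4];
    unsigned int i;

    int fd = open("/dev/video0", O_RDWR);    /* assumed device node */
    if (fd < 0) {
        perror("open");
        return 1;
    }

    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = 640;                 /* assumed resolution/format */
    fmt.fmt.pix.height = 480;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
    ioctl(fd, VIDIOC_S_FMT, &fmt);

    memset(&req, 0, sizeof(req));
    req.count = 4;
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    ioctl(fd, VIDIOC_REQBUFS, &req);
    if (req.count > 4)
        req.count = 4;

    for (i = 0; i < req.count; i++) {
        memset(&buf, 0, sizeof(buf));
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = i;
        ioctl(fd, VIDIOC_QUERYBUF, &buf);
        mem[i] = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, buf.m.offset);
        ioctl(fd, VIDIOC_QBUF, &buf);
    }

    /* The driver selects the streaming alt setting here, which is where
     * the isochronous bandwidth gets reserved on the bus. */
    ioctl(fd, VIDIOC_STREAMON, &type);

    memset(&buf, 0, sizeof(buf));
    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;
    ioctl(fd, VIDIOC_DQBUF, &buf);           /* blocks until a frame arrives */
    printf("captured %u bytes\n", buf.bytesused);
    /* frame data is in mem[buf.index], buf.bytesused bytes long */

    /* Back to alt setting 0: the bandwidth should be released again. */
    ioctl(fd, VIDIOC_STREAMOFF, &type);

    for (i = 0; i < req.count; i++)
        munmap(mem[i], buf.length);
    close(fd);
    return 0;
}

My understanding is that STREAMOFF drops the interface back to alt setting 0
and so frees the isochronous bandwidth between captures; what I don't know is
how long the STREAMON negotiation takes on real hardware, hence the latency
question above.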

How can I determine what the USB bus utilization is for a specific device 
that I'm testing?  I'm not quite clear whether there is anything in 
/proc/bus/usb/devices that can be decoded to provide this information.  I'm 
up for hacking some printk's into usbcore if that's what it would take.
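
The one candidate I can see is the per-bus "B:" line in
/proc/bus/usb/devices, which on my machines reads something like
"B:  Alloc=  0/900 us ( 0%), #Int=  0, #Iso=  0" and is supposed to show the
periodic (interrupt + isochronous) bandwidth already reserved on that bus.
I'm not sure how faithfully every host controller driver accounts for
isochronous reservations there, so I'd want to cross-check with usbmon or
plain arithmetic. Here is a trivial reader; the file location and line
prefixes are assumptions based on my own systems:

#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[256];
    FILE *f = fopen("/proc/bus/usb/devices", "r");

    if (!f) {
        perror("fopen");
        return 1;
    }

    while (fgets(line, sizeof(line), f)) {
        /* "T:" lines give the bus/device topology, "B:" lines (one per
         * root hub) report the periodic bandwidth already reserved. */
        if (!strncmp(line, "T:", 2) || !strncmp(line, "B:", 2))
            fputs(line, stdout);
    }

    fclose(f);
    return 0;
}

Watching that output before and after VIDIOC_STREAMON on each camera should,
if the accounting is done, show what each stream actually reserves.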

Thanks,

Tim
>> If this is true then even if I only want to grab 1 frame per second from
>> each camera (to lower overall USB bandwidth) the bus would still need to
>> support the full-framerate stream from each camera. Is this true?
>>     
>
> Even if both the driver and the device supported still capture, the UVC
> specification doesn't clearly state that still image capture is supported
> while no video stream is active, so it's not clear how the camera would
> behave in such a case.
>
>   
>> Is the streaming framerate a per-camera feature so that I can't necessarily
>> count on any given UVC camera to allow me to stream only 1fps on the bus?
>>     
>
> It's a camera feature, yes. You need to check the USB descriptors for the 
> available frame rates.
>
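
Understood. For the archives, I plan to pull those frame rates from userspace
with VIDIOC_ENUM_FRAMEINTERVALS, which as far as I can tell uvcvideo fills in
from the camera's descriptors. A rough sketch; the device node, pixel format
and resolution are assumptions:

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    struct v4l2_frmivalenum ival;
    int fd = open("/dev/video0", O_RDWR);    /* assumed device node */

    if (fd < 0) {
        perror("open");
        return 1;
    }

    memset(&ival, 0, sizeof(ival));
    ival.pixel_format = V4L2_PIX_FMT_YUYV;   /* assumed format/resolution */
    ival.width = 640;
    ival.height = 480;

    while (ioctl(fd, VIDIOC_ENUM_FRAMEINTERVALS, &ival) == 0) {
        if (ival.type == V4L2_FRMIVAL_TYPE_DISCRETE)
            printf("%u/%u s/frame (%.1f fps)\n",
                   ival.discrete.numerator, ival.discrete.denominator,
                   (double)ival.discrete.denominator /
                   ival.discrete.numerator);
        ival.index++;
    }

    close(fd);
    return 0;
}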
> Furthermore, the required bandwidth is also a per-camera feature. Some cameras
> will require a lower bandwidth when you lower the frame rate, some won't. This
> depends on the size of the camera frame buffer, which is usually quite small
> (maybe a few lines). The camera would need to be able to buffer a full frame
> to send low frame rates at lower bandwidths.
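
To put rough numbers on that for my own planning: an uncompressed YUYV frame
is width x height x 2 bytes, a single high-speed isochronous endpoint tops
out at 3 x 1024 bytes per 125 us microframe, and something like 80% of each
microframe can be reserved for periodic transfers overall. Ignoring protocol
overhead (so the results are optimistic), the arithmetic looks like this:

#include <stdio.h>

int main(void)
{
    /* Assumed stream: uncompressed VGA YUYV at 30 fps. */
    const double width = 640, height = 480, fps = 30, bpp = 2;

    double bytes_per_frame = width * height * bpp;
    double payload_rate = bytes_per_frame * fps;    /* bytes/s */
    double per_uframe = payload_rate * 125e-6;      /* bytes per 125 us microframe */

    const double ep_cap = 3 * 1024;            /* max payload of one high-bandwidth iso endpoint per microframe */
    const double periodic_budget = 0.80 * 7500; /* ~80% of a 7500-byte microframe, overhead ignored */

    printf("per camera: %.0f bytes/frame, %.2f MB/s, %.0f bytes/microframe\n",
           bytes_per_frame, payload_rate / 1e6, per_uframe);
    printf("fits one iso endpoint: %s (cap %.0f bytes/microframe)\n",
           per_uframe <= ep_cap ? "yes" : "no", ep_cap);
    printf("rough cameras per bus: %.1f\n", periodic_budget / per_uframe);
    return 0;
}

For 640x480 YUYV at 30fps this gives about 18.4 MB/s per camera and roughly
2300 bytes per microframe, so only two or so such streams would fit on one
high-speed bus even before overhead.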
>
> --
> Regards,
>  
> Laurent Pinchart
>   
