Re: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)

2009-06-11 Thread Hans Verkuil

 Karicheri, Muralidharan wrote:

 We need
 streaming capability in the driver. This is how our driver works
 with mt9t031 sensor
                 raw-bus (10 bit)
  vpfe-capture <---------------- mt9t031 driver
       |                              |
       V                              V
     VPFE                          MT9T031

 VPFE hardware has internal timing and DMA controller to
 copy data frame by frame from the sensor output to SDRAM.
 The PCLK from the sensor is used to generate the internal
 timing.
 So, what is missing in the driver apart from the ability to specify
 a frame-rate?

 [MK] Does the mt9t031 output one frame (snapshot) like in a camera, or
 can it output frames continuously along with PCLK, Hsync and Vsync
 signals like in a video streaming device? VPFE capture can accept frames
 continuously from the sensor synchronized to PCLK, HSYNC and VSYNC and
 output frames to the application using QBUF/DQBUF. In our implementation, we
 have timing parameters for the sensor to do streaming at various
 resolutions and fps. So the application calls S_STD to set these timings. I
 am not sure if this is an acceptable way of implementing it. Any
 comments?

 PCLK, HSYNC, VSYNC are generated by the CMOS sensor. I don't think you
 can set the timings. Depending on sensor settings, pixel clock speed,
 etc., the frame rate will vary.

 You could perhaps play with the CMOS sensor registers so that when
 setting a standard, the driver somehow sets the various exposure
 parameters and PLL settings to get a specified framerate.

 This will vary with each sensor and each platform (because of
 pixelclock). Moreover, chances are that it will conflict with
 other controls.

 For example, if you set a fixed gain and autoexposure, some sensors will
 see a drop in fps under low light conditions. I think this kind of
 arbitration should be left to userspace.

 Unless the sensor supports a specific standard, I don't think the driver
 should try to make behind-the-scenes modifications to camera sensor
 registers in response to an S_STD ioctl.

The S_STD call is hopelessly inadequate to deal with these types of
devices. What is needed is a new call that allows you to set the exact
timings you want: frame width/height, back/front porch, h/vsync width,
pixelclock. It is my opinion that the use of S_STD should be limited to
standard definition type inputs, and not used for other devices like
sensors or HD video.

Proposals for such a new ioctl are welcome :-)
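
As a very rough starting point, the parameter block for such an ioctl could
carry something like the following (a hypothetical sketch only; the names are
invented and nothing like this exists in videodev2.h today):

#include <linux/types.h>

/* Hypothetical sketch of a timings parameter block -- names and
 * layout invented for illustration, not an existing V4L2 struct. */
struct v4l2_timings_sketch {
	__u32 width;		/* active frame width, pixels */
	__u32 height;		/* active frame height, lines */
	__u64 pixelclock;	/* pixel clock, Hz */
	__u32 hfrontporch;	/* horizontal front porch, pixels */
	__u32 hsync;		/* hsync width, pixels */
	__u32 hbackporch;	/* horizontal back porch, pixels */
	__u32 vfrontporch;	/* vertical front porch, lines */
	__u32 vsync;		/* vsync width, lines */
	__u32 vbackporch;	/* vertical back porch, lines */
};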

Regards,

 Hans


 JP François


 Thanks

 Murali

 Thanks
 Guennadi
 ---
 Guennadi Liakhovetski, Ph.D.
 Freelance Open-Source Software Developer
 http://www.open-technology.de/








-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG



Re: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)

2009-06-11 Thread Hans de Goede



On 06/11/2009 11:33 AM, Hans Verkuil wrote:


On 06/11/2009 10:35 AM, Hans Verkuil wrote:


snip (a lot)


Hmm,

Why would we want the *application* to set things like this *at all*?
With sensors, hsync / vsync and other timings are something between
the bridge and the sensor; actually, in my experience the correct
hsync / vsync timings to program the sensor to are very much bridge
specific. So IMHO this should not be exposed to userspace at all.

All userspace should be able to control is the resolution and the
framerate. Controlling the framerate, though, in many cases also
means controlling the maximum exposure time, so in many cases one
cannot even control the framerate. (Asking for 30 fps in an
artificially illuminated room caps the exposure at 1/30 s and will
get you a very dark, useless picture with most sensors.) Yes, this
means that with cams which use autoexposure (which is something we
really want wherever possible), the framerate can and will change
while streaming.


I think we have three possible use cases here:

- old-style standard definition video: use S_STD



Ack


- webcam-like devices: a combination of S_FMT and S_PARM I think? Correct
me if I'm wrong. S_STD is useless for this, right?



Ack


- video streaming devices like the davinci videoports where you can hook
up HDTV receivers or FPGAs: here you definitely need a new API to setup
the streaming parameters, and you want to be able to do that from the
application as well. Actually, sensors are also hooked up to these devices
in practice. And there you also want to be able to setup these parameters.
You will see this mostly (only?) on embedded platforms.



I agree we need an in kernel API for this, but why expose it to 
userspace, as you say this will only happen on embedded systems, 
shouldn't the info then go in a board_info file / struct ?


Regards,

Hans


Re: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)

2009-06-11 Thread Hans Verkuil



 On 06/11/2009 11:33 AM, Hans Verkuil wrote:

 On 06/11/2009 10:35 AM, Hans Verkuil wrote:

 snip (a lot)

 Hmm,

 Why would we want the *application* to set things like this *at all*?
 With sensors, hsync / vsync and other timings are something between
 the bridge and the sensor; actually, in my experience the correct
 hsync / vsync timings to program the sensor to are very much bridge
 specific. So IMHO this should not be exposed to userspace at all.

 All userspace should be able to control is the resolution and the
 framerate. Controlling the framerate, though, in many cases also
 means controlling the maximum exposure time, so in many cases one
 cannot even control the framerate. (Asking for 30 fps in an
 artificially illuminated room caps the exposure at 1/30 s and will
 get you a very dark, useless picture with most sensors.) Yes, this
 means that with cams which use autoexposure (which is something we
 really want wherever possible), the framerate can and will change
 while streaming.

 I think we have three possible use cases here:

 - old-style standard definition video: use S_STD


 Ack

 - webcam-like devices: a combination of S_FMT and S_PARM I think?
 Correct
 me if I'm wrong. S_STD is useless for this, right?


 Ack

 - video streaming devices like the davinci videoports where you can hook
 up HDTV receivers or FPGAs: here you definitely need a new API to setup
 the streaming parameters, and you want to be able to do that from the
 application as well. Actually, sensors are also hooked up to these
 devices
 in practice. And there you also want to be able to setup these
 parameters.
 You will see this mostly (only?) on embedded platforms.


 I agree we need an in kernel API for this, but why expose it to
 userspace, as you say this will only happen on embedded systems,
 shouldn't the info then go in a board_info file / struct ?

These timings are not fixed. E.g. a 720p60 video stream has different
timings compared to a 1080p60 stream. So you have to be able to switch
from userspace. It's like PAL and NTSC, but then many times worse :-)
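
For example (standard SMPTE timings, quoting from memory): 720p60 is 1280x720
active in a 1650x750 total frame at a 74.25 MHz pixel clock, while 1080p60 is
1920x1080 active in a 2200x1125 total frame at 148.5 MHz. Different totals,
porches and clocks, so fixed board data cannot cover them.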

Regards,

 Hans

-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG



RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)

2009-06-11 Thread Karicheri, Muralidharan
Hi,

In our experience, we have the following cases where we need to set the device 
to capture/display at a certain resolution and fps.

Capture side
------------
1) IP Netcam applications. If the solution uses Micron sensors such as 
MT9T001/031/P031 etc., the application needs to stream at various resolutions 
ranging from VGA, SVGA to UXGA, or 480p, 576p to 1080p. In these cases what we 
have implemented is a table of timings looked up by the STD string (a sketch of 
such a table follows this list). Though this is not a standard, it helps in 
achieving streaming at the desired frame rate. The exposure is chosen to get 
the desired frame rate and video quality. The VPFE has modules to fine tune 
the image quality.

2) The TVP7002 decoder chip has the following table of analog video standards 
that it supports.
SDTV (YPbPr component) - 480i/576i
EDTV (YPbPr component) - 480p/576p
HDTV (YPbPr component) - 720p@50/60, 1080i@50/60, 1080p@50/60
PC graphics (RGB component) - VGA to UXGA

3) TVP5146/TVP5147 - supports NTSC/PAL SDTV standards

4) Camera applications that do preview and take snapshots. We don't support 
these in Linux, but have solutions based on other OSes.
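
To make 1) above concrete, the driver-internal table could look roughly like 
this (a sketch only: the fields are invented here, only the idea of an 
STD-keyed timing table comes from our implementation):

#include <linux/videodev2.h>

/* Sketch: per-standard sensor timings, keyed by the std id that
 * the S_STD handler receives. Field names are illustrative. */
struct sensor_timing {
	v4l2_std_id std;             /* key looked up in the S_STD handler */
	unsigned int width, height;  /* active resolution */
	unsigned int hblank, vblank; /* blanking, in pixels / lines */
	unsigned long pclk_hz;       /* pixel clock */
	unsigned int shutter_width;  /* exposure, in lines */
};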

Display side

1) VPBE (video processing back end) can support NTSC/PAL timing signals 
directly from the SoC.

2) By connecting a daughter card that does voltage translation to the digital 
LCD port of the VPBE, it can support PC graphics timings. Examples are Logic PD 
LCD / Avnet LCD kits that can be connected using such daughter cards.

3) The digital LCD port of the VPBE can generate BT.656/BT.1120 timings. So you 
could connect an encoder chip such as the THS8200 to generate 720P/1080 YPbPr 
component outputs. This can work with any encoder chip that accepts YUV data 
or RGB666 or RGB888 data along with timing signals and outputs PC graphics, 
YPbPr component, or standard NTSC/PAL output.

As you can see, S_STD can be used only for 3) on the capture side and 1) on the 
display side, since it cannot describe all of the above timings; it is not of 
much use here. So we need an API that can do the following...

Query available timing settings from the encoder/decoder/sensors. Since these 
timings are not relevant to the application domain, they can be defined in the 
driver, which exposes only the following as part of the query. The driver uses 
this to look up the correct timing.

1) resolution (VGA, 720p, 1080p, 576p etc)
2) frame rate 

Set the timing by specifying the above.
Detect the signal at the input for capture, similar to QUERYSTD??
Get the current timing...

Were VIDIOC_S_PARM & G_PARM added for this purpose? Maybe they need to be 
enhanced to carry the resolution as well, or a new set of APIs added...
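
For reference, the existing S_PARM call only expresses the nominal frame 
interval; a minimal userspace sketch of what it covers today (error handling 
omitted):

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Request a nominal frame rate via the existing S_PARM ioctl.
 * Resolution still comes from S_FMT; there is no field for
 * blanking/porch timings, hence the need for an extension or
 * a new set of APIs. */
static int set_frame_rate(int fd, unsigned int fps)
{
	struct v4l2_streamparm parm;

	memset(&parm, 0, sizeof(parm));
	parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	/* timeperframe is the inverse of the frame rate */
	parm.parm.capture.timeperframe.numerator = 1;
	parm.parm.capture.timeperframe.denominator = fps;
	return ioctl(fd, VIDIOC_S_PARM, &parm);
}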


Murali Karicheri
Software Design Engineer
Texas Instruments Inc.
email: m-kariche...@ti.com

-Original Message-
From: linux-media-ow...@vger.kernel.org [mailto:linux-media-
ow...@vger.kernel.org] On Behalf Of Hans Verkuil
Sent: Thursday, June 11, 2009 6:40 AM
To: Hans de Goede
Cc: Jean-Philippe François; Karicheri, Muralidharan; davinci-linux-open-
sou...@linux.davincidsp.com; Muralidharan Karicheri; Guennadi Liakhovetski;
linux-media@vger.kernel.org
Subject: Re: mt9t031 (was RE: [PATCH] adding support for setting bus
parameters in sub device)




 On 06/11/2009 11:33 AM, Hans Verkuil wrote:

 On 06/11/2009 10:35 AM, Hans Verkuil wrote:

 snip (a lot)

 Hmm,

 Why would we want the *application* to set things like this *at all*?
 With sensors, hsync / vsync and other timings are something between
 the bridge and the sensor; actually, in my experience the correct
 hsync / vsync timings to program the sensor to are very much bridge
 specific. So IMHO this should not be exposed to userspace at all.

 All userspace should be able to control is the resolution and the
 framerate. Controlling the framerate, though, in many cases also
 means controlling the maximum exposure time, so in many cases one
 cannot even control the framerate. (Asking for 30 fps in an
 artificially illuminated room caps the exposure at 1/30 s and will
 get you a very dark, useless picture with most sensors.) Yes, this
 means that with cams which use autoexposure (which is something we
 really want wherever possible), the framerate can and will change
 while streaming.

 I think we have three possible use cases here:

 - old-style standard definition video: use S_STD


 Ack

 - webcam-like devices: a combination of S_FMT and S_PARM I think?
 Correct
 me if I'm wrong. S_STD is useless for this, right?


 Ack

 - video streaming devices like the davinci videoports where you can hook
 up HDTV receivers or FPGAs: here you definitely need a new API to setup
 the streaming parameters, and you want to be able to do that from the
 application as well. Actually, sensors are also hooked up to these
 devices
 in practice. And there you also want to be able to setup these
 parameters.
 You will see this mostly (only?) on embedded platforms.


 I agree we need an in kernel API for this, but why expose it to
 userspace, as you say this will only happen on embedded systems,
 shouldn't the info then go in a board_info file / struct ?

RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)

2009-06-11 Thread Karicheri, Muralidharan


 - video streaming devices like the davinci videoports where you can hook
 up HDTV receivers or FPGAs: here you definitely need a new API to setup
 the streaming parameters, and you want to be able to do that from the
 application as well. Actually, sensors are also hooked up to these
devices
 in practice. And there you also want to be able to setup these parameters.
 You will see this mostly (only?) on embedded platforms.


I agree we need an in kernel API for this, but why expose it to
userspace, as you say this will only happen on embedded systems,
shouldn't the info then go in a board_info file / struct ?

No, we still need a way for the application to set these timings at the device. 
For example, it needs to tell a TVP7002 device to scan at 720p/1080p, similar 
to S_STD. From the user's perspective, it is just like S_STD. See my other 
email for the details...

Regards,

Hans



RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)

2009-06-11 Thread Karicheri, Muralidharan
On Wed, 10 Jun 2009, Karicheri, Muralidharan wrote:

 So how do I know what frame-rate I get? Sensor output frame rate depends
 on the resolution of the frame, blanking, exposure time etc.

This is not supported.

I am still not clear. You had said in an earlier email that it can support 
streaming. That means an application can stream frames from the capture device. 
I know you don't have support for setting a specific frame rate, but it must be 
outputting frames at some rate, right?

Here is my use case.

open capture device,
set resolutions (say VGA) for capture (S_FMT ???)
request buffers for streaming, mmap & QUERYBUF
start streaming (STREAMON)
DQBUF/QBUF in a loop - get VGA buffers at some fps.
STREAMOFF
close device

Is this possible with the mt9t031 driver currently in the tree? This requires 
the sensor device to output frames continuously on the bus using 
PCLK/HSYNC/VSYNC timing to the bridge device connected to the bus. Can you give 
a use case like the above that you are using? I just want to estimate how much 
effort is required to add this support in the mt9t031 driver.
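
For reference, a minimal sketch of the sequence above using standard V4L2 mmap 
streaming I/O (the device node, buffer count and pixel format are assumptions; 
error handling omitted for brevity):

#include <fcntl.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/videodev2.h>

int main(void)
{
	int fd = open("/dev/video0", O_RDWR);	/* device node assumed */
	enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	struct v4l2_format fmt;
	struct v4l2_requestbuffers req;
	struct v4l2_buffer buf;
	void *mem[4];
	unsigned int i, frame;

	/* set resolution: VGA */
	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	fmt.fmt.pix.width = 640;
	fmt.fmt.pix.height = 480;
	fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SGRBG10; /* 10-bit raw, assumed */
	ioctl(fd, VIDIOC_S_FMT, &fmt);

	/* request buffers, mmap and queue them */
	memset(&req, 0, sizeof(req));
	req.count = 4;
	req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	req.memory = V4L2_MEMORY_MMAP;
	ioctl(fd, VIDIOC_REQBUFS, &req);

	for (i = 0; i < req.count && i < 4; i++) {
		memset(&buf, 0, sizeof(buf));
		buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
		buf.memory = V4L2_MEMORY_MMAP;
		buf.index = i;
		ioctl(fd, VIDIOC_QUERYBUF, &buf);
		mem[i] = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
			      MAP_SHARED, fd, buf.m.offset);
		ioctl(fd, VIDIOC_QBUF, &buf);
	}

	ioctl(fd, VIDIOC_STREAMON, &type);
	for (frame = 0; frame < 100; frame++) {
		memset(&buf, 0, sizeof(buf));
		buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
		buf.memory = V4L2_MEMORY_MMAP;
		ioctl(fd, VIDIOC_DQBUF, &buf); /* blocks until a frame is ready */
		/* ... process mem[buf.index] here ... */
		ioctl(fd, VIDIOC_QBUF, &buf);
	}
	ioctl(fd, VIDIOC_STREAMOFF, &type);
	close(fd);
	return 0;
}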

Thanks

Murali

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/



RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)

2009-06-11 Thread Guennadi Liakhovetski
On Thu, 11 Jun 2009, Karicheri, Muralidharan wrote:

 On Wed, 10 Jun 2009, Karicheri, Muralidharan wrote:
 
  So how do I know what frame-rate I get? Sensor output frame rate depends
  on the resolution of the frame, blanking, exposure time etc.
 
 This is not supported.
 
 I am still not clear. You had said in an earlier email that it can 
 support streaming. That means an application can stream frames from the 
 capture device.
 I know you don't have support for setting a specific frame rate, but it 
 must be outputting frames at some rate, right?

I am sorry, I do not know how I can explain myself any clearer.

Yes, you can stream video with mt9t031.

No, you neither get the framerate measured by the driver nor can you set a 
specific framerate. Frames are produced as fast as it goes, depending on 
clock settings, frame size, black areas, autoexposure.
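
For what it's worth, that rate falls out of pixel clock / total frame size. A 
toy illustration with invented numbers (not mt9t031 register values):

#include <stdio.h>

/* Frame rate of a free-running CMOS sensor: pixel clock divided by
 * the total (active + blanking) frame size; long exposure can
 * stretch the frame time further. Numbers below are made up. */
int main(void)
{
	unsigned long pclk = 48000000UL;   /* pixel clock, Hz */
	unsigned int h_total = 2048 + 200; /* active width + H blanking */
	unsigned int v_total = 1536 + 50;  /* active height + V blanking */

	printf("approx. %.2f fps\n",
	       (double)pclk / ((double)h_total * v_total));
	return 0;
}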

Thanks
Guennadi

 
 Here is my use case.
 
 open capture device,
 set resolutions (say VGA) for capture (S_FMT ???)
 request buffers for streaming, mmap & QUERYBUF
 start streaming (STREAMON)
 DQBUF/QBUF in a loop - get VGA buffers at some fps.
 STREAMOFF
 close device
 
 Is this possible with the mt9t031 driver currently in the tree? This requires 
 the sensor device to output frames continuously on the bus using 
 PCLK/HSYNC/VSYNC timing to the bridge device connected to the bus. Can you 
 give a use case like the above that you are using? I just want to estimate 
 how much effort is required to add this support in the mt9t031 driver.
 
 Thanks
 
 Murali
 
 Thanks
 Guennadi
 ---
 Guennadi Liakhovetski, Ph.D.
 Freelance Open-Source Software Developer
 http://www.open-technology.de/
 

---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)

2009-06-11 Thread Karicheri, Muralidharan

I am sorry, I do not know how I can explain myself any clearer.

Thanks for helping me to understand better :)
Yes, you can stream video with mt9t031.

No, you neither get the framerate measured by the driver nor can you set a
specific framerate. Frames are produced as fast as it goes, depending on
clock settings, frame size, black areas, autoexposure.

Ok. It is now clear to me. 

Thanks for all your help.

Thanks
Guennadi


 Here is my use case.

 open capture device,
 set resolutions (say VGA) for capture (S_FMT ???)
 request buffers for streaming, mmap & QUERYBUF
 start streaming (STREAMON)
 DQBUF/QBUF in a loop - get VGA buffers at some fps.
 STREAMOFF
 close device

 Is this possible with the mt9t031 driver currently in the tree? This
requires the sensor device to output frames continuously on the bus using
PCLK/HSYNC/VSYNC timing to the bridge device connected to the bus. Can you
give a use case like the above that you are using? I just want to estimate how
much effort is required to add this support in the mt9t031 driver.

 Thanks

 Murali

 Thanks
 Guennadi
 ---
 Guennadi Liakhovetski, Ph.D.
 Freelance Open-Source Software Developer
 http://www.open-technology.de/


---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/



RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)

2009-06-10 Thread Karicheri, Muralidharan


 We need
 streaming capability in the driver. This is how our driver works
 with mt9t031 sensor
                 raw-bus (10 bit)
  vpfe-capture <---------------- mt9t031 driver
       |                              |
       V                              V
     VPFE                          MT9T031

 VPFE hardware has internal timing and DMA controller to
 copy data frame by frame from the sensor output to SDRAM.
 The PCLK from the sensor is used to generate the internal
 timing.

So, what is missing in the driver apart from the ability to specify
a frame-rate?

[MK] Does the mt9t031 output one frame (snapshot) like in a camera, or can it 
output frames continuously along with PCLK, Hsync and Vsync signals like in a 
video streaming device? VPFE capture can accept frames continuously from the 
sensor synchronized to PCLK, HSYNC and VSYNC and output frames to the 
application using QBUF/DQBUF. In our implementation, we have timing parameters 
for the sensor to do streaming at various resolutions and fps. So the 
application calls S_STD to set these timings. I am not sure if this is an 
acceptable way of implementing it. Any comments?

Thanks

Murali

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/



RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)

2009-06-10 Thread Guennadi Liakhovetski
On Wed, 10 Jun 2009, Karicheri, Muralidharan wrote:

 
 
  We need
  streaming capability in the driver. This is how our driver works
  with mt9t031 sensor
                  raw-bus (10 bit)
   vpfe-capture <---------------- mt9t031 driver
        |                              |
        V                              V
      VPFE                          MT9T031
 
  VPFE hardware has internal timing and DMA controller to
  copy data frame by frame from the sensor output to SDRAM.
  The PCLK from the sensor is used to generate the internal
  timing.
 
 So, what is missing in the driver apart from the ability to specify
 a frame-rate?
 
 [MK] Does the mt9t031 output one frame (snapshot) like in a camera, or 
 can it output frames continuously along with PCLK, Hsync and Vsync 
 signals like in a video streaming device? VPFE capture can accept frames 
 continuously from the sensor synchronized to PCLK, HSYNC and VSYNC and 
 output frames to the application using QBUF/DQBUF. In our implementation, we 
 have timing parameters for the sensor to do streaming at various 
 resolutions and fps. So the application calls S_STD to set these timings. I 
 am not sure if this is an acceptable way of implementing it. Any 
 comments?

Yes, it is streaming.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


RE: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)

2009-06-10 Thread Guennadi Liakhovetski
On Wed, 10 Jun 2009, Karicheri, Muralidharan wrote:

 So how do I know what frame-rate I get? Sensor output frame rate depends 
 on the resolution of the frame, blanking, exposure time etc.

This is not supported.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/