Re: Exposure set bug in stv06xx driver
Hi,

On 08/23/2009 02:50 PM, James Blanford wrote:
> Well that was quick. These issues as well as the stream drops were remedied in 2.6.31-rc7,

?? I'm the author of the pb100 (046d:0840) support in the stv06xx driver, and AFAIK there have been no changes to it recently.

> except exposure and gain still cannot be set with v4l2ucp.

They can be set if you disable autogain first; just uncheck the checkbox.

> As a bonus, I found the autogain white list in v4l2-apps/libv4l/libv4lconvert/control/libv4lcontrol.c. Is there a white list to turn on the gain and exposure manual controls?

This whitelist is to enable software automatic gain + exposure for cameras which lack this in hardware (or where we don't know how to do it in hardware); it is not relevant for the 046d:0840.

> While the autogain works great, the autoloss doesn't. Gain increases automatically, but is not decreased when light levels rise.

It does, but it is slow when decreasing; give it some time.

> Also, updating the exposure readout in v4l2ucp decreases the exposure about 10% and incorrectly reports an exposure somewhere between the original level and the changed level. E.g., click the exposure update button and the exposure drops from 2 to 18000 and reports 19000.

??? I've not seen any problems like these. Note that the values returned when reading the controls are cached values of the last value set, not actual register values. Also, the range for exposure is only 0 - 511; where are you getting values like 18000 and 2 from? Are you sure you are using the in-kernel gspca v2 stv06xx driver?

Hmm, you also write:

> is there any possibility of enabling autogain?

Yet this already is enabled. Does your 046d:0840 perhaps have a different sensor? Mine says when plugged in:

STV06xx: Photobit pb0100 sensor detected

I'm not used to Logitech having different cameras with the same USB id, but you never know.

Regards,

Hans

> Thanks for all the work.
> - Jim

On Sat, 22 Aug 2009 15:10:31 -0400 James Blanford jhblanf...@gmail.com wrote:

> Quickcam Express 046d:0840
> Driver versions: v 2.60 from 2.6.31-rc6 and v 2.70 from gspca-c9f3938870ab
>
> Problem: Overexposure and horizontal orange lines in the cam image. Exposure and gain controls in gqcam and v4l2ucp do not work. By varying the default exposure and gain settings in stv06xx.h, the lines can be orange and/or blue, moving or stationary, or a fine grid.
>
> Workaround: Using the tool set_cam_exp, any exposure setting removes the visual artefacts and reduces the image brightness for a given set of gain and exposure settings.
>
> By default:
> Aug 21 14:22:02 blackbart kernel: STV06xx: Writing exposure 5000, rowexp 0, srowexp 0
>
> Note what happens when I set the default exposure to 1000:
> Aug 21 20:44:23 blackbart kernel: STV06xx: Writing exposure 1000, rowexp 0, srowexp 139438350
>
> By the way, is there any possibility of enabling autogain?
>
> Thanks for your interest,
> - Jim

--
To unsubscribe from this list: send the line "unsubscribe linux-media" in the body of a message to majord...@vger.kernel.org
More majordomo info at http://vger.kernel.org/majordomo-info.html
Re: gspca: Trust webcam WB 300P ID 093a:2608 doesn't work
Hi,

On 08/14/2009 04:00 PM, Claudio Chimera wrote:

Hello Hans, thanks for your reply. I've connected only the webcam, no other devices. I've tried to connect via a USB hub but the result is always the same:

Aug 14 15:56:50 cchi-desktop kernel: [ 8434.924045] gspca: usb_submit_urb [0] err -28

Then you most likely still have something using USB bandwidth, maybe some integrated peripheral? What is the output of lsusb?

Regards,

Hans

This webcam has never worked using Linux, but it is reported as fully working.

Thanks, Claudio

On Wed, 12/08/2009 at 16:53 +0200, Hans de Goede wrote:

Hi, You are trying to use the webcam on the same USB root controller as a USB audio device and there is not enough bandwidth for both; try removing the USB audio device. Regards, Hans

On 08/11/2009 07:04 PM, Claudio Chimera wrote:

Hello, I'm trying to use the Trust webcam WB 300P (ID 093a:2608) without success. The complete lsusb output is the following:

Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 003 Device 002: ID 093a:2608 Pixart Imaging, Inc.
Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 002 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub

When I use amsn, I can select the webcam but I get an error (unable to capture ...)
The /var/log/messages output is the following:

Aug 11 18:35:06 cchi-desktop kernel: [ 3447.714061] usb 1-2.4: new full speed USB device using ehci_hcd and address 3
Aug 11 18:35:06 cchi-desktop kernel: [ 3447.836067] usb 1-2.4: configuration #1 chosen from 1 choice
Aug 11 18:35:06 cchi-desktop kernel: [ 3448.112057] gspca: main v2.3.0 registered
Aug 11 18:35:06 cchi-desktop kernel: [ 3448.170206] gspca: probing 093a:2608
Aug 11 18:35:06 cchi-desktop kernel: [ 3448.180041] gspca: probe ok
Aug 11 18:35:06 cchi-desktop kernel: [ 3448.180041] gspca: probing 093a:2608
Aug 11 18:35:06 cchi-desktop kernel: [ 3448.180041] gspca: probing 093a:2608
Aug 11 18:35:06 cchi-desktop kernel: [ 3448.180041] usbcore: registered new interface driver pac7311
Aug 11 18:35:06 cchi-desktop kernel: [ 3448.180041] pac7311: registered
Aug 11 18:35:07 cchi-desktop kernel: [ 3448.724060] usbcore: registered new interface driver snd-usb-audio
Aug 11 18:35:08 cchi-desktop pulseaudio[3943]: alsa-util.c: Device hw:2 doesn't support 44100 Hz, changed to 16000 Hz.
Aug 11 18:35:08 cchi-desktop pulseaudio[3943]: alsa-util.c: Device hw:2 doesn't support 2 channels, changed to 1.
Aug 11 18:44:25 cchi-desktop kernel: [ 4007.040063] gspca: usb_submit_urb [0] err -28
Aug 11 18:44:40 cchi-desktop kernel: [ 4022.040062] gspca: usb_submit_urb [0] err -28

Thanks,
Claudio
Re: libv4l: problem with 2x downscaling + Labtec Webcam 2200
On 08/15/2009 08:53 AM, Németh Márton wrote:
> Hello Hans, I am using your libv4l 0.6.0 [1] together with the gspca_pac7311 driver from Linux kernel 2.6.31-rc4 and with the Labtec Webcam 2200 hardware [2]. I am using svv.c [3] to display the webcam image. When I'm using the webcam at 640x480 the image is displayed correctly. However, when I set the resolution to 320x240, the image is not correct: it contains horizontal lines and is doubled vertically. I guess the conversion from 640x480 is not done; the pixels are just shown as if they were 320x240.

Hi,

This is a known problem in 0.6.0, fixed by this commit:
http://linuxtv.org/hg/~hgoede/libv4l/rev/89fba654c7ea

You can get a snapshot of what will eventually (soonish) become 0.6.1 here:
http://people.atrpms.net/~hdegoede/libv4l-0.6.1-test.tar.gz

Upgrading to this should fix your problems.

Regards,

Hans
Re: Philips SPC230NC: wrong colors / image format?
Hi,

On 07/17/2009 06:28 PM, Andi Drebes wrote:
> Hi! I tried to find a mailing list for the v4l compatibility library, but I didn't find any.

You can use the Linux Media Mailing List linux-media@vger.kernel.org; guess I should put that in the README.

> This is why I'm sending this email directly to you.

No problem.

> Some weeks ago I bought a Philips SPC230NC webcam. It seems that the camera is supported by the pac7311 driver. In order to use the cam in some older applications, I tried out the compatibility library. It almost works; the only problem is that the video is distorted and that the colors are not right. Here's what a keyboard looks like in camorama: http://drebesium.org/~hackbert/Webcam-1247846047.png

Ah, that is a bug in camorama. I've attached a patch which fixes this. I've also added a patch which makes camorama use libv4l directly, so you do not need to do the LD_PRELOAD thingie.

Regards,

Hans

> I'm using a 2.6.30.1 kernel on debian lenny. I tried out libv4l-0.6.0 and libv4l-0.6.1-test. They both provide the same results. As far as the hardware is concerned, lsusb tells me: 093a:262c Pixart Imaging, Inc. Dmesg does not show any errors:
> [ 3216.124322] gspca: main v2.5.0 registered
> [ 3216.128254] gspca: probing 093a:262c
> [ 3216.145196] gspca: probe ok
> I used the following command to start camorama:
> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/src/libv4l-0.6.0/libv4l1/:/usr/src/libv4l-0.6.0/libv4l2/:/usr/src/libv4l-0.6.0/libv4lconvert/ LD_PRELOAD=/usr/src/libv4l-0.6.0/libv4l1/v4l1compat.so camorama
> Do you have any idea what might be wrong? Again, sorry to bug you directly with this. If there's any mailing list or something, I would be glad if you could give me the address. In that case, feel free to ignore the content of this mail.
Thanks in advance, Andi

--- camorama-0.19/src/callbacks.c	2007-09-16 15:36:55.0 +0200
+++ camorama-0.19.new/src/callbacks.c	2008-06-29 22:22:44.0 +0200
@@ -387,9 +387,6 @@
 }
 }
-cam->pixmap = gdk_pixmap_new (NULL, cam->x, cam->y, cam->desk_depth);
-gtk_widget_set_size_request (glade_xml_get_widget (cam->xml, "da"),
- cam->x, cam->y);
 /*
 * if(cam->read == FALSE) {
@@ -441,6 +438,11 @@
 *
 * }
 */
 get_win_info (cam);
+
+cam->pixmap = gdk_pixmap_new (NULL, cam->x, cam->y, cam->desk_depth);
+gtk_widget_set_size_request (glade_xml_get_widget (cam->xml, "da"),
+ cam->x, cam->y);
+
 frame = 0;
 gtk_window_resize (GTK_WINDOW (glade_xml_get_widget (cam->xml, "main_window")), 320,
@@ -520,8 +522,14 @@
 gtk_widget_show (about);
 }
+void
+camorama_filter_color_filter(void* filter, guchar *image, int x, int y, int depth);
+
 static void apply_filters(cam* cam) {
+ /* v4l has reverse rgb order from what camorama expects so call the color
+ filter to fix things up before running the user selected filters */
+ camorama_filter_color_filter(NULL, cam->pic_buf, cam->x, cam->y, cam->depth);
 camorama_filter_chain_apply(cam->filter_chain, cam->pic_buf, cam->x, cam->y, cam->depth);
 #warning FIXME: enable the threshold channel filter
 // if((effect_mask & CAMORAMA_FILTER_THRESHOLD_CHANNEL) != 0)
--- camorama-0.19/src/filter.c	2007-09-16 14:48:50.0 +0200
+++ camorama-0.19.new/src/filter.c	2008-06-29 22:11:42.0 +0200
@@ -151,12 +151,12 @@
 static void camorama_filter_color_init(CamoramaFilterColor* self) {}
-static void
+void
 camorama_filter_color_filter(CamoramaFilterColor* filter, guchar *image, int x, int y, int depth) {
 int i;
 char tmp;
 i = x * y;
- while (--i) {
+ while (i--) {
 tmp = image[0];
 image[0] = image[2];
 image[2] = tmp;
--- camorama-0.19/src/main.c	2007-09-16 15:36:55.0 +0200
+++ camorama-0.19.new/src/main.c	2008-06-29 22:20:04.0 +0200
@@ -224,8 +224,7 @@
 /* get picture attributes */
 get_pic_info (cam);
-// set_pic_info(cam);
-/* set_pic_info(cam); */
+set_pic_info (cam);
 cam->contrast = cam->vid_pic.contrast;
 cam->brightness
= cam->vid_pic.brightness;
 cam->colour = cam->vid_pic.colour;
--- camorama-0.19/src/v4l.c	2007-09-16 14:48:05.0 +0200
+++ camorama-0.19.new/src/v4l.c	2008-06-29 22:20:23.0 +0200
@@ -158,8 +158,8 @@
 if(cam->debug) {
 g_message("SET PIC");
 }
- //cam->vid_pic.palette = VIDEO_PALETTE_RGB24;
- //cam->vid_pic.depth = 24;
+ cam->vid_pic.palette = VIDEO_PALETTE_RGB24;
+ cam->vid_pic.depth = 24;
 //cam->vid_pic.palette = VIDEO_PALETTE_YUV420P;
 if(ioctl(cam->dev, VIDIOCSPICT, &cam->vid_pic) == -1) {
 if(cam->debug) {
@@ -232,6 +232,8 @@
 exit(0);
 }
+ cam->x = cam->vid_win.width;
+ cam->y = cam->vid_win.height;
 }

 void set_buffer(cam * cam)
Re: [PATCH 1/1] gspca: Add sn9c20x subdriver
Hi,

First of all, many many thanks for doing this! There are 4 issues with this driver, 2 of which are blockers:

1) The big one is the use of a custom debugging mechanism. Please use the v4l standard debugging mechanism, which is activated by the kernel config option VIDEO_ADV_DEBUG; use this define to enable / disable the debugging features of this driver, and use the standard VIDIOC_DBG_G_REGISTER and VIDIOC_DBG_S_REGISTER ioctls instead of a sysfs interface. Note I'm not very familiar with these myself; please send any questions on this to the list.

2)
+	switch (sd->sensor) {
+	case SENSOR_OV9650:
+		if (ov9650_init_sensor(gspca_dev) < 0)
+			return -ENODEV;
+		info("OV9650 sensor detected");
+		break;
+	case SENSOR_OV9655:
+		if (ov9655_init_sensor(gspca_dev) < 0)
+			return -ENODEV;
+		info("OV9655 sensor detected");
+		break;
+	case SENSOR_SOI968:
+		if (soi968_init_sensor(gspca_dev) < 0)
+			return -ENODEV;
+		info("SOI968 sensor detected");
+		break;
+	case SENSOR_OV7660:
+		if (ov7660_init_sensor(gspca_dev) < 0)
+			return -ENODEV;
+		info("OV7660 sensor detected");

You are missing a break here! Which I found out because my only sn9c20x cam has an ov7660 sensor.

+	case SENSOR_OV7670:
+		if (ov7670_init_sensor(gspca_dev) < 0)
+			return -ENODEV;
+		info("OV7670 sensor detected");
+		break;

3) My cam works a lot better with the standalone driver than with your gspca version. With your version it shows a Bayer-pattern-ish pattern over the whole picture, as if the Bayer pixel order is off, except that the colors are right, so that is most likely not the cause. I'll investigate this further as time permits.

4) The evdev device creation and handling really belongs in the gspca core, as we can (and should) handle the snapshot button in other drivers too, but this is something which can be fixed after merging.
Thanks & Regards,

Hans
Re: Control IOCTLs handling
Hi,

On 07/13/2009 08:21 PM, Karicheri, Muralidharan wrote:
> Hi, I need to implement some controls for my driver and would like to understand the control ioctl framework available today. I am not very sure how the control ioctls are to be implemented and it is not well defined in the specification. I have provided below my understanding of this set of controls and I would like to hear what you think about it. I see the following controls defined for adjusting brightness, contrast etc.:
>
> V4L2_CID_BRIGHTNESS (integer): Picture brightness, or more precisely, the black level.
> V4L2_CID_CONTRAST (integer): Picture contrast or luma gain.
> V4L2_CID_SATURATION (integer): Picture color saturation or chroma gain.
> V4L2_CID_HUE (integer): Hue or color balance.
>
> I think these controls refer to the YUV color space. The Y (luma) and UV (chroma) signals will be modified by the above controls.

Ack.

> V4L2_CID_DO_WHITE_BALANCE (button): This is an action control. When set (the value is ignored), the device will do a white balance and then hold the current setting. Contrast this with the boolean V4L2_CID_AUTO_WHITE_BALANCE, which, when activated, keeps adjusting the white balance.
> V4L2_CID_RED_BALANCE (integer): Red chroma balance.
> V4L2_CID_BLUE_BALANCE (integer): Blue chroma balance.
>
> My understanding is that these controls are applied to the RGB color space. V4L2_CID_AUTO_WHITE_BALANCE is applicable where the hardware is capable of adjusting the white balance automatically. But V4L2_CID_DO_WHITE_BALANCE is used in conjunction with V4L2_CID_RED_BALANCE and V4L2_CID_BLUE_BALANCE, i.e. the application sets these values and they take effect when V4L2_CID_DO_WHITE_BALANCE is issued. So the driver holds onto the current values until another set of the above commands is issued.

Erm, no. V4L2_CID_DO_WHITE_BALANCE is for hardware whitebalance too, but means "do hardware whitebalance once and then hold the current correction factors". It is a really weird control, and I don't know if we have any drivers using it; it is best ignored.
> The V4L2_CID_RED_BALANCE / V4L2_CID_BLUE_BALANCE controls are meant to be applied immediately. But one question I have is (if the above is correct), why is there no V4L2_CID_GREEN_BALANCE?

I guess these controls were introduced for some hardware which had a fixed green gain?

> I don't see any control IDs available for the Bayer RGB color space. In our video hardware, there is a set of gain values that can be applied to the Bayer RGB data. We can apply them individually to the R, Gr, Gb or B color components. So I think we need to have 4 more controls defined for doing white balancing in the Bayer RGB color space, applicable to sensors (like the MT9T031) and image tuning hardware like the VPFE CCDC IPIPE. Should we define the following new white balance (WB) controls for the Bayer RGB color space?
>
> V4L2_CID_BAYER_RED_BALANCE (integer): Bayer Red balance.
> V4L2_CID_BAYER_BLUE_BALANCE (integer): Bayer Blue balance.
> V4L2_CID_BAYER_GREEN_R_BALANCE (integer): Bayer Gr balance.
> V4L2_CID_BAYER_GREEN_B_BALANCE (integer): Bayer Gb balance.
>
> There is also an offset value defined per color, which is like adjusting the black level in the video image data; it is subtracted from the image byte. What would you call this? Should we define a new control, V4L2_CID_BAYER_OFFSET?

I can't help but wonder if we should export all of these as controls. One can probably export about 90% of the registers of a sensor as controls, but then why write a driver at all, why not just give the user an application to set the registers themselves? When it comes to controls, less is more IMHO. So the question is: can't we give these registers a sensible default setting and leave it at that?
And currently the answer to that is yes. There currently are 2 ways to do whitebalance for sensors under Linux:

1) The sensor does it in hardware (using per color gains like the above).

2) libv4l does whitebalancing in software. In this case a software gain is used, as we can control that very precisely, while libv4l does not know the exact gain factor of per color gains exported through controls (and has no way to find out). So we just apply a software per color gain, which we can control exactly.

So currently the best thing to do is either:

a) make the sensor do hardware whitebalance if it can (much preferred), or:

b) set all the per color gains to their default / middle position and handle the whitebalancing fully in software.

This applies even more to the per color offsets; I really see little use in exporting these to the end user. You should look at controls as knobs the end user may want to tweak; if it is not something the end user would want to / should tweak, it should not be a control.

Regards,

Hans
Re: RFC: howto handle driver changes which require libv4l x.y ?
Hi,

On 07/07/2009 04:35 PM, Mauro Carvalho Chehab wrote:
> On Tue, 7 Jul 2009 15:55:59 +0200, Erik Andrén erik.and...@gmail.com wrote:
>> 2009/7/7 Hans de Goede hdego...@redhat.com:
>>> Hi All,
>>> So recently I've hit 2 issues where kernel side fixes need to go hand in hand with libv4l updates to not cause regressions. First let's discuss the 2 cases:
>>> 1) The pac207 driver currently limits the framerate (and thus the minimum exposure time) because at higher framerates the cam starts using a higher compression, which we could not decompress. Thanks to Bertrik Sikken we can now handle the higher compression. So now I really want to enable the higher framerates, as those are needed to make the cam work properly in full daylight. But if I do this, things will regress for people with an older libv4l, as that won't be able to decompress the frames.
>>> 2) Several zc3xxx cams have a timing issue between the bridge and the sensor (the windows drivers have the same issue) which makes them do only 320x236 instead of 320x240. Currently we report their resolution to userspace as 320x240, leading to a bar of noise at the bottom of the screen. The fix here obviously is to report the real effective resolution to userspace, but this will cause regressions for apps which blindly assume 320x240 is available (such as skype). The latest libv4l will make the apps happy again by giving them 320x240 by adding small black borders.
>>> Now I see 2 solutions here:
>>> a) Just make the changes; seen from the kernel side these are most certainly bugfixes. I tend towards this for case 2)
>>> b) Come up with an API to tell the libv4l version to the kernel and make these changes in the drivers conditional on the libv4l version
>> Solution b) sounds messy and will probably lead to a lot of error prone glue code in the kernel. Fast-forward a couple of libv4l releases and you will have a nightmare maintainability scenario.
>> If people run an old libv4l with a new kernel and run into problems, just tell them to upgrade their libv4l version.
> (b) seems a very bad hack, IMO. Between the two, I choose (a).

Ok, so (a) it is then. I'll do a libv4l-0.6.0 release today, and put the changes that depend upon libv4l-0.6.0 in my tree as time permits; they will then go into 2.6.32 eventually, which should put enough time between the libv4l release and the kernel release for most people to have the newer libv4l.

Regards,

Hans
libv4l release: 0.6.0: the upside down release
Hi All,

I'm very happy to announce the first release of the next stable series: libv4l-0.6.0.

This release features the following familiar features from the previous 0.5.9x test releases:
* Software whitebalancing
* Software automatic gain and exposure for cams which lack this in hardware
* Software gamma control
* Fake v4l2 controls to control all these
* Software flipping controls

And as a new feature it now has an extended list of laptops whose camera modules (mostly uvc) are known to be mounted upside down in the frame, and it will automatically correct the image for this. And of course the standard addition of support for a few new camera output formats.

libv4l-0.6.0
------------
* Recognize disabled controls and replace them with fake equivalents where available
* Add support for decompressing ov511 and ov518 JPEG, by piping data through an external helper, as I've failed to contact Mark W. McClelland to get permission to relicense the code. If you know a working email address for Mark W. McClelland, please let me know.
* Add tons of laptop models to the upside down devices table
* Support for the rgb565 source format, by Mauro Carvalho Chehab
* Many bug fixes (see the mercurial tree for details)
* Improved pac207 decompression code to also support the higher compression modes of the pac207, which enables us to use higher framerates. Many many thanks to Bertrik Sikken for figuring the decompression out!

Get it here: http://people.atrpms.net/~hdegoede/libv4l-0.6.0.tar.gz

Regards,

Hans
RFC: howto handle driver changes which require libv4l x.y ?
Hi All,

So recently I've hit 2 issues where kernel side fixes need to go hand in hand with libv4l updates to not cause regressions. First let's discuss the 2 cases:

1) The pac207 driver currently limits the framerate (and thus the minimum exposure time) because at higher framerates the cam starts using a higher compression, which we could not decompress. Thanks to Bertrik Sikken we can now handle the higher compression. So now I really want to enable the higher framerates, as those are needed to make the cam work properly in full daylight. But if I do this, things will regress for people with an older libv4l, as that won't be able to decompress the frames.

2) Several zc3xxx cams have a timing issue between the bridge and the sensor (the windows drivers have the same issue) which makes them do only 320x236 instead of 320x240. Currently we report their resolution to userspace as 320x240, leading to a bar of noise at the bottom of the screen. The fix here obviously is to report the real effective resolution to userspace, but this will cause regressions for apps which blindly assume 320x240 is available (such as skype). The latest libv4l will make the apps happy again by giving them 320x240 by adding small black borders.

Now I see 2 solutions here:

a) Just make the changes; seen from the kernel side these are most certainly bugfixes. I tend towards this for case 2)

b) Come up with an API to tell the libv4l version to the kernel and make these changes in the drivers conditional on the libv4l version

So this is my dilemma; your input is greatly appreciated.

Regards,

Hans
Re: [PATCH] libv4l: add support for RGB565 format
Hi,

Thanks for the patch! I've applied it to my tree, and it will be in the soon-to-be-released libv4l-0.6.0.

Regards,

Hans

On 07/03/2009 04:19 AM, Mauro Carvalho Chehab wrote:
> Currently, the em28xx driver outputs webcams only in the RGB565 format. However, several webcam applications don't support this format. In order to properly work with those applications, an RGB565 handler should be added to libv4l.
>
> Tested with a Silvercrest 1.3 mpix with v4l2grab (V4L2, with native libv4l support) and two LD_PRELOAD applications: camorama (V4L1 API) and skype (using compat32).
>
> Signed-off-by: Mauro Carvalho Chehab mche...@redhat.com

diff --git a/v4l2-apps/libv4l/libv4lconvert/libv4lconvert-priv.h b/v4l2-apps/libv4l/libv4lconvert/libv4lconvert-priv.h
--- a/v4l2-apps/libv4l/libv4lconvert/libv4lconvert-priv.h
+++ b/v4l2-apps/libv4l/libv4lconvert/libv4lconvert-priv.h
@@ -184,6 +184,15 @@ void v4lconvert_swap_rgb(const unsigned
 void v4lconvert_swap_uv(const unsigned char *src, unsigned char *dst,
 const struct v4l2_format *src_fmt);
+void v4lconvert_rgb565_to_rgb24(const unsigned char *src, unsigned char *dest,
+ int width, int height);
+
+void v4lconvert_rgb565_to_bgr24(const unsigned char *src, unsigned char *dest,
+ int width, int height);
+
+void v4lconvert_rgb565_to_yuv420(const unsigned char *src, unsigned char *dest,
+ const struct v4l2_format *src_fmt, int yvu);
+
 void v4lconvert_spca501_to_yuv420(const unsigned char *src, unsigned char *dst,
 int width, int height, int yvu);
diff --git a/v4l2-apps/libv4l/libv4lconvert/libv4lconvert.c b/v4l2-apps/libv4l/libv4lconvert/libv4lconvert.c
--- a/v4l2-apps/libv4l/libv4lconvert/libv4lconvert.c
+++ b/v4l2-apps/libv4l/libv4lconvert/libv4lconvert.c
@@ -46,6 +46,7 @@ static const struct v4lconvert_pixfmt su
 { V4L2_PIX_FMT_YUYV, 0 },
 { V4L2_PIX_FMT_YVYU, 0 },
 { V4L2_PIX_FMT_UYVY, 0 },
+ { V4L2_PIX_FMT_RGB565, 0 },
 { V4L2_PIX_FMT_SN9C20X_I420, V4LCONVERT_NEEDS_CONVERSION },
 { V4L2_PIX_FMT_SBGGR8, V4LCONVERT_NEEDS_CONVERSION },
 {
 V4L2_PIX_FMT_SGBRG8, V4LCONVERT_NEEDS_CONVERSION },
@@ -787,6 +788,23 @@ static int v4lconvert_convert_pixfmt(str
 }
 break;
+case V4L2_PIX_FMT_RGB565:
+ switch (dest_pix_fmt) {
+ case V4L2_PIX_FMT_RGB24:
+ v4lconvert_rgb565_to_rgb24(src, dest, width, height);
+ break;
+ case V4L2_PIX_FMT_BGR24:
+ v4lconvert_rgb565_to_bgr24(src, dest, width, height);
+ break;
+ case V4L2_PIX_FMT_YUV420:
+ v4lconvert_rgb565_to_yuv420(src, dest, fmt, 0);
+ break;
+ case V4L2_PIX_FMT_YVU420:
+ v4lconvert_rgb565_to_yuv420(src, dest, fmt, 1);
+ break;
+ }
+ break;
+
 case V4L2_PIX_FMT_RGB24:
 switch (dest_pix_fmt) {
 case V4L2_PIX_FMT_BGR24:
diff --git a/v4l2-apps/libv4l/libv4lconvert/rgbyuv.c b/v4l2-apps/libv4l/libv4lconvert/rgbyuv.c
--- a/v4l2-apps/libv4l/libv4lconvert/rgbyuv.c
+++ b/v4l2-apps/libv4l/libv4lconvert/rgbyuv.c
@@ -1,8 +1,10 @@
 /*
 # RGB <-> YUV conversion routines
+# (C) 2008 Hans de Goede j.w.r.dego...@hhs.nl
-# (C) 2008 Hans de Goede j.w.r.dego...@hhs.nl
+# RGB565 conversion routines
+# (C) 2009 Mauro Carvalho Chehab mche...@redhat.com
 # This program is free software; you can redistribute it and/or modify
 # it under the terms of the GNU Lesser General Public License as published by
@@ -472,3 +474,103 @@ void v4lconvert_swap_uv(const unsigned c
 src += src_fmt->fmt.pix.bytesperline / 2;
 }
 }
+
+void v4lconvert_rgb565_to_rgb24(const unsigned char *src, unsigned char *dest,
+ int width, int height)
+{
+ int j;
+ while (--height >= 0) {
+ for (j = 0; j < width; j++) {
+ unsigned short tmp = *(unsigned short *)src;
+
+ /* Original format: rrrrrggg gggbbbbb */
+ *dest++ = 0xf8 & (tmp >> 8);
+ *dest++ = 0xfc & (tmp >> 3);
+ *dest++ = 0xf8 & (tmp << 3);
+
+ src += 2;
+ }
+ }
+}
+
+void v4lconvert_rgb565_to_bgr24(const unsigned char *src, unsigned char *dest,
+ int width, int height)
+{
+ int j;
+ while (--height >= 0) {
+ for (j = 0; j < width; j++) {
+ unsigned short tmp = *(unsigned short *)src;
+
+ /* Original format: rrrrrggg gggbbbbb */
+ *dest++ = 0xf8 & (tmp << 3);
+ *dest++ = 0xfc & (tmp >> 3);
+ *dest++ = 0xf8 & (tmp >> 8);
+
+ src += 2;
+ }
+ }
+}
+
+void v4lconvert_rgb565_to_yuv420(const unsigned char *src, unsigned char *dest,
+ const struct v4l2_format *src_fmt, int yvu)
+{
+ int x, y;
+ unsigned short tmp;
+ unsigned char *udest, *vdest;
+ unsigned r[4], g[4], b[4];
+ int avg_src[3];
+
+ /* Y */
+ for (y = 0; y < src_fmt->fmt.pix.height; y++) {
+ for (x = 0; x < src_fmt->fmt.pix.width; x++) {
+ tmp = *(unsigned short *)src;
+ r[0] = 0xf8 & (tmp << 3);
+ g[0] = 0xfc & (tmp >> 3);
+ b[0] = 0xf8 & (tmp >> 8);
+ RGB2Y(r[0], g[0], b[0], *dest++);
+ src += 2;
+ }
+ src += src_fmt->fmt.pix.bytesperline - 2 *
Re: [Fwd: TV card donation]
Hi,

Stefan, do you remember which driver that card uses, and/or can you put it in a machine and run lspci? Once I have that information I'll send a mail to the linux-media list to see whether a v4l contributor is interested. Maybe it is easier if you send the mail directly yourself: linux-media@vger.kernel.org is open to non-subscribers.

Regards,

Hans

On 07/02/2009 09:31 AM, S.A. Hartsuiker wrote:

Hi Hans,

On 07/01/2009 09:57 PM, Hans Verkuil wrote:

On Monday 29 June 2009 10:33:27 Hans de Goede wrote:

Hi Hans, Last weekend (LinuxTag Berlin) I ran into a Fedora contributor who has a TV card that does not work under Linux. The bridge is already supported, so it is probably just a matter of adding a board definition; if I remember correctly the card (a PCI-E card) has a cx23885 bridge. The owner of the card would like to donate it to a v4l developer so it can get supported. Is this something for you?

I'm not really a cx88-driver expert, and certainly no DVB expert. Ask this on the linux-media mailing list; there will surely be someone who wants that card. But do check carefully first what kind of card it actually is: PCI-E is something completely different from an ExpressCard.

Indeed. It is an ExpressCard (the form factor of a PCMCIA slot, but narrower, in this case 34mm). I have it in front of me now, and the card says it is an 'AVerMedia AVerTV Hybrid Express'. It claims to be a DVB-T device. The box is complete, including all cables, the manual and the original CD-ROM with Windows drivers.

Regards, Stefan

Greetings, Hans

And if so, can he send it directly to a Dutch address of yours, or is it better if he sends it to me and I hand it to you when we see each other on Monday July 13? If not, who else can we make happy with it?

Regards, Hans

-------- Original Message --------
Subject: TV card donation
Date: Sat, 27 Jun 2009 12:49:37 +0200
From: S.A.
Hartsuiker ba...@fedoraproject.org
To: hdego...@redhat.com

Hi, if you give me a shipping address I'll send you that TV card we talked about during LinuxNacht. Off the top of my head it is an Avermedia Hybrid Expresscard, though there are several different versions of those.

Regards,
Stefan Hartsuiker
Re: (very) wrong picture with sonixj driver and 0c45:6128
On 06/30/2009 12:46 PM, Jean-Francois Moine wrote: On Mon, 29 Jun 2009 20:43:29 +0200 (CEST) eric.patur...@orange.fr wrote: I am trying to use an NGS Skull webcam with the gspca sonixj driver. I enclose a screen copy, so you can see what I mean: the image is flattened vertically, 25% is missing on the left, and the color is all wrong, over-bright (no matter how much I try to correct it with v4l_ctl). The tests have been done with the latest mercurial version of the v4l drivers (from Sunday evening) on 2.6.29.4. I also tried it on 2 other computers (2.6.28.2 and 2.6.27.4), with the same results. [snip] Any idea what is going on? I can provide a more detailed log if needed, by setting the debug param in gspca_main. Hello Eric, Looking at the ms-win driver, it seems that the bridge is not the right one. Can you try to change it? This is done in the mercurial tree by editing the file linux/drivers/media/video/gspca/sonixj.c and changing line 2379 from: {USB_DEVICE(0x0c45, 0x6128), BSI(SN9C110, OM6802, 0x21)}, /*sn9c325?*/ to: {USB_DEVICE(0x0c45, 0x6128), BSI(SN9C120, OM6802, 0x21)}, /*sn9c325*/ Don't forget to do 'make', 'make install' and 'rmmod gspca_sonixj'... Hi, I happen to own a cam with the same USB id myself, and it shows the same issues as described by Eric. Changing the bridge id does not help, I'm afraid. I'm afraid I don't have the time to fix this in the near future (it is on my to-do list but I have no idea when I'll get around to it), but I'm more than willing to test any fixes. Regards, Hans
Re: PULL request - http://linuxtv.org/hg/~hgoede/gspca
Hi, On 06/22/2009 04:29 AM, Mauro Carvalho Chehab wrote: snip luvcview is a somewhat limited app which only works with uvc cams; even libv4l cannot help it, as it requests uvc-specific formats to which libv4l cannot convert. What application works better with libv4l? Btw, it would be nice to have some apps linked with libv4l in the epel repository ;) I personally do most of my testing with cheese. I have also used a (patched, bugfixed) camorama in combination with libv4l1 for v4l1 compat, as camorama displays fps, which can be useful when doing driver development. Regards, Hans
Re: PULL request - http://linuxtv.org/hg/~hgoede/gspca
On 06/21/2009 02:39 AM, Mauro Carvalho Chehab wrote: Em Sat, 20 Jun 2009 15:26:25 +0200 Hans de Goede hdego...@redhat.com escreveu: On 06/20/2009 12:51 PM, Mauro Carvalho Chehab wrote: Em Wed, 17 Jun 2009 23:54:40 +0200 Hans de Goede hdego...@redhat.com escreveu: Support for the st6422 bridge + sensor ! Give it a try, I know now you have a cam which uses this bridge :) When you try it, be sure to use the latest libv4l (I just updated my libv4l tree); this enables (software) automatic control of the gain and exposure, for a decent image in most lighting conditions. Didn't work :( See the logs below. snip $ dmesg STV06xx: Probing for a stv06xx device gspca: probing 046d:08f6 usbcore: registered new interface driver STV06xx STV06xx: registered gspca: usb_submit_urb [0] err -28 gspca: no transfer endpoint found err -28 is ENOSPC, which is returned by usb_submit_urb when the required bandwidth for the isoc transfer is not available. With most cams we then automatically fall back to an altsetting which requires less bandwidth, but the st6422 has only one, hence the gspca: no transfer endpoint found error. There are 3 possible causes for this: 1) You are using the device through a usb 2.0 hub; this should work but does not due to a bug in the usb subsystem of the kernel, which I have reported but which most likely won't be fixed OK. On a direct port: OK, better :) mplayer does not recognize /dev/video0 as a v4l url, so it will just try to use plain open and then read, which is causing the errors, as read() on a v4l device wants a buffer large enough to hold 1 frame in one read. Regards, Hans [mche...@pedra ~]$ mplayer /dev/video0 MPlayer dev-SVN-r27514-4.1.2 (C) 2000-2008 MPlayer Team CPU: Intel(R) Xeon(R) CPU E5420 @ 2.50GHz (Family: 6, Model: 23, Stepping: 6) CPUflags: MMX: 1 MMX2: 1 3DNow: 0 3DNow2: 0 SSE: 1 SSE2: 1 Compiled with runtime CPU detection. Playing /dev/video0.
libv4l2: error converting / decoding frame data: v4l-convert: error destination buffer too small [the above message repeated for every frame read] Exiting... (End of file) Cheers, Mauro
Re: PULL request - http://linuxtv.org/hg/~hgoede/gspca
Hi, On 06/21/2009 02:39 AM, Mauro Carvalho Chehab wrote: Em Sat, 20 Jun 2009 15:26:25 +0200 Hans de Goede hdego...@redhat.com escreveu: err -28 is ENOSPC, which is returned by usb_submit_urb when the required bandwidth for the isoc transfer is not available. With most cams we then automatically fall back to an altsetting which requires less bandwidth, but the st6422 has only one, hence the gspca: no transfer endpoint found error. There are 3 possible causes for this: 1) You are using the device through a usb 2.0 hub; this should work but does not due to a bug in the usb subsystem of the kernel, which I have reported but which most likely won't be fixed Hi, This morning I had a bit of inspiration: the st6422 has a register to which the current isoc packet size should be written. So I thought, hmm, maybe the whole one-altsetting-which-requests-max-bandwidth thing is a bit bogus, and instead it has a variable (so not fixed by the altsetting) packet size, and indeed it has. Attached is a patch which: 1) makes it work through a usb 2.0 hub (working around the cannot-alloc-max-isoc-bandwidth-through-a-usb-2.0-hub bug by falling back in speed) 2) makes the mic and video work at the same time Unfortunately 1 + 2 combined do not work; this is clearly a bug in the usb subsystem :( Regards, Hans p.s. Jean-Francois Moine, can you please take a look at this patch and provide feedback? It also makes changes to the gspca core.

diff -r 2899ad868fc6 linux/drivers/media/video/gspca/gspca.c
--- a/linux/drivers/media/video/gspca/gspca.c	Thu Jun 18 19:31:36 2009 +0200
+++ b/linux/drivers/media/video/gspca/gspca.c	Sun Jun 21 10:14:28 2009 +0200
@@ -525,6 +525,9 @@
 	/* See paragraph 5.9 / table 5-11 of the usb 2.0 spec. */
 	psize = (psize & 0x07ff) * (1 + ((psize >> 11) & 3));
+	/* Variable packet size overriding alt setting ? */
+	if (gspca_dev->isoc_pkt_sz)
+		psize = gspca_dev->isoc_pkt_sz;
 	npkt = gspca_dev->cam.npkt;
 	if (npkt == 0)
 		npkt = 32;		/* default value */
@@ -595,6 +598,7 @@
  */
 static int gspca_init_transfer(struct gspca_dev *gspca_dev)
 {
+	struct cam *cam = &gspca_dev->cam;
 	struct usb_host_endpoint *ep;
 	int n, ret;
@@ -609,9 +613,9 @@
 	/* set the higher alternate setting and
 	 * loop until urb submit succeeds */
 	gspca_dev->alt = gspca_dev->nbalt;
+	gspca_dev->isoc_pkt_sz = cam->max_isoc_pkt_sz;
+	ep = get_ep(gspca_dev);
 	for (;;) {
-		PDEBUG(D_STREAM, "init transfer alt %d", gspca_dev->alt);
-		ep = get_ep(gspca_dev);
 		if (ep == NULL) {
 			ret = -EIO;
 			goto out;
@@ -648,7 +652,17 @@
 		if (ret == -ENOSPC) {
 			msleep(20);	/* wait for kill
 					 * complete */
-			break;	/* try the previous alt */
+			if (gspca_dev->isoc_pkt_sz) {
+				/* Try smaller packet size */
+				gspca_dev->isoc_pkt_sz -= 100;
+				if (gspca_dev->isoc_pkt_sz <
+				    cam->min_isoc_pkt_sz)
+					goto out;
+			} else {
+				/* try the previous alt */
+				ep = get_ep(gspca_dev);
+			}
+			break;
 		}
 		goto out;
 	}
diff -r 2899ad868fc6 linux/drivers/media/video/gspca/gspca.h
--- a/linux/drivers/media/video/gspca/gspca.h	Thu Jun 18 19:31:36 2009 +0200
+++ b/linux/drivers/media/video/gspca/gspca.h	Sun Jun 21 10:14:28 2009 +0200
@@ -57,6 +57,12 @@
 	u8 bulk;	/* image transfer by 0:isoc / 1:bulk */
 	u8 npkt;	/* number of packets in an ISOC message
 			 * 0 is the default value: 32 packets */
+	/* min / max isoc packet size for cameras which have a variable
+	   packet size; when this is the case BOTH must be set to a non-zero
+	   value, the wMaxPacketSize of the altsetting will be ignored and
+	   the highest alt setting will be used */
+	int min_isoc_pkt_sz;
+	int max_isoc_pkt_sz;
 	u32 input_flags;	/* value for ENUM_INPUT status flags */
 };
@@ -135,6 +141,7 @@
 #define USB_BUF_SZ 64
 	__u8 *usb_buf;	/* buffer for USB exchanges */
 	struct urb *urb[MAX_NURBS];
+	int isoc_pkt_sz;
Re: PULL request - http://linuxtv.org/hg/~hgoede/gspca
Hi, On 06/21/2009 02:10 PM, Mauro Carvalho Chehab wrote: Em Sun, 21 Jun 2009 08:45:03 +0200 Hans de Goede hdego...@redhat.com escreveu: On 06/21/2009 02:39 AM, Mauro Carvalho Chehab wrote: Em Sat, 20 Jun 2009 15:26:25 +0200 Hans de Goede hdego...@redhat.com escreveu: On 06/20/2009 12:51 PM, Mauro Carvalho Chehab wrote: Em Wed, 17 Jun 2009 23:54:40 +0200 Hans de Goede hdego...@redhat.com escreveu: Support for the st6422 bridge + sensor ! Give it a try, I know now you have a cam which uses this bridge :) When you try it, be sure to use the latest libv4l (I just updated my libv4l tree); this enables (software) automatic control of the gain and exposure, for a decent image in most lighting conditions. Didn't work :( See the logs below. snip $ dmesg STV06xx: Probing for a stv06xx device gspca: probing 046d:08f6 usbcore: registered new interface driver STV06xx STV06xx: registered gspca: usb_submit_urb [0] err -28 gspca: no transfer endpoint found err -28 is ENOSPC, which is returned by usb_submit_urb when the required bandwidth for the isoc transfer is not available. With most cams we then automatically fall back to an altsetting which requires less bandwidth, but the st6422 has only one, hence the gspca: no transfer endpoint found error. There are 3 possible causes for this: 1) You are using the device through a usb 2.0 hub; this should work but does not due to a bug in the usb subsystem of the kernel, which I have reported but which most likely won't be fixed OK. On a direct port: OK, better :) mplayer does not recognize /dev/video0 as a v4l url, so it will just try to use plain open and then read, which is causing the errors, as read() on a v4l device wants a buffer large enough to hold 1 frame in one read. Yes, I know, but the v4l1 and v4l2 calls also fail: $ export LD_PRELOAD=/usr/local/lib/libv4l/v4l1compat.so $ mplayer -tv driver=v4l2 tv:// ... Cannot find codec matching selected -vo and video format 0x32315659. Read DOCS/HTML/en/codecs.html!
$ mplayer -tv driver=v4l tv:// The 'outfmt' of 'Planar YV12' is likely not supported by your card Hmm, this works fine for me; perhaps you are using an older libv4l. mplayer did not work with some older libv4l versions, as it wants YV12 and those versions only did YU12. When trying a new libv4l, make sure you don't have an older version elsewhere. I notice that your LD_PRELOAD points to /usr/local; note that this only specifies where the compat wrapper gets loaded from (which hasn't changed in ages). The compat wrapper itself is dynamically linked, so if you have an older libv4l in /usr/lib, chances are that that one actually gets used. I also tested luvcview 0.2.5, without success: ERROR: Requested frame format MJPG is not available and no fallback format was found. luvcview is a somewhat limited app which only works with uvc cams; even libv4l cannot help it, as it requests uvc-specific formats to which libv4l cannot convert. Hmm... in a moment of inspiration, I tested it with ekiga, and it properly worked! Great, that supports my older-libv4l theory :) Now, I just want to find a way for it to work with the applications I use ;) See above. I'll try to test the patches you've sent me for the hub later Thanks. Regards, Hans
PULL request - http://linuxtv.org/hg/~hgoede/gspca
Hi Mauro, As requested: On 06/18/2009 12:44 PM, Mauro Carvalho Chehab wrote: Also, checkpatch is warning about a few problems in the patches. Could you please create another tree, directly based on mine, fix the coding styles and send another pull request? I've rebased my tree on your latest and fixed the coding style issues, so please pull from: http://linuxtv.org/hg/~hgoede/gspca For: -ov511(+) support -st6422 support -ov519/ov518 fixes -sonixj 0c45:613e support -sonixj fixes -mark v4l1 ov511 driver deprecated -mark v4l1 quickcam_messenger driver deprecated Thanks, Hans
PULL request - http://linuxtv.org/hg/~hgoede/gspca
Hi Mauro, Can you please pull from: http://linuxtv.org/hg/~hgoede/gspca I asked JF Moine a couple of days ago whether he wanted this to go through his tree or directly, but have not received an answer; as there is one important bugfix in this tree, I'm now asking you to pull it directly. For the following: 1) Fix a NULL pointer dereference introduced by changes from your recent pull from JF Moine's tree 2) Add support for ov511(+) and ov518(+) cams 3) Various bugfixes for ov519 based cams Regards, Hans p.s. Given that I'm currently doing quite a bit of gspca work, I think it's best for the future if I just send pull requests to you directly. Is that OK?
Convert cpia driver to v4l2, drop parallel port version support?
Hi, I have recently been buying second hand usb webcams left and right; one of them (a Creative, unknown model) uses the cpia1 chipset and works with the v4l1 driver currently in the kernel. One of these days I would like to convert it to a v4l2 driver using gspca as a basis; this however will cause us to lose parallel port support (that, or we need to keep the old code around for the parallel port version). I personally think that losing support for the parallel port version is ok given that the parallel port itself is rapidly disappearing. What do you think? Regards, Hans
Re: Convert cpia driver to v4l2, drop parallel port version support?
On 06/17/2009 09:43 AM, Hans Verkuil wrote: Hi, I have recently been buying second hand usb webcams left and right; one of them (a Creative, unknown model) uses the cpia1 chipset and works with the v4l1 driver currently in the kernel. One of these days I would like to convert it to a v4l2 driver using gspca as a basis; this however will cause us to lose parallel port support (that, or we need to keep the old code around for the parallel port version). I personally think that losing support for the parallel port version is ok given that the parallel port itself is rapidly disappearing. What do you think? I agree wholeheartedly. If we remove pp support, then we can also remove the bw-qcam and c-qcam drivers since they too use the parallel port. Ok :) BTW, I also have a cpia1 camera available for testing. I can also test ov511 (I saw that you added support for that to gspca). Ditto for the stv680 and w9968cf. Note that I can mail these devices to you if you want to work on integrating them into gspca. I'm pretty sure I won't have time for that myself. Yes, I want to work on integrating them into gspca (as time permits). If you could mail them to me that would be great! Esp. the w9968cf one; once that is moved to gspca, we can get rid of the entire ovcamchip stuff (eventually it would be good to move to a model where the sensor drivers are separated again, but I'm waiting to see what comes out of the soc / ov7660 discussion for this). I'll send my postal address in a private mail. Regards, Hans
Re: Convert cpia driver to v4l2, drop parallel port version support?
On 06/17/2009 11:56 AM, Mauro Carvalho Chehab wrote: Em Wed, 17 Jun 2009 09:43:50 +0200 (CEST) Hans Verkuil hverk...@xs4all.nl escreveu: I have recently been buying second hand usb webcams left and right; one of them (a Creative, unknown model) uses the cpia1 chipset and works with the v4l1 driver currently in the kernel. One of these days I would like to convert it to a v4l2 driver using gspca as a basis; this however will cause us to lose parallel port support (that, or we need to keep the old code around for the parallel port version). I personally think that losing support for the parallel port version is ok given that the parallel port itself is rapidly disappearing. What do you think? I agree wholeheartedly. If we remove pp support, then we can also remove the bw-qcam and c-qcam drivers since they too use the parallel port. Maybe I'm too nostalgic, but those are the first V4L drivers. It would be fun to keep supporting them with the V4L2 API ;) That said, while it is probably not that hard to develop a gspca-pp driver, I'm not against removing parallel port support, or even removing those drivers for technical reasons, like the end of V4L1 drivers. Looking at the remaining V4L1 drivers, we have: ov511 - already implemented with V4L2 in gspca. Can easily be removed; Yip, this one is a done deal :) se401, stv680, usbvideo, vicam - USB V4L1 drivers. IMO, it is valuable to convert them to gspca; I've recently bought a (second hand) stv680 cam; I haven't seriously tested it yet, so I don't know if it works with the v4l1 driver. I agree it would be good to convert these, and I can work on this as time permits, but I won't be converting code I don't have HW to test with. As for usbvideo, that supports (amongst others) the st6422 (from the out of tree qc-usb-messenger driver), but only one usb-id ??. I'm currently finishing up adding st6422 support to gspca (with all known usb-ids); I have 2 different cams to test this with.
And indeed, as mentioned in another mail, we should also convert the w9968cf. cpia2, pwc - support both the V4L1 and V4L2 APIs. It shouldn't be hard to convert them to video_ioctl2 and remove the V4L1 API. I have a pwc cam, so I can test changes for pwc. Btw, the current pwc oopses rather badly (GPF) when unplugging the cam; I'll dig into this. stradis - a saa7146 V4L1-only driver - I never understood this one well, since there is already another saa7146 driver running V4L2, used by mxb, hexium_gemini and hexium_orion. To make things worse, stradis, mxb and hexium_orion register for the same PCI device (the generic saa7146 PCI ID). If nobody volunteers to convert and test it with V4L2, then maybe we can just remove it. The better conversion would probably be to use the V4L2 support in the saa7146 driver. arv - seems to be a VGA output driver - only implements 3 ioctls: VIDIOCGCAP and VIDIOCGWIN/VIDIOCSWIN. It shouldn't be hard to convert it to V4L2. I'm not sure if this is still used in practice. bw-qcam, pms, c-qcam, cpia, w9966 - very old drivers that use direct io and/or parport; IMO, after having all USB IDs for se401, stv680, usbvideo and vicam devices supported by a V4L2 driver, we can just remove the V4L1 ioctls from cpia2 and pwc, and the drivers that then still remain using only the legacy API can be dropped. Anything more converted will be a bonus. Big +1; having dug through many of these old drivers to convert them, they all seem rather crufty and ugly, and having them gone would be good.
While cleaning cruft, I would also like to see the following v4l2 drivers go away in time. They are all from the same author (who mostly borrowed the reverse engineering work from the original out of tree gspca), he does not maintain them, and they all support cams also supported by the new gspca: zc0301 - only supports one usb-id, which has not yet been tested with gspca; it used to claim a lot more usb-ids but didn't actually work with those, as it only supported the bridge, not the sensor - remove it now? et61x251 - only supports this bridge in combination with one type of sensor, whereas gspca supports 2 types of sensors. gspca support is untested AFAIK. - ? sn9c102 - supports a large number of cams also supported by gspca's sonixb / sonixj drivers; we're using #ifdef macros to detect if both are being built at the same time, so that the usb-ids are included in only one of the 2. As seems normal for drivers from this author, the driver used to claim a lot of usb-ids it couldn't actually work with, as it only supported the bridge, not the sensor. We've removed all those and are now slowly moving the remaining usb-ids over to gspca as things get tested with gspca. - Keep moving usb-ids over, then when only a few are left, drop it. Regards, Hans
Re: Convert cpia driver to v4l2, drop parallel port version support?
Hi, On 06/17/2009 05:23 PM, Mauro Carvalho Chehab wrote: Em Wed, 17 Jun 2009 16:41:23 +0200 Hans de Goedehdego...@redhat.com escreveu: Hi all, On 06/17/2009 04:28 PM, Mauro Carvalho Chehab wrote: Em Wed, 17 Jun 2009 12:59:59 +0200 Hans de Goedehdego...@redhat.com escreveu: snip As for usbvideo that supports (amongst others) the st6422 (from the out of tree qc-usb-messenger driver), but only one usb-id ??. I'm currently finishing up adding st6422 support to gspca (with all known usb-id's), I have 2 different cams to test this with. I have here one Logitech quickcam. There are several variants, and the in-tree and out-tree drivers support different models. I can test it here and give you a feedback. However, I don't have the original driver for it. Ok, what is its usb id (they tend to be unique for Logitech cams) ? The one I have is this one: Bus 005 Device 003: ID 046d:08f6 Logitech, Inc. QuickCam Messenger Plus This is supported by one quickcam driver. Ah, that is one of the 2 models I have access to, so I can promise you that one will work fine with the new st6422 support I'm doing. zc0301 only supports one usb-id which has not yet been tested with gspca, used to claim a lot more usb-id's but didn't actually work with those as it only supported the bridge, not the sensor - remove it now ? I have one zc0301 cam that works with this driver. The last time I checked, it didn't work with gspca. I'll double check. Ok, let me know how it goes. The zc0301 camera is this one: Bus 005 Device 002: ID 046d:08ae Logitech, Inc. QuickCam for Notebooks zc0301 driver says this: [98174.672464] zc0301: V4L2 driver for ZC0301[P] Image Processor and Control Chip v1:1.10 [98174.672517] usb 5-2: ZC0301[P] Image Processor and Control Chip detected (vid/pid 0x046D:0x08AE) [98174.713717] usb 5-2: PAS202BCB image sensor detected The cam works as expected. 
Hmm, bummer, I don't have any zc3xx test cams with a pas202b sensor; guess I need to find one :) I probably won't go to LPC this year, since I'm planning to be at the Japan Linux Symposium in October, and it seems too much jet lag for me to be in Portland in September and in Japan in October ;) Ah, too bad, but understandable. Regards, Hans
Re: Convert cpia driver to v4l2, drop parallel port version support?
On 06/17/2009 08:11 PM, Brian Johnson wrote: Hans de Goede wrote: sn9c102 supports a large number of cams also supported by gspca's sonixb / sonixj drivers; we're using #ifdef macros to detect if both are being built at the same time, so that the usb-ids are included in only one of the 2. Btw, it would be interesting to work with the out-of-tree microdia driver, since there are some models that are supported only by that alternative driver. Ack, only one small problem, which is another reason why Luca's drivers should slowly be phased out: Luca has gone closed source with his sn9cxxx driver. There is an out of tree driver for the new sn9c2xx models you talk about though, with active developers; I've been pushing them to get it into the mainline, and I'll give it another try soonish. Hello, I'm one of the developers of the current out of tree sn9c20x driver. What needs to be done in order to get the sn9c20x code into the mainline? Am I right in assuming it would be preferred to move the code into a sn9c20x gspca subdriver rather than include the complete out of tree driver? Yes, that would be very much preferred. Not that I believe that gspca is the best thing ever invented or anything like that, but usb webcam drivers all have a lot in common and gspca handles that well enough, and if we ever want to make improvements, like moving usb webcams to use videobuf, having them all as gspca subdrivers means we only need to do it once, as for example all buffer management is done by gspca. Also, after looking at the pwc driver oops at unplug, and being reminded of the ref counting with hotplug devices going away and the locking nightmare stuff we discussed some time ago, I'm really glad to have all that tricky code in only one place. This will also make reviewing a lot easier, as there will be no tricky buffer management and locking, etc. to review. If this is the case, I can work on a set of patches to implement our code as a gspca subdriver.
As explained above: very much yes, please! Also, I have a few questions regarding submitting the patches. 1) In addition to sending them to linux-media, should I CC them to anyone in particular? I have such a cam and I'm one of the people actively working on gspca, so if you could CC me then I'm sure to notice it and review it, and it can get merged through my mercurial (git-like vcs) tree. 2) The entire patch would likely be about 70k. Should I just send one patch or split it up into several? I would hope gspca would shrink the size somewhat :) As for one patch versus incremental patches: as this is a whole new driver, one patch will do I guess; I see little use in having non-working increments in between. Thanks Regards, Hans
PULL request - http://linuxtv.org/hg/~hgoede/gspca
Hi Mauro, Can you please pull from: http://linuxtv.org/hg/~hgoede/gspca I know you haven't even had the chance to do my previous pull request :) New this time: * mark the ov511 driver as deprecated; note: we should really also keep track of this in Documentation/feature-removal-schedule.txt, but that is not part of the v4l-dvb tree. * Support for the st6422 bridge + sensor ! Give it a try, I know now you have a cam which uses this bridge :) When you try it, be sure to use the latest libv4l (I just updated my libv4l tree); this enables (software) automatic control of the gain and exposure, for a decent image in most lighting conditions. BTW, the latest libv4l also does this (auto expo / gain) for the spca561 based Logitech cam I lent you at Plumbers last year; works really nicely :) Regards, Hans
Re: GPL code for Omnivision USB video camera available.
On 06/15/2009 03:01 AM, Erik de Castro Lopo wrote: On Sat, 13 Jun 2009 18:12:10 +1000 Hans de Goede hdego...@redhat.com wrote: Getting ovfx2 support into the mainline kernel sounds like a good idea! I'm not such a big fan of merging the driver as is though, as it does its own buffer management (and ioctl handling, usb interrupt handling, locking, etc). I understand completely. Good! For adding the ovfx2 driver, you could start by copying ov519.c, which already has setup and control code for most ov sensors, and then rewrite the bridge part to be ovfx2 code; later we can try to move the sensor code to a shared c file for the ov519 and ovfx2 drivers, depending on how much you needed to change the sensor code. Or you could add support for the ovfx2 to the ov519 driver. Note I've recently been doing quite a bit of work on the ov519 driver, adding support for the ov511 and ov518 and adding more controls. I'll make a mercurial tree available with my latest code in it asap. Ok, there's the rub. I am simply way too busy at the moment to push this through myself. I was hoping I could contract someone to take the existing code and massage it into shape ready for merging. I would prefer it if that someone was already a V4L hacker, but if I can't find anyone with pre-existing V4L experience I'll find someone local with general Linux kernel/driver experience. Well, I can't offer you contracting, as I simply do not have the spare time to make such promises, but as any good hacker: I will work for hardware, on an I'll-do-my-best-but-no-promises-made basis. I'm actually spending quite a bit of time lately on v4l stuff again, and I'm certainly willing to spend some time on this. I can even promise you I'll bump it to the top of the list of my v4l projects.
For a general idea of how deep I'm involved in v4l webcam support see: https://fedoraproject.org/wiki/Features/BetterWebcamSupport

Regards,

Hans
Re: GPL code for Omnivision USB video camera available.
Hi Erik, For the latest version of the gspca ov519 driver, with all my recent work adding ov511 and ov518 support, see: http://linuxtv.org/hg/~hgoede/gspca Regards, Hans On 06/13/2009 02:45 AM, Erik de Castro Lopo wrote: Hans de Goede wrote: This looks to me like it's just ov51x-jpeg made to compile with the latest kernel. It's more than that. This driver supports a number of cameras and the only one we (bCODE) are really interested in is the ovfx2 driver. Did you make any functional changes? I believe the ovfx2 driver is completely new. Also I wonder if you're subscribed to the (low traffic) ov51x-jpeg mailinglist; that seems to be the right thing to do for someone who tries to get that driver into the mainline. Sorry, it's the ovfx2 that I'm interested in pushing into the kernel. May I ask what cam you have? I could certainly use more people testing this. It looks like this on the USB bus: Bus 007 Device 002: ID 05a9:2800 OmniVision Technologies, Inc. Cheers, Erik
Re: GPL code for Omnivision USB video camera available.
Hi, On 06/13/2009 02:45 AM, Erik de Castro Lopo wrote: Hans de Goede wrote: This looks to me like it's just ov51x-jpeg made to compile with the latest kernel. It's more than that. This driver supports a number of cameras and the only one we (bCODE) are really interested in is the ovfx2 driver. Ah, I hadn't noticed the ovfx2 driver, actually I've never heard of it before. Did you make any functional changes? I believe the ovfx2 driver is completely new. It is at least to me, and I know lots of webcam drivers. Also I wonder if you're subscribed to the (low traffic) ov51x-jpeg mailinglist; that seems to be the right thing to do for someone who tries to get that driver into the mainline. Sorry, it's the ovfx2 that I'm interested in pushing into the kernel. Ok, getting ovfx2 support into the mainline kernel sounds like a good idea! I'm not such a big fan of merging the driver as is though, as it does its own buffer management (and ioctl handling, usb interrupt handling, locking, etc). Nowadays we prefer new drivers to use existing infrastructure. Preferably the ovfx2 driver would be re-written to use the gspca usb webcam driver framework. See for example the ov519 driver: drivers/media/video/gspca/ov519.c which is also based on ov51x-jpeg. There are also several drivers using bulk mode under drivers/media/video/gspca. As you will see when you look there, gspca sub drivers currently do not have the sensor code separated from the bridge :( There are several reasons for this, the biggest one being that most drivers are reverse engineered and we simply do not know enough about what all the sensor registers do to cleanly separate bridge and sensor code.
Another reason is that quite a few sensor settings can be quite bridge specific; for example, vsync / hsync timings which seem quite sane may not work with some bridges, because they require some strange timings. Another example is that registers which are meant to adjust the framerate to match the power line frequency are sometimes abused to correct for a somewhat strange clock being offered to the sensor by the bridge, etc. So doing the bridge / sensor separation cleanly is far from easy, and for now we've given up doing this for webcams, esp. as just getting webcams to work, and working properly, has a higher priority atm. Once we have most cams working we can better analyze which sensor settings are the same for all bridges and which are bridge specific, and find a way to separate the sensor and bridge code. For adding the ovfx2 driver, you could start by copying ov519.c, which already has setup and control code for most ov sensors, and then rewrite the bridge part to be ovfx2 code; later we can try to move the sensor code to a shared c file for the ov519 and ovfx2 drivers, depending on how much you needed to change the sensor code. Or you could add support for the ovfx2 to the ov519 driver. Note I've recently been doing quite a bit of work on the ov519 driver, adding support for the ov511 and ov518 and adding more controls. I'll make a mercurial tree available with my latest code in it asap. Regards, Hans
Re: GPL code for Omnivision USB video camera available.
On 06/12/2009 03:02 AM, Erik de Castro Lopo wrote: Hi all, I have a driver for a USB video camera that I'd like to see added to the mainline kernel, mainly so I don't have to fix breakage due to constant changes in the kernel :-). The code is GPL and is available here: http://stage.bcode.com/erikd/ovcamchip and the history of this code is here: http://stage.bcode.com/erikd/ovcamchip/README My problem is that I am way too busy to shepherd this into the kernel myself. If someone is willing to work on getting this in, I can send them a camera to keep. If getting paid is more likely to help someone focus on the task then that is also a possibility. Any takers? Please email me privately. This looks to me like it's just ov51x-jpeg made to compile with the latest kernel. Did you make any functional changes? Note that ov51x-jpeg is not acceptable as is, as it does in-kernel decompression of the ov51x jpeg-like compression format. I'm currently finishing up adding ov511(+) and ov518(+) support to the gspca ov519 subdriver. And I've already added support for decompressing their format in userspace to libv4l. Also I wonder if you're subscribed to the (low traffic) ov51x-jpeg mailinglist; that seems to be the right thing to do for someone who tries to get that driver into the mainline. I've already announced my work on getting the ov511(+) and ov518(+) supported properly in the mainline kernel there. May I ask what cam you have? I could certainly use more people testing this. Regards, Hans
Re: mt9t031 (was RE: [PATCH] adding support for setting bus parameters in sub device)
On 06/11/2009 11:33 AM, Hans Verkuil wrote: On 06/11/2009 10:35 AM, Hans Verkuil wrote: snip (a lot) Hmm, why would we want the *application* to set things like this *at all*? With sensors, hsync and vsync and other timings are something between the bridge and the sensor; actually, in my experience the correct hsync / vsync timings to program the sensor with are very much bridge specific. So IMHO this should not be exposed to userspace at all. All userspace should be able to control is the resolution and the framerate. Although controlling the framerate in many cases also means controlling the maximum exposure time. So in many cases one cannot even control the framerate. (Asking for 30 fps in an artificially illuminated room will get you a very dark, useless picture, with most sensors.) Yes, this means that with cams which use autoexposure (which is something we really want wherever possible), the framerate can and will change while streaming. I think we have three possible use cases here: - old-style standard definition video: use S_STD Ack - webcam-like devices: a combination of S_FMT and S_PARM I think? Correct me if I'm wrong. S_STD is useless for this, right? Ack - video streaming devices like the davinci videoports where you can hook up HDTV receivers or FPGAs: here you definitely need a new API to set up the streaming parameters, and you want to be able to do that from the application as well. Actually, sensors are also hooked up to these devices in practice. And there you also want to be able to set up these parameters. You will see this mostly (only?) on embedded platforms. I agree we need an in-kernel API for this, but why expose it to userspace? As you say, this will only happen on embedded systems; shouldn't the info then go in a board_info file / struct?
Regards, Hans
Re: webcam drivers and V4L2_MEMORY_USERPTR support
Hi all, On 06/08/2009 10:58 AM, Stefan Kost wrote: Hans de Goede schrieb: On 06/05/2009 09:43 AM, Stefan Kost wrote: Hans de Goede schrieb: On 06/01/2009 09:58 AM, Trent Piepho wrote: On Mon, 1 Jun 2009, Stefan Kost wrote: I have implemented support for V4L2_MEMORY_USERPTR buffers in gstreamer's v4l2src [1]. This allows requesting shared memory buffers from xvideo, capturing into those, and therefore saving a memcpy. This works great with the v4l2 driver on our embedded device. When I was testing this on my desktop, I noticed that almost no driver seems to support it. I tested zc0301 and uvcvideo, but also grepped the kernel driver sources. It seems that gspca might support it, but I have not confirmed it. Is there a technical reason for it, or is it simply not implemented? userptr support is relatively new and so it has less support, especially with drivers that pre-date it. Maybe USB cams use a compressed format and so userptr with xvideo would not work anyway, since xv won't support the camera's native format. It certainly could be done for bt8xx, cx88, saa7134, etc. Even in the webcam with custom compressed format case, userptr support could be useful to save a memcpy, as libv4l currently fakes mmap buffers, so what happens is: cam -> (direct transfer) -> mmap buffer -> (libv4l format conversion) -> fake mmap buffer -> (application memcpy) -> dest buffer. So if libv4l would support userptrs (which it currently does not), we could still save a memcpy here. Do you mean that if a driver supports userptr and one uses libv4l instead of the direct ioctl, there is a regression and the app ends up getting told only mmap works? Yes, this was done this way for simplicity's sake (libv4l2 is complex enough as it is). At the time this decision was made it was an easy one to make, as userptr support mostly was (and I believe still is) a paper exercise. IOW, no applications and almost no drivers support it. If more applications start supporting it, support can and should be added to libv4l2.
But this will be tricky. E.g. omap2 v4l2 drivers (e.g. used in Nokia N800/N810) support it and the new drivers for omap3 will do the same. I probably need to revert the libv4l usage in gstreamer then, as we can have regressions in applications ... Erm, the current (0.10.15) gstreamer libv4l2 plugin does not even use USERPTR support (which confirms my "I didn't implement it because nothing uses it" reasoning), so there can be no regression. Now, not using libv4l will make gstreamer applications not work with *hundreds* of different webcam models (and that is not an exaggeration), see: http://moinejf.free.fr/webcam.html for an incomplete list. Some of those may work without libv4l, but most don't. So given the choice between: * not having a performance enhancement, which was never present before, so no regression * not working with *hundreds* of different webcam models Which one are you going to choose? I will most certainly know where to redirect the bug reports. Regards, Hans
Re: [PATCH] V4L/pwc - use usb_interface as parent, not usb_device
Looks good, we recently fixed the same issue in the gspca driver too. Acked-by: Hans de Goede hdego...@redhat.com On 06/04/2009 09:18 PM, Lennart Poettering wrote: The current code creates a sysfs device path where the video4linux device is a child of the usb device itself instead of the interface it belongs to. That is evil and confuses udev. This patch does basically the same thing as Kay's similar patch for the ov511 driver: http://git.kernel.org/?p=linux/kernel/git/torvalds/linux-2.6.git;a=commitdiff;h=ce96d0a44a4f8d1bb3dc12b5e98cb688c1bc730d (Resent 2nd time, due to missing Signed-off-by) Lennart Signed-off-by: Lennart Poettering mzxre...@0pointer.de
---
 drivers/media/video/pwc/pwc-if.c | 2 +-
 1 files changed, 1 insertions(+), 1 deletions(-)

diff --git a/drivers/media/video/pwc/pwc-if.c b/drivers/media/video/pwc/pwc-if.c
index 7c542ca..92d4177 100644
--- a/drivers/media/video/pwc/pwc-if.c
+++ b/drivers/media/video/pwc/pwc-if.c
@@ -1783,7 +1783,7 @@ static int usb_pwc_probe(struct usb_interface *intf, const struct usb_device_id
 		return -ENOMEM;
 	}
 	memcpy(pdev->vdev, &pwc_template, sizeof(pwc_template));
-	pdev->vdev->parent = &(udev->dev);
+	pdev->vdev->parent = &intf->dev;
 	strcpy(pdev->vdev->name, name);
 	video_set_drvdata(pdev->vdev, pdev);
Re: webcam drivers and V4L2_MEMORY_USERPTR support
On 06/05/2009 09:43 AM, Stefan Kost wrote: Hans de Goede schrieb: On 06/01/2009 09:58 AM, Trent Piepho wrote: On Mon, 1 Jun 2009, Stefan Kost wrote: I have implemented support for V4L2_MEMORY_USERPTR buffers in gstreamer's v4l2src [1]. This allows requesting shared memory buffers from xvideo, capturing into those, and therefore saving a memcpy. This works great with the v4l2 driver on our embedded device. When I was testing this on my desktop, I noticed that almost no driver seems to support it. I tested zc0301 and uvcvideo, but also grepped the kernel driver sources. It seems that gspca might support it, but I have not confirmed it. Is there a technical reason for it, or is it simply not implemented? userptr support is relatively new and so it has less support, especially with drivers that pre-date it. Maybe USB cams use a compressed format and so userptr with xvideo would not work anyway, since xv won't support the camera's native format. It certainly could be done for bt8xx, cx88, saa7134, etc. Even in the webcam with custom compressed format case, userptr support could be useful to save a memcpy, as libv4l currently fakes mmap buffers, so what happens is: cam -> (direct transfer) -> mmap buffer -> (libv4l format conversion) -> fake mmap buffer -> (application memcpy) -> dest buffer. So if libv4l would support userptrs (which it currently does not), we could still save a memcpy here. Do you mean that if a driver supports userptr and one uses libv4l instead of the direct ioctl, there is a regression and the app ends up getting told only mmap works? Yes, this was done this way for simplicity's sake (libv4l2 is complex enough as it is). At the time this decision was made it was an easy one to make, as userptr support mostly was (and I believe still is) a paper exercise. IOW, no applications and almost no drivers support it. If more applications start supporting it, support can and should be added to libv4l2. But this will be tricky.
For higher pixel counts extra memcpys are scary, especially if they are not visible. Sorry for the naive question, but what is libv4l's role regarding buffer allocations? In our case we don't need any extra format conversion from libv4l. I am fine if it works without an extra memcpy in that case, and I understand that it would be tricky to support in-place format conversions for some formats and an extra memcpy for the rest. I would be willing to take *clean, non invasive* patches to libv4l to add userptr support, but I'm not sure if this can be done in a clean way (haven't tried). Where are the libv4l sources hosted? I found your blog and the freshmeat page only so far. The sources are part of the v4l-dvb mercurial tree. But the latest version is in my personal tree, please use that to base patches on: http://linuxtv.org/hg/~hgoede/libv4l Regards, Hans
Re: [PATCH] Add missing __devexit_p()
Hi all, On 06/04/2009 04:07 PM, Jean Delvare wrote: Add missing __devexit_p() to several drivers. Also add a few missing __init, __devinit and __exit markers. These errors could result in build failures depending on the kernel configuration. Signed-off-by: Jean Delvare kh...@linux-fr.org Looks good to me. Regards, Hans
---
 linux/drivers/media/dvb/bt8xx/bt878.c                 | 8 +---
 linux/drivers/media/video/cx88/cx88-alsa.c            | 7 +++
 linux/drivers/media/video/mx3_camera.c                | 6 +++---
 linux/drivers/media/video/pxa_camera.c                | 6 +++---
 linux/drivers/media/video/soc_camera.c                | 2 +-
 linux/drivers/media/video/usbvision/usbvision-video.c | 2 +-
 linux/drivers/media/video/zoran/zoran_card.c          | 2 +-
 7 files changed, 13 insertions(+), 20 deletions(-)

--- v4l-dvb.orig/linux/drivers/media/dvb/bt8xx/bt878.c	2009-03-01 16:09:08.0 +0100
+++ v4l-dvb/linux/drivers/media/dvb/bt8xx/bt878.c	2009-06-04 14:00:41.0 +0200
@@ -512,12 +512,6 @@ static int __devinit bt878_probe(struct
 	pci_set_master(dev);
 	pci_set_drvdata(dev, bt);

-/*	if (init_bt878(btv) < 0) {
-		bt878_remove(dev);
-		return -EIO;
-	}
-*/
-
 	if ((result = bt878_mem_alloc(bt))) {
 		printk(KERN_ERR "bt878: failed to allocate memory!\n");
 		goto fail2;
@@ -583,7 +577,7 @@ static struct pci_driver bt878_pci_drive
 	.name     = "bt878",
 	.id_table = bt878_pci_tbl,
 	.probe    = bt878_probe,
-	.remove   = bt878_remove,
+	.remove   = __devexit_p(bt878_remove),
 };

 static int bt878_pci_driver_registered;
--- v4l-dvb.orig/linux/drivers/media/video/cx88/cx88-alsa.c	2009-04-17 11:22:56.0 +0200
+++ v4l-dvb/linux/drivers/media/video/cx88/cx88-alsa.c	2009-06-04 14:04:37.0 +0200
@@ -939,7 +939,7 @@ static struct pci_driver cx88_audio_pci_
 	.name     = "cx88_audio",
 	.id_table = cx88_audio_pci_tbl,
 	.probe    = cx88_audio_initdev,
-	.remove   = cx88_audio_finidev,
+	.remove   = __devexit_p(cx88_audio_finidev),
 };
@@ -949,7 +949,7 @@
 /*
  * module init
  */
-static int cx88_audio_init(void)
+static int __init cx88_audio_init(void)
 {
 	printk(KERN_INFO "cx2388x alsa driver version %d.%d.%d loaded\n",
 	       (CX88_VERSION_CODE >> 16) & 0xff,
@@ -965,9 +965,8 @@ static int cx88_audio_init(void)
 /*
  * module remove
  */
-static void cx88_audio_fini(void)
+static void __exit cx88_audio_fini(void)
 {
-
 	pci_unregister_driver(&cx88_audio_pci_driver);
 }
--- v4l-dvb.orig/linux/drivers/media/video/mx3_camera.c	2009-04-29 14:30:29.0 +0200
+++ v4l-dvb/linux/drivers/media/video/mx3_camera.c	2009-06-04 14:05:25.0 +0200
@@ -1074,7 +1074,7 @@ static struct soc_camera_host_ops mx3_so
 	.set_bus_param = mx3_camera_set_bus_param,
 };

-static int mx3_camera_probe(struct platform_device *pdev)
+static int __devinit mx3_camera_probe(struct platform_device *pdev)
 {
 	struct mx3_camera_dev *mx3_cam;
 	struct resource *res;
@@ -1194,11 +1194,11 @@ static struct platform_driver mx3_camera
 		.name = MX3_CAM_DRV_NAME,
 	},
 	.probe  = mx3_camera_probe,
-	.remove = __exit_p(mx3_camera_remove),
+	.remove = __devexit_p(mx3_camera_remove),
 };

-static int __devinit mx3_camera_init(void)
+static int __init mx3_camera_init(void)
 {
 	return platform_driver_register(&mx3_camera_driver);
 }
--- v4l-dvb.orig/linux/drivers/media/video/pxa_camera.c	2009-06-04 13:45:28.0 +0200
+++ v4l-dvb/linux/drivers/media/video/pxa_camera.c	2009-06-04 14:03:05.0 +0200
@@ -1541,7 +1541,7 @@ static struct soc_camera_host_ops pxa_so
 	.set_bus_param = pxa_camera_set_bus_param,
 };

-static int pxa_camera_probe(struct platform_device *pdev)
+static int __devinit pxa_camera_probe(struct platform_device *pdev)
 {
 	struct pxa_camera_dev *pcdev;
 	struct resource *res;
@@ -1716,11 +1716,11 @@ static struct platform_driver pxa_camera
 		.name = PXA_CAM_DRV_NAME,
 	},
 	.probe  = pxa_camera_probe,
-	.remove = __exit_p(pxa_camera_remove),
+	.remove = __devexit_p(pxa_camera_remove),
 };

-static int __devinit pxa_camera_init(void)
+static int __init pxa_camera_init(void)
 {
 	return platform_driver_register(&pxa_camera_driver);
 }
--- v4l-dvb.orig/linux/drivers/media/video/soc_camera.c	2009-05-11 11:12:03.0 +0200
+++ v4l-dvb/linux/drivers/media/video/soc_camera.c	2009-06-04 14:04:58.0 +0200
@@ -1206,7 +1206,7 @@ static int __devexit soc_camera_pdrv_rem
 static struct platform_driver __refdata soc_camera_pdrv = {
 	.probe =
libv4l release: 0.5.99 (The don't crash release)
Hi All, So 0.5.98 had a few nasty bugs, causing black screens and crashes in certain cases. This release should fix all those.

libv4l-0.5.99
-------------
* Link libv4lconvert with -lm for powf, by Gregor Jasny
* Fix black screen on devices with hardware gamma control
* Fix crash with devices on which we do not emulate fake controls
* Add a patch by Hans Petter Selasky hsela...@freebsd.org, which should lead to allowing use of libv4l (and the Linux webcam drivers ported to userspace usb drivers) on FreeBSD; this is a work in progress

Get it here: http://people.atrpms.net/~hdegoede/libv4l-0.5.99.tar.gz

Regards,

Hans
Re: webcam drivers and V4L2_MEMORY_USERPTR support
On 06/01/2009 09:58 AM, Trent Piepho wrote: On Mon, 1 Jun 2009, Stefan Kost wrote: I have implemented support for V4L2_MEMORY_USERPTR buffers in gstreamer's v4l2src [1]. This allows requesting shared memory buffers from xvideo, capturing into those, and therefore saving a memcpy. This works great with the v4l2 driver on our embedded device. When I was testing this on my desktop, I noticed that almost no driver seems to support it. I tested zc0301 and uvcvideo, but also grepped the kernel driver sources. It seems that gspca might support it, but I have not confirmed it. Is there a technical reason for it, or is it simply not implemented? userptr support is relatively new and so it has less support, especially with drivers that pre-date it. Maybe USB cams use a compressed format and so userptr with xvideo would not work anyway, since xv won't support the camera's native format. It certainly could be done for bt8xx, cx88, saa7134, etc. Even in the webcam with custom compressed format case, userptr support could be useful to save a memcpy, as libv4l currently fakes mmap buffers, so what happens is: cam -> (direct transfer) -> mmap buffer -> (libv4l format conversion) -> fake mmap buffer -> (application memcpy) -> dest buffer. So if libv4l would support userptrs (which it currently does not), we could still save a memcpy here. I would be willing to take *clean, non invasive* patches to libv4l to add userptr support, but I'm not sure if this can be done in a clean way (haven't tried). An alternative could be for the app to just use read() in the above case, as then the app already provides the dest buffer, and the conversion will write directly to the application provided buffer. Regards, Hans
Re: Licensing question regarding SN9C2028 decompression (fwd)
On 05/27/2009 11:43 PM, Theodore Kilgore wrote: Hans, Here is the answer which I got about the question of GPL to LGPL licensing in regard to the sn9c2028 decompression code. Hmm, given that you did have contact with the original author years ago and he also OK'd it back then, and that large parts of the code are written by you, I'm ok with moving forward, changing the license to LGPL and then committing the patch. Regards, Hans Theodore Kilgore -- Forwarded message -- Date: Wed, 27 May 2009 13:19:46 -0400 From: Harald h...@users.sourceforge.net To: Theodore Kilgore kilg...@banach.math.auburn.edu Subject: Re: Licensing question regarding SN9C2028 decompression Hi Theodore, I give you permission to use the SN9C2028 code with a LGPL license. I am the current maintainer of the macam project. Most of the code that has been added in the last few years is mine. However, I did not originate the SN9C2028 code. I have messed with it a lot, so it may not bear much resemblance to the original code. I am sure that whatever code you based your version on has been modified as well. I doubt that you use Objective-C, for example... It is likely that technically all of macam should be under LGPL anyway, as it works as a plug-in component to QuickTime. So from an intent perspective, that is how the macam code is used anyway. You should be able to use it the same way. I have never been able to contact the originator (mattik) of the project! I became admin through an intermediate admin (dirkx). We're all three admins, but neither of the others have contributed anything in the last 5 years. I hope this helps, Harald On May 24, 2009, at 13:40, Theodore Kilgore wrote: Harald, Right now I am working on putting streaming support for the SN9C2028 cameras (supported by libgphoto2/camlibs/sonix as still cameras) into the Linux kernel, as part of linux/drivers/media/video/gspca. In doing so, there is a licensing conflict, as follows: The Linux kernel is of course GPL licensed, as we are aware.
However, the philosophy of what the kernel is supposed to do with things like video devices is: it takes care of creating a device /dev/video and it takes care of basic infrastructural things such as how to talk to the camera, to initialize it, to turn it off, to tell it to stream, and to detect and save packets and to construct frames. The code for things like decompression has been deliberately moved away from the kernel code, and the idea is to put all that stuff into a library called libv4l, which then provides a unified interface for userspace streaming apps. The problem is, the decompression code would need to go into part of libv4l, namely libv4lconvert. And the license for libv4l and everything in it is LGPL, not GPL. As the originator of the decompression function for the Sonix cameras, are you willing to give permission for taking my version of the code from GPL to LGPL? Or can you suggest some other appropriate course of action? Theodore Kilgore
Re: Licensing question regarding SN9C2028 decompression (fwd)
On 05/28/2009 04:34 PM, Theodore Kilgore wrote: On Thu, 28 May 2009, Hans de Goede wrote: On 05/27/2009 11:43 PM, Theodore Kilgore wrote: Hans, Here is the answer which I got about the question of GPL to LGPL licensing in regard to the sn9c2028 decompression code. Hmm, given that you did have contact with the original author years ago and he also OK'd it back then, and that large parts of the code are written by you, I'm ok with moving forward, changing the license to LGPL and then committing the patch. Regards, Hans If you think it is appropriate, I can include the mail as part of the file. I notice that this is done in some other files, for example in pac207.c. Ack, but given that you are one of the authors of the original code in this case, and the mail isn't a 100% clear re-license permission, I think it's best to just change the license and be done with it, without including the mail; I think, if anything, the mail would only lead to confusion. But as far as contact with the author is concerned, it is even more accurate to say that the cooperation was a two-way street. I understand that some of my LGPL code for other camera drivers has been put to use, too, in the macam project. For example, they also have drivers for the SQ cameras and the mr97310a cameras. Clearly, I do not have a problem with that, any more than Harald has with my using the sn9c2028 decompression algorithm. In fact, as he called the decompression algorithm to my attention, I brought to his attention the work which I had done on those other cameras. Ack, as said I think relicensing is fine. Sorry I did not keep all the e-mails, though. No problem, Regards, Hans
libv4l release: 0.5.98: the gamma correction release!
Hi All, This is probably the last test release for the 0.6.x series; the video processing code has been rewritten and works very nicely now. Please give this release a thorough testing! The software whitebalancing and gamma correction can make a very positive difference in the image quality given off by cheaper cams. libv4l now automatically enables fake controls for software white-balancing and gamma for most webcams (all those which will need conversion anyway). So when you start up v4l2ucp you should see a checkbox for whitebalance and a slider for gamma (the default setting of 1000 == 1.0 is no gamma correction). Now start your favorite webcam viewing app and play around with the 2 controls. If whitebalancing makes a *strongly noticeable* positive difference for your webcam, please mail me info about your cam (the usb id); then I can add it to the list of cams which will have the whitebalancing algorithm enabled by default. The same goes for cams which benefit from a significant gamma correction. For example, this release sets the gamma to 1500 (1.5) for pac207 cams by default, resulting in a much improved image.

libv4l-0.5.98
-------------
* Add software gamma correction
* Add software auto gain / exposure
* Add support for separate vflipping and hflipping
* Add fake controls controlling the software h- and v-flipping
* Add ability to determine upside down cams based on DMI info
* Add the capability to provide 320x240 to apps if the cam can only do 320x232 (some zc3xx cams) by adding black borders
* Rewrite video processing code to make it easier to add more video filters (and with little extra processing cost). As part of this the normalize filter has been removed, as it wasn't functioning satisfactorily anyway
* Support V4L2_CTRL_FLAG_NEXT_CTRL for fake controls, by Adam Baker
* Some makefile improvements, by Gregor Jasny
* Various small bugfixes and tweaks
* The V4L2_ENABLE_ENUM_FMT_EMULATION v4l2_fd_open flag is obsolete; libv4l2 now *always* reports emulated formats through the ENUM_FMT ioctl

Get it here: http://people.atrpms.net/~hdegoede/libv4l-0.5.98.tar.gz

Regards,

Hans
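On the gamma convention used above (a value in thousandths, so 1000 == 1.0 == no correction, and 1500 == 1.5 as set for the pac207): a lookup-table implementation along these lines costs one table lookup per pixel component. This is a hypothetical sketch written for illustration, not the actual libv4lconvert code; it hand-rolls pow() via repeated square roots so the snippet needs no libm.

```c
#include <stdint.h>

/* Newton-Raphson square root, to keep the sketch libm-free. */
double approx_sqrt(double x)
{
    double r = x < 1.0 ? 1.0 : x;
    int i;

    for (i = 0; i < 40; i++)
        r = 0.5 * (r + x / r);
    return r;
}

/* x^e for x in [0,1] and e in (0,2]: consume the binary digits of e,
 * using repeated square roots of x for the fractional bits. */
double approx_pow(double x, double e)
{
    double result = 1.0, p = x;
    int i;

    if (x <= 0.0)
        return 0.0;
    for (i = 0; i < 30; i++) {
        if (e >= 1.0) {
            result *= p;
            e -= 1.0;
        }
        p = approx_sqrt(p);
        e *= 2.0;
    }
    return result;
}

/* Build an 8-bit gamma lookup table; gamma is given in thousandths,
 * so 1000 means 1.0 (identity) and 1500 means 1.5 (brightens the
 * image, like the pac207 default mentioned above). */
void build_gamma_table(uint8_t table[256], int gamma1000)
{
    double exponent = 1000.0 / gamma1000; /* i.e. pow(x, 1/gamma) */
    int i;

    for (i = 0; i < 256; i++)
        table[i] = (uint8_t)(approx_pow(i / 255.0, exponent) * 255.0 + 0.5);
}

/* Apply the table in place to a buffer of 8-bit components. */
void apply_gamma(uint8_t *buf, int len, const uint8_t table[256])
{
    int i;

    for (i = 0; i < len; i++)
        buf[i] = table[buf[i]];
}
```

The table is rebuilt only when the control value changes, so the per-frame cost stays at one lookup per component, which is why this kind of correction is cheap enough to do in software for every frame.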
Re: gspca: Logitech QuickCam Messenger worked last with external gspcav1-20071224
On 05/26/2009 01:44 PM, Guennadi Liakhovetski wrote: Hi all, I think it would be a good time now to get my Logitech QuickCam Messenger camera working with the current gspca driver. It used to work with gspcav1-20071224, here's the dmesg output:

/tmp/gspcav1-20071224/gspca_core.c: USB GSPCA camera found.(ZC3XX)
/tmp/gspcav1-20071224/gspca_core.c: [spca5xx_probe:4275] Camera type JPEG
/tmp/gspcav1-20071224/Vimicro/zc3xx.h: [zc3xx_config:669] Find Sensor HV7131R(c)

with more USB related messages following. Now dmesg with some debug turned on looks like:

gspca: probing 046d:08da
zc3xx: probe 2wr ov vga 0x
zc3xx: probe sensor - 11
zc3xx: Find Sensor HV7131R(c)
gspca: probe ok
usbcore: registered new interface driver zc3xx
zc3xx: registered

and the camera is not working, the light on its case doesn't go on. If I try force_sensor=15 to match sensor tas5130cxx, as was detected by the old driver, dmesg reports:

gspca: probing 046d:08da
zc3xx: probe 2wr ov vga 0x
zc3xx: probe sensor - 11
zc3xx: sensor forced to 15
gspca: probe ok
usbcore: registered new interface driver zc3xx
zc3xx: registered

and otherwise nothing changes. I could spend some time trying to find the problem, but I would prefer if someone could suggest some debugging; I am not familiar with gspca internals. Ideas anyone? First of all, which app are you using to test the cam? And are you using that app in combination with libv4l? Also, why do you say the original driver used to identify it as a tas5130cxx? The dmesg lines you pasted from gspcav1 also say it is a HV7131R. Regards, Hans
Re: gspca: Logitech QuickCam Messenger worked last with external gspcav1-20071224
On 05/26/2009 02:08 PM, Guennadi Liakhovetski wrote: On Tue, 26 May 2009, Hans de Goede wrote: First of all, which app are you using to test the cam ? And are you using that app in combination with libv4l ? xawtv, no, it doesn't use libv4l, but it works with the old gspcav1-20071224. Ok, maybe it used a different v4l version, but I have v4l1_compat loaded. xawtv has known bugs making it not work with gspca (or many other properly implemented v4l drivers, that is). Now those bugs are fixed in some distros, but this might very well be the cause. Try using ekiga (together with LD_PRELOAD=/v4l1compat.so) Also why do you say the original driver used to identify it as a tas5130cxx, the dmesg lines you pasted from gspcav1 also say it is a HV7131R In the old sources you see
switch (vendor) {
...
case 0x046d:	/* Logitech Labtec */
...
	switch (product) {
	...
	case 0x08da:
		spca50x->desc = QCmessenger;
		spca50x->bridge = BRIDGE_ZC3XX;
		spca50x->sensor = SENSOR_TAS5130CXX;
		break;
Hmm, weird it still prints that other message then. Anyways please try with another application, both with and without the force_sensor parameter. Regards, Hans p.s. I've attached a patch to xawtv which I use in Fedora's packages.
diff -Nrbu xawtv-3.95/libng/plugins/drv0-v4l2.c xawtv-3.95-OK/libng/plugins/drv0-v4l2.c
--- xawtv-3.95/libng/plugins/drv0-v4l2.c	2005-02-11 20:56:24.0 +0300
+++ xawtv-3.95-OK/libng/plugins/drv0-v4l2.c	2008-08-26 19:27:18.0 +0400
@@ -91,6 +91,7 @@
     struct ng_video_fmt        fmt_me;
     struct v4l2_requestbuffers reqbufs;
     struct v4l2_buffer         buf_v4l2[WANTED_BUFFERS];
+    int                        buf_v4l2_size[WANTED_BUFFERS];
     struct ng_video_buf        buf_me[WANTED_BUFFERS];
     unsigned int               queue,waiton;
@@ -166,7 +167,7 @@
     int rc;
     rc = ioctl(fd,cmd,arg);
-    if (0 == rc && ng_debug < 2)
+    if (rc >= 0 && ng_debug < 2)
         return rc;
     if (mayfail && errno == mayfail && ng_debug < 2)
         return rc;
@@ -768,6 +769,7 @@
     /* get it */
     memset(&buf,0,sizeof(buf));
     buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
+    buf.memory = V4L2_MEMORY_MMAP;
     if (-1 == xioctl(h->fd,VIDIOC_DQBUF,&buf, 0))
         return -1;
     h->waiton++;
@@ -812,6 +814,7 @@
     h->buf_v4l2[i].memory = V4L2_MEMORY_MMAP;
     if (-1 == xioctl(h->fd, VIDIOC_QUERYBUF, &h->buf_v4l2[i], 0))
         return -1;
+    h->buf_v4l2_size[i] = h->buf_v4l2[i].length;
     h->buf_me[i].fmt  = h->fmt_me;
     h->buf_me[i].size = h->buf_me[i].fmt.bytesperline * h->buf_me[i].fmt.height;
@@ -865,12 +868,16 @@
         ng_waiton_video_buf(&h->buf_me[i]);
         if (ng_debug)
             print_bufinfo(&h->buf_v4l2[i]);
-        if (-1 == munmap(h->buf_me[i].data,h->buf_me[i].size))
+        if (-1 == munmap(h->buf_me[i].data, h->buf_v4l2_size[i]))
             perror("munmap");
     }
     h->queue  = 0;
     h->waiton = 0;
+    /* unrequest buffers (only needed for some drivers) */
+    h->reqbufs.count = 0;
+    xioctl(h->fd, VIDIOC_REQBUFS, &h->reqbufs, EINVAL);
+
     /* turn on preview (if needed) */
     if (h->ov_on != h->ov_enabled) {
         h->ov_on = h->ov_enabled;
@@ -907,6 +914,17 @@
     fmt->width        = h->fmt_v4l2.fmt.pix.width;
     fmt->height       = h->fmt_v4l2.fmt.pix.height;
     fmt->bytesperline = h->fmt_v4l2.fmt.pix.bytesperline;
+    /* struct v4l2_format.fmt.pix.bytesperline is bytesperline for the
+       main plane for planar formats, whereas we want it to be the total
+       bytesperline for all planes */
+    switch (fmt->fmtid) {
+    case VIDEO_YUV422P:
+        fmt->bytesperline *= 2;
+        break;
+    case VIDEO_YUV420P:
+        fmt->bytesperline = fmt->bytesperline * 3 / 2;
+        break;
+    }
     if (0 == fmt->bytesperline)
         fmt->bytesperline = fmt->width * ng_vfmt_to_depth[fmt->fmtid] / 8;
     h->fmt_me = *fmt;
Re: [PATCH] to libv4lconvert, to do decompression for sn9c2028 cameras
Hi, Thanks for the patch, but I see one big issue with this patch: the decompression algorithm is GPL, whereas libv4l is LGPL. Any chance you could get this relicensed to LGPL ? Regards, Hans On 05/24/2009 12:12 AM, Theodore Kilgore wrote: The purpose of the following patch is to do the decompression for the Sonix SN9C2028 cameras, which are already supported as still cameras in libgphoto2/camlibs/sonix. The decompression code is essentially identical to that which is used in the libgphoto2 driver, with minor changes to adapt it for libv4lconvert. The history and antecedents of this algorithm are described in libgphoto2/camlibs/sonix/README.sonix, which was Copyright (C) 2005 Theodore Kilgore kilg...@auburn.edu, as follows: The decompression algorithm originates, I understand, in the work of Bertrik Sikken for the sn9c102 cameras. In the macam project for MacOS-X camera support (webcam-osx project on Sourceforge), the decompression algorithm for the sn9c2028 cameras was developed by Mattias Krauss and adapted for use with the Vivitar Vivicam 3350B in particular by Harald Ruda hrx at users.sourceforge.net. Harald brought to my attention the work already done in the macam project, pointed out that it is GPL code, and invited me to have a look. Thanks, Harald. The decompression algorithm used here is similar to what is used in the macam driver, but is considerably streamlined and improved.
Signed-off-by: Theodore Kilgore kilg...@auburn.edu
---
diff -r 276a90c8ac40 v4l2-apps/libv4l/libv4lconvert/Makefile
--- a/v4l2-apps/libv4l/libv4lconvert/Makefile	Wed May 20 07:23:00 2009 +0200
+++ b/v4l2-apps/libv4l/libv4lconvert/Makefile	Wed May 20 13:10:53 2009 -0500
@@ -14,7 +14,7 @@
 CONVERT_OBJS = libv4lconvert.o tinyjpeg.o sn9c10x.o sn9c20x.o pac207.o \
 	mr97310a.o flip.o crop.o jidctflt.o spca561-decompress.o \
-	rgbyuv.o spca501.o sq905c.o bayer.o hm12.o \
+	rgbyuv.o sn9c2028-decomp.o spca501.o sq905c.o bayer.o hm12.o \
 	control/libv4lcontrol.o processing/libv4lprocessing.o \
 	processing/rgbprocessing.o processing/bayerprocessing.o
 TARGETS = $(CONVERT_LIB) libv4lconvert.pc
diff -r 276a90c8ac40 v4l2-apps/libv4l/libv4lconvert/libv4lconvert-priv.h
--- a/v4l2-apps/libv4l/libv4lconvert/libv4lconvert-priv.h	Wed May 20 07:23:00 2009 +0200
+++ b/v4l2-apps/libv4l/libv4lconvert/libv4lconvert-priv.h	Wed May 20 13:10:53 2009 -0500
@@ -51,6 +51,10 @@
 #define V4L2_PIX_FMT_MR97310A v4l2_fourcc('M','3','1','0')
 #endif
+#ifndef V4L2_PIX_FMT_SN9C2028
+#define V4L2_PIX_FMT_SN9C2028 v4l2_fourcc('S', 'O', 'N', 'X')
+#endif
+
 #ifndef V4L2_PIX_FMT_SQ905C
 #define V4L2_PIX_FMT_SQ905C v4l2_fourcc('9', '0', '5', 'C')
 #endif
@@ -193,6 +197,9 @@
 void v4lconvert_decode_mr97310a(const unsigned char *src, unsigned char *dst,
 	int width, int height);
+void v4lconvert_decode_sn9c2028(const unsigned char *src, unsigned char *dst,
+	int width, int height);
+
 void v4lconvert_decode_sq905c(const unsigned char *src, unsigned char *dst,
 	int width, int height);
diff -r 276a90c8ac40 v4l2-apps/libv4l/libv4lconvert/libv4lconvert.c
--- a/v4l2-apps/libv4l/libv4lconvert/libv4lconvert.c	Wed May 20 07:23:00 2009 +0200
+++ b/v4l2-apps/libv4l/libv4lconvert/libv4lconvert.c	Wed May 20 13:10:53 2009 -0500
@@ -60,6 +60,7 @@
 	{ V4L2_PIX_FMT_JPEG, V4LCONVERT_COMPRESSED },
 	{ V4L2_PIX_FMT_SPCA561, V4LCONVERT_COMPRESSED },
 	{ V4L2_PIX_FMT_SN9C10X, V4LCONVERT_COMPRESSED },
+	{ V4L2_PIX_FMT_SN9C2028, V4LCONVERT_COMPRESSED },
 	{ V4L2_PIX_FMT_PAC207, V4LCONVERT_COMPRESSED },
 	{ V4L2_PIX_FMT_MR97310A, V4LCONVERT_COMPRESSED },
 	{ V4L2_PIX_FMT_SQ905C, V4LCONVERT_COMPRESSED },
@@ -460,6 +461,7 @@
 	case V4L2_PIX_FMT_SN9C10X:
 	case V4L2_PIX_FMT_PAC207:
 	case V4L2_PIX_FMT_MR97310A:
+	case V4L2_PIX_FMT_SN9C2028:
 	case V4L2_PIX_FMT_SQ905C:
 	case V4L2_PIX_FMT_SBGGR8:
 	case V4L2_PIX_FMT_SGBRG8:
@@ -672,6 +674,7 @@
 	case V4L2_PIX_FMT_SN9C10X:
 	case V4L2_PIX_FMT_PAC207:
 	case V4L2_PIX_FMT_MR97310A:
+	case V4L2_PIX_FMT_SN9C2028:
 	case V4L2_PIX_FMT_SQ905C:
 	{
 		unsigned char *tmpbuf;
@@ -699,6 +702,10 @@
 		v4lconvert_decode_mr97310a(src, tmpbuf, width, height);
 		tmpfmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SBGGR8;
 		break;
+	case V4L2_PIX_FMT_SN9C2028:
+		v4lconvert_decode_sn9c2028(src, tmpbuf, width, height);
+		src_pix_fmt = V4L2_PIX_FMT_SBGGR8;
+		break;
 	case V4L2_PIX_FMT_SQ905C:
 		v4lconvert_decode_sq905c(src, tmpbuf, width, height);
 		tmpfmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SRGGB8;
diff -r 276a90c8ac40 v4l2-apps/libv4l/libv4lconvert/sn9c2028-decomp.c
--- /dev/null	Thu Jan 01 00:00:00 1970 +
+++ b/v4l2-apps/libv4l/libv4lconvert/sn9c2028-decomp.c	Wed May 20 13:10:53 2009 -0500
@@ -0,0 +1,158 @@
+/*
+ * sn9c2028-decomp.c
+ *
+ * Decompression function for the Sonix SN9C2028 dual-mode cameras.
+ *
+ * Code adapted from libgphoto2/camlibs/sonix, original version of which was
+ * Copyright (c) 2005 Theodore Kilgore kilg...@auburn.edu
+ *
+ * History:
+ *
+ * This decoding algorithm originates from the work of Bertrik Sikken for the
+ *
Re: What is libv4lconvert/sn9c202x.c for?
On 05/19/2009 10:35 PM, Theodore Kilgore wrote: I cannot seem to find any such devices which use this. So perhaps I am not looking in the right place and someone could point me there. In any event, it appears to me to have absolutely nothing at all to do with the decompression algorithm required by the SN9C2028 cameras. Those require a differential Huffman encoding scheme similar to what is in use for the MR97310a cameras, but with a few crucial differences which make it pretty much impossible to write one routine for both. But the code in the file libv4lconvert/sn9c202x.c appears to me to be no differential Huffman scheme at all but something entirely different. Hence my question. This is for the (not yet in the mainline kernel) sn9c20x driver. Just like there is a series of sn9c10x webcam bridges from Sonix, there also is a series of sn9c20x; these can do jpeg compression, but also their own custom (less CPU to decompress) YUV based compression, which is supported by libv4l, and that is what is in the sn9c20x.c file. Also note the file is called sn9c20x.c, not sn9c202x.c; iow this is completely unrelated to the sn9c2028 cameras, as this is not for sn9c202x but for sn9c20x. Hope this helps to clarify things. Regards, Hans p.s. The sn9c20x driver can be found here: https://groups.google.com/group/microdia Its developers are quite active; I wish they would get it merged into the mainline (and preferably first converted to a gspca subdriver; I'm not saying gspca is perfect, but it does save a lot of code duplication).
Re: Preliminary results with an SN9C2028 camera
On 05/16/2009 12:31 AM, Theodore Kilgore wrote: I decided recently to work on support for the SN9C2028 dual-mode cameras, which are supported as still cameras in libgphoto2/camlibs/sonix. Today, I succeeded in getting three frames out of one of them, using svv -gr, and I was able to convert two of the three frames to nice images using the same decompression algorithm which is used for the cameras in stillcam mode. There is a lot of work to do yet: support for all appropriate resolution settings (which are what? I do not yet know), support for all known cameras for which I can chase down an owner, and incorporation of the decompression code in libv4l. However, I thought you might like to know that some success has been achieved. Cool! I recently got a vivitar mini digital camera, usb id 093a 010e, CD302N according to gphoto, which also is a dual mode camera. It would be nice to get the webcam mode on this one supported too. Do you know if there has already been some base work done on that ? Regards, Hans
Re: [PATCH][libv4l] Support V4L2_CTRL_FLAG_NEXT_CTRL for fake controls
On 04/19/2009 12:45 AM, Adam Baker wrote: The fake controls added by libv4l to provide whitebalance on some cameras do not respect the V4L2_CTRL_FLAG_NEXT_CTRL and hence don't appear on control programs that try to use that flag if there are any driver controls that do support the flag. Add support for V4L2_CTRL_FLAG_NEXT_CTRL Signed-off-by: Adam Bakerli...@baker-net.org.uk Thanks, reviewed and tested looks fine, so it has been applied to my tree: http://linuxtv.org/hg/~hgoede/libv4l And will be in the next libv4l release. Regards, Hans
Re: working on webcam driver
On 05/14/2009 06:00 PM, MK wrote: Since I'm cross-posting this (as advised) I should introduce myself by saying I am a neophyte C programmer getting into kernel programming by trying to write a driver for an unsupported webcam. So far I've gotten the device to register and have enumerated the various interfaces. On 05/11/2009 02:50:00 PM, Erik Andrén wrote: First of all, this list is deprecated. Send mails to linux-media@vger.kernel.org if you want to reach the kernel community. Secondly, have you researched whether there is an existing driver for your camera? A good way to start would perhaps be to search for the usb id and linux in google to see if it generates some hits. I've done this last bit already, and I just checked out gspca. There is a lot of listing for the vendor id, but not the product id, so I imagine there is no point in trying any of the drivers (unless I hack the source to accept the id string). However, a rather unhelpful person at the linux driver backport group informs me not all USB video drivers are under drivers/media/video/usbvideo/ In fact, the majority of them are not. but then tells me I should take off and go find them myself with a web browser (very nice). Does anyone know where these drivers are located? Most non-UVC (see below) usb webcams are driven through the gspca usb webcam driver framework. This is a v4l driver which consists of gspca-core + various subdrivers for a lot of bridges, see drivers/media/video/gspca The same person also claims that the kernel now has support for all devices that follow the USB video class specification, and [sic] that the additional 23 device specific drivers in the tree* are just for non-conforming devices. This is correct; recently the USB consortium (or whatever they are called) have created a new spec called UVC, this is one standard protocol for all webcams to follow. All *newer* webcams use this, but a lot of cams still in the stores predate UVC (which stands for USB Video Class).
The first thing to find out to get your webcam supported is what kind of bridge chip it is using; try looking at the windows driver .inf file. Typical bridges are the sonix series (often referred to as sn9c10x or sn9c20x), spca5xx series, zc3xx, vc032x, etc. If you see a reference to anything like this in the windows driver .inf file (or inside dll's) that would be good to know. Also it would be very helpful to have the usb id of your camera. Regards, Hans
Re: [PATCH] libv4l: spca508 UV inversion
On 05/10/2009 09:11 AM, Jean-Francois Moine wrote: On Tue, 5 May 2009 11:08:46 +0200 Jean-Francois Moinemoin...@free.fr wrote: People with a spca508 webcam report a color inversion in the images. Here is a simple patch to fix this problem. Hello Hans, Sorry, this is not true for all webcams. Please, don't apply the patch! Ok, Let me know when you've found out to which webcams it does apply, I guess we need a new pixformat define for those ? Regards, Hans
Re: libv4l release: 0.5.97: the whitebalance release!
On 04/20/2009 06:43 AM, Erik Andrén wrote: Hans de Goede wrote: On 04/19/2009 09:20 PM, Erik Andrén wrote: 2009/4/19 Hans de Goedehdego...@redhat.com: On 04/18/2009 04:40 PM, Erik Andrén wrote: Hans de Goede wrote: On 04/17/2009 09:27 PM, Erik Andrén wrote: Hans de Goede wrote: On 04/16/2009 10:46 PM, Adam Baker wrote: On Thursday 16 Apr 2009, Hans de Goede wrote: On 04/16/2009 12:26 AM, Adam Baker wrote: On Wednesday 15 Apr 2009, Hans de Goede wrote: Currently only whitebalancing is enabled and only on Pixarts (pac) webcams (which benefit tremendously from this). To test this with other webcams (after installing this release) do: export LIBV4LCONTROL_CONTROLS=15 LD_PRELOAD=/usr/lib/libv4l/v4l2convert.so v4l2ucp Strangely while those instructions give me a whitebalance control for the sq905 based camera I can't get it to appear for a pac207 based camera regardless of whether LIBV4LCONTROL_CONTROLS is set. That's weird, there is a small bug in the handling of pac207 cams with usb id 093a:2476 causing libv4l to not automatically enable whitebalancing (and the control) for cams with that id, but if you have LIBV4LCONTROL_CONTROLS set (exported!) both when loading v4l2ucp (you must preload v4l2convert.so!) and when loading your viewer, then it should work. I've tested it by plugging in the sq905 camera, verifying the whitebalance control is present and working, unplugging the sq905 and plugging in the pac207 and using up arrow to restart v4l2ucp and svv so I think I've eliminated most finger trouble possibilities. The pac207 is id 093a:2460 so not the problem id. I'll have to investigate more thoroughly later. Does the pac207 perhaps have a / in its card string (see v4l-info output) ? if so try out this patch: http://linuxtv.org/hg/~hgoede/libv4l/rev/1e08d865690a I have the same issue as Adam when trying to test this with my gspca_stv06xx based Quickcam Web camera, i.e. no whitebalancing controls show up.
I'm attaching a dump which logs all available pixformats and v4l2ctrls showing that libv4l is properly loaded. (And yes, LIBV4LCONTROL_CONTROLS is exported and set to 15). Best regards, Erik Ah, you are using v4l2-ctl, not v4l2ucp, and that uses V4L2_CTRL_FLAG_NEXT_CTRL control enumeration. My code doesn't handle V4L2_CTRL_FLAG_NEXT_CTRL (which is a bug). I'm not sure when I'll have time to fix this. Patches welcome, or in the mean time use v4l2ucp to play with the controls. Actually, I've tried to use both without finding the controls. I've only tried with v4l2ucp v. 1.2. Is 1.3 necessary? Apparently there are different versions of v4l2ucp in different distro's and some do use the V4L2_CTRL_FLAG_NEXT_CTRL, just like v4l2-ctl. See Adam Baker's patch later in this thread. Which I will apply to my tree after I've reviewed it (when I find some time; currently I've a lot of $work$) Applying Adam Baker's patch makes the control appear _but_ I can't seem to make out any difference with any of the whitebalancing and normalize options, regardless of how I tweak the max / min values. Did you also do the export LIBV4LCONTROL_CONTROLS=15 in the terminal from where you are starting the viewing application ? Yes. Hmm, then the camera you are using probably already has some whitebalancing itself using the same algorithm. What happens if you enable normalize and then lower the high bound significantly? If that doesn't do anything either then somehow things are not working. Regards, Hans
Re: libv4l release: 0.5.97: the whitebalance release!
On 04/18/2009 04:40 PM, Erik Andrén wrote: Hans de Goede wrote: On 04/17/2009 09:27 PM, Erik Andrén wrote: Hans de Goede wrote: On 04/16/2009 10:46 PM, Adam Baker wrote: On Thursday 16 Apr 2009, Hans de Goede wrote: On 04/16/2009 12:26 AM, Adam Baker wrote: On Wednesday 15 Apr 2009, Hans de Goede wrote: Currently only whitebalancing is enabled and only on Pixarts (pac) webcams (which benefit tremendously from this). To test this with other webcams (after installing this release) do: export LIBV4LCONTROL_CONTROLS=15 LD_PRELOAD=/usr/lib/libv4l/v4l2convert.so v4l2ucp Strangely while those instructions give me a whitebalance control for the sq905 based camera I can't get it to appear for a pac207 based camera regardless of whether LIBV4LCONTROL_CONTROLS is set. That's weird, there is a small bug in the handling of pac207 cams with usb id 093a:2476 causing libv4l to not automatically enable whitebalancing (and the control) for cams with that id, but if you have LIBV4LCONTROL_CONTROLS set (exported!) both when loading v4l2ucp (you must preload v4l2convert.so!) and when loading your viewer, then it should work. I've tested it by plugging in the sq905 camera, verifying the whitebalance control is present and working, unplugging the sq905 and plugging in the pac207 and using up arrow to restart v4l2ucp and svv so I think I've eliminated most finger trouble possibilities. The pac207 is id 093a:2460 so not the problem id. I'll have to investigate more thoroughly later. Does the pac207 perhaps have a / in its card string (see v4l-info output) ? if so try out this patch: http://linuxtv.org/hg/~hgoede/libv4l/rev/1e08d865690a I have the same issue as Adam when trying to test this with my gspca_stv06xx based Quickcam Web camera, i.e. no whitebalancing controls show up. I'm attaching a dump which logs all available pixformats and v4l2ctrls showing that libv4l is properly loaded. (And yes, LIBV4LCONTROL_CONTROLS is exported and set to 15).
Best regards, Erik Ah, you are using v4l2-ctl, not v4l2ucp, and that uses V4L2_CTRL_FLAG_NEXT_CTRL control enumeration. My code doesn't handle V4L2_CTRL_FLAG_NEXT_CTRL (which is a bug). I'm not sure when I'll have time to fix this. Patches welcome, or in the mean time use v4l2ucp to play with the controls. Actually, I've tried to use both without finding the controls. I've only tried with v4l2ucp v. 1.2. Is 1.3 necessary? Apparently there are different versions of v4l2ucp in different distro's and some do use the V4L2_CTRL_FLAG_NEXT_CTRL, just like v4l2-ctl. See Adam Baker's patch later in this thread. Which I will apply to my tree after I've reviewed it (when I find some time; currently I've a lot of $work$) Regards, Hans
Re: libv4l release: 0.5.97: the whitebalance release!
On 04/19/2009 09:20 PM, Erik Andrén wrote: 2009/4/19 Hans de Goedehdego...@redhat.com: On 04/18/2009 04:40 PM, Erik Andrén wrote: Hans de Goede wrote: On 04/17/2009 09:27 PM, Erik Andrén wrote: Hans de Goede wrote: On 04/16/2009 10:46 PM, Adam Baker wrote: On Thursday 16 Apr 2009, Hans de Goede wrote: On 04/16/2009 12:26 AM, Adam Baker wrote: On Wednesday 15 Apr 2009, Hans de Goede wrote: Currently only whitebalancing is enabled and only on Pixarts (pac) webcams (which benefit tremendously from this). To test this with other webcams (after installing this release) do: export LIBV4LCONTROL_CONTROLS=15 LD_PRELOAD=/usr/lib/libv4l/v4l2convert.so v4l2ucp Strangely while those instructions give me a whitebalance control for the sq905 based camera I can't get it to appear for a pac207 based camera regardless of whether LIBV4LCONTROL_CONTROLS is set. That's weird, there is a small bug in the handling of pac207 cams with usb id 093a:2476 causing libv4l to not automatically enable whitebalancing (and the control) for cams with that id, but if you have LIBV4LCONTROL_CONTROLS set (exported!) both when loading v4l2ucp (you must preload v4l2convert.so!) and when loading your viewer, then it should work. I've tested it by plugging in the sq905 camera, verifying the whitebalance control is present and working, unplugging the sq905 and plugging in the pac207 and using up arrow to restart v4l2ucp and svv so I think I've eliminated most finger trouble possibilities. The pac207 is id 093a:2460 so not the problem id. I'll have to investigate more thoroughly later. Does the pac207 perhaps have a / in its card string (see v4l-info output) ? if so try out this patch: http://linuxtv.org/hg/~hgoede/libv4l/rev/1e08d865690a I have the same issue as Adam when trying to test this with my gspca_stv06xx based Quickcam Web camera, i.e. no whitebalancing controls show up. I'm attaching a dump which logs all available pixformats and v4l2ctrls showing that libv4l is properly loaded.
(And yes, LIBV4LCONTROL_CONTROLS is exported and set to 15). Best regards, Erik Ah, you are using v4l2-ctl, not v4l2ucp, and that uses V4L2_CTRL_FLAG_NEXT_CTRL control enumeration. My code doesn't handle V4L2_CTRL_FLAG_NEXT_CTRL (which is a bug). I'm not sure when I'll have time to fix this. Patches welcome, or in the mean time use v4l2ucp to play with the controls. Actually, I've tried to use both without finding the controls. I've only tried with v4l2ucp v. 1.2. Is 1.3 necessary? Apparently there are different versions of v4l2ucp in different distro's and some do use the V4L2_CTRL_FLAG_NEXT_CTRL, just like v4l2-ctl. See Adam Baker's patch later in this thread. Which I will apply to my tree after I've reviewed it (when I find some time; currently I've a lot of $work$) Applying Adam Baker's patch makes the control appear _but_ I can't seem to make out any difference with any of the whitebalancing and normalize options, regardless of how I tweak the max / min values. Did you also do the export LIBV4LCONTROL_CONTROLS=15 in the terminal from where you are starting the viewing application ? Regards, Hans
Re: libv4l release: 0.5.97: the whitebalance release!
On 04/17/2009 09:27 PM, Erik Andrén wrote: Hans de Goede wrote: On 04/16/2009 10:46 PM, Adam Baker wrote: On Thursday 16 Apr 2009, Hans de Goede wrote: On 04/16/2009 12:26 AM, Adam Baker wrote: On Wednesday 15 Apr 2009, Hans de Goede wrote: Currently only whitebalancing is enabled and only on Pixarts (pac) webcams (which benefit tremendously from this). To test this with other webcams (after installing this release) do: export LIBV4LCONTROL_CONTROLS=15 LD_PRELOAD=/usr/lib/libv4l/v4l2convert.so v4l2ucp Strangely while those instructions give me a whitebalance control for the sq905 based camera I can't get it to appear for a pac207 based camera regardless of whether LIBV4LCONTROL_CONTROLS is set. That's weird, there is a small bug in the handling of pac207 cams with usb id 093a:2476 causing libv4l to not automatically enable whitebalancing (and the control) for cams with that id, but if you have LIBV4LCONTROL_CONTROLS set (exported!) both when loading v4l2ucp (you must preload v4l2convert.so!) and when loading your viewer, then it should work. I've tested it by plugging in the sq905 camera, verifying the whitebalance control is present and working, unplugging the sq905 and plugging in the pac207 and using up arrow to restart v4l2ucp and svv so I think I've eliminated most finger trouble possibilities. The pac207 is id 093a:2460 so not the problem id. I'll have to investigate more thoroughly later. Does the pac207 perhaps have a / in its card string (see v4l-info output) ? if so try out this patch: http://linuxtv.org/hg/~hgoede/libv4l/rev/1e08d865690a I have the same issue as Adam when trying to test this with my gspca_stv06xx based Quickcam Web camera, i.e. no whitebalancing controls show up. I'm attaching a dump which logs all available pixformats and v4l2ctrls showing that libv4l is properly loaded. (And yes, LIBV4LCONTROL_CONTROLS is exported and set to 15).
Best regards, Erik Ah, you are using v4l2-ctl, not v4l2ucp, and that uses V4L2_CTRL_FLAG_NEXT_CTRL control enumeration. My code doesn't handle V4L2_CTRL_FLAG_NEXT_CTRL (which is a bug). I'm not sure when I'll have time to fix this. Patches welcome, or in the mean time use v4l2ucp to play with the controls. Regards, Hans
Re: Some questions about mr97310 controls (continuing previous thread on mr97310a.c)
Hi All, On 04/17/2009 12:50 AM, Theodore Kilgore wrote: On Thu, 16 Apr 2009, Thomas Kaiser wrote: snip Just how does it work to set the Compression Balance size? Is this some kind of special command sequence? Are we able to set this to whatever we want? It looks like it. One can set a value from 0x0 to 0xff in the Compression Balance size register (reg 0x4a). In the pac207 Linux driver, this register is set to 0xff to turn off the compression. While we use compression, 0x88 is set (I think the same value as in Windoz). Hans did play with this register and found out that the compression changes with different values. I wonder how this relates to the mr97310a. There is no such register present, that I can see. Hans, could you explain a bit more what you found out? (Yes, please.) Quoting from linux/drivers/media/video/gspca/pac207.c (easiest for me as it has been a while since I looked at this):
/* An exposure value of 4 also works (3 does not) but then we need to lower
   the compression balance setting when in 352x288 mode, otherwise the usb
   bandwidth is not enough and packets get dropped resulting in corrupt
   frames.
   The problem with this is that when the compression balance gets lowered
   below 0x80, the pac207 starts using a different compression algorithm for
   some lines, these lines get prefixed with a 0x2dd2 prefix and currently we
   do not know how to decompress these lines, so for now we use a minimum
   exposure value of 5 */
#define PAC207_EXPOSURE_MIN 5
#define PAC207_EXPOSURE_MAX 26
And from libv4l/libv4lconvert/pac207.c:
void v4lconvert_decode_pac207(const unsigned char *inp, unsigned char *outp,
	int width, int height)
{
	/* we should receive a whole frame with header and EOL marker in
	   myframe->data and return a GBRG pattern in frame->tmpbuffer;
	   remove the header then copy line by line. EOL is set with a
	   0x0f 0xf0 marker, or 0x1e 0xe1 for a compressed line */
	unsigned short word;
	int row;

	/* iterate over all rows */
	for (row = 0; row < height; row++) {
		word = getShort(inp);
		switch (word) {
		case 0x0FF0:
			memcpy(outp, inp + 2, width);
			inp += (2 + width);
			break;
		case 0x1EE1:
			inp += pac_decompress_row(inp, outp, width);
			break;
		case 0x2DD2:
			/* prefix for stronger compressed lines, currently
			   the kernel driver programs the cam so that we
			   should not get any of these */
		default:
			/* corrupt frame */
			/* FIXME add error reporting */
			return;
		}
		outp += width;
	}
So iow, the pac207 prefixes each row of a frame it sends out with 2 bytes, indicating the type of compression used: 0FF0 == no compression, 1EE1 == compression currently known in libv4l. But if you lower the compression balance register below 0x80, it will also send out rows prefixed with 2DD2, and we (I) have no clue how to decompress these. If we could find out how to handle these, that would be great, as we could then lower the exposure time more when in full daylight, curing the over exposure problems we have in full daylight. Regards, Hans
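To make the row framing Hans describes concrete, here is a toy decoder that only accepts uncompressed (0x0FF0) rows; the function name and error handling are mine, and it is deliberately not the libv4l decoder (which also handles the 0x1EE1 compressed rows):

```c
#include <string.h>

/* Hypothetical, simplified illustration of the pac207 row framing described
 * above: each row starts with a 2-byte marker (high byte first, as getShort()
 * reads it); 0x0F 0xF0 means the row payload is raw. Compressed (0x1E 0xE1)
 * and unknown (0x2D 0xD2) rows are simply rejected here. */
static int decode_uncompressed_rows(const unsigned char *inp, int inlen,
				    unsigned char *outp, int width, int height)
{
	int row, pos = 0;

	for (row = 0; row < height; row++) {
		if (pos + 2 + width > inlen)
			return -1; /* truncated frame */
		if (inp[pos] != 0x0F || inp[pos + 1] != 0xF0)
			return -1; /* compressed or corrupt row */
		memcpy(outp + row * width, inp + pos + 2, width);
		pos += 2 + width;
	}
	return 0;
}
```

With a hand-built two-row frame this copies the payload bytes straight through, which is all case 0x0FF0 in the real decoder does as well.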
Re: libv4l release: 0.5.97: the whitebalance release!
On 04/16/2009 10:46 PM, Adam Baker wrote: On Thursday 16 Apr 2009, Hans de Goede wrote: On 04/16/2009 12:26 AM, Adam Baker wrote: On Wednesday 15 Apr 2009, Hans de Goede wrote: Currently only whitebalancing is enabled and only on Pixarts (pac) webcams (which benefit tremendously from this). To test this with other webcams (after installing this release) do: export LIBV4LCONTROL_CONTROLS=15 LD_PRELOAD=/usr/lib/libv4l/v4l2convert.so v4l2ucp Strangely while those instructions give me a whitebalance control for the sq905 based camera I can't get it to appear for a pac207 based camera regardless of whether LIBV4LCONTROL_CONTROLS is set. That's weird, there is a small bug in the handling of pac207 cams with usb id 093a:2476 causing libv4l to not automatically enable whitebalancing (and the control) for cams with that id, but if you have LIBV4LCONTROL_CONTROLS set (exported!) both when loading v4l2ucp (you must preload v4l2convert.so!) and when loading your viewer, then it should work. I've tested it by plugging in the sq905 camera, verifying the whitebalance control is present and working, unplugging the sq905 and plugging in the pac207 and using up arrow to restart v4l2ucp and svv so I think I've eliminated most finger trouble possibilities. The pac207 is id 093a:2460 so not the problem id. I'll have to investigate more thoroughly later. Does the pac207 perhaps have a / in its card string (see v4l-info output) ? if so try out this patch: http://linuxtv.org/hg/~hgoede/libv4l/rev/1e08d865690a Thanks Regards, Hans
Re: libv4l release: 0.5.97: the whitebalance release!
On 04/16/2009 08:16 AM, Gilles Gigan wrote: Hans, I have tested libv4lconvert with a PCI Hauppauge HVR1300 DVB-T and found that v4lconvert_create() returns NULL. The problem comes from the shm_open calls in v4lcontrol_create() in libv4lcontrol.c (lines 187 and 190). libv4lconvert constructs the shared memory name based on the video device's name, and in this case the video device's name (literally Hauppauge WinTV-HVR1300 DVB-T/H) contains a slash, which makes both calls to shm_open() fail. I can put together a quick patch to replace '/' with '-' or white space if you want. Gilles Hi, Thanks for reporting this! Can you please test the attached patch to see if it fixes this? Thanks, Hans On Wed, Apr 15, 2009 at 10:36 PM, Hans de Goede j.w.r.dego...@hhs.nl wrote: Hi All, As the version number shows this is a beta release of the 0.6.x series. The big change here is the addition of video processing to libv4l; currently this only does whitebalance and normalizing (which turns out to be useless for most cams), but the basic framework for doing video processing, and being able to control it through fake v4l2 controls using for example v4l2ucp, is there. Currently only whitebalancing is enabled and only on Pixart (pac) webcams (which benefit tremendously from this). To test this with other webcams (after installing this release) do: export LIBV4LCONTROL_CONTROLS=15 LD_PRELOAD=/usr/lib/libv4l/v4l2convert.so v4l2ucp Notice the whitebalance and normalize checkboxes in v4l2ucp, as well as the low and high limits for normalize. Now start your favorite webcam viewing app and play around with the 2 checkboxes. Note normalize seems to be useless in most cases. If whitebalancing makes a *strongly noticeable* difference for your webcam, please mail me info about your cam (the usb id), then I can add it to the list of cams which will have the whitebalancing algorithm (and the v4l2 control to enable/disable it) enabled by default. 
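The shm_open() failure Gilles describes comes from the '/' in the card string; POSIX requires a shm name to start with a slash and (portably) contain no others. A minimal sketch of the kind of sanitization discussed (the function name and "/libv4l-" prefix are assumptions for illustration, not the actual libv4lcontrol code):

```c
#include <stdio.h>
#include <string.h>

/* Build a POSIX shm name from a V4L2 card string: keep the mandatory
 * leading '/', replace any further '/' (as in "DVB-T/H") with '-'. */
static void card_to_shm_name(const char *card, char *out, size_t outlen)
{
    size_t i;

    snprintf(out, outlen, "/libv4l-%s", card);
    for (i = 1; out[i]; i++)    /* skip the leading slash */
        if (out[i] == '/')
            out[i] = '-';
}
```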
Unfortunately doing videoprocessing can be quite expensive, as for example whitebalancing is quite hard to do in YUV space. So doing white balancing with the pac7302, with an app which wants YUV, changes the flow from: pixart-jpeg -> yuv420 -> rotate90 to: pixart-jpeg -> rgb24 -> whitebalance -> yuv420 -> rotate90 This is not a problem for cams which deliver (compressed) raw bayer, as bayer is rgb too, so I've implemented a version of the whitebalancing algorithm which operates directly on bayer data. So for bayer cams (like the pac207) it goes from: bayer -> yuv to: bayer -> whitebalance -> yuv For the near future I plan to change the code so that the analyse phase (which does not get done every frame) creates per component look up tables. This will make it easier to stack multiple effects in one pass, without special casing it as the current special normalize+whitebalance in one pass code does. Then we can add for example gamma correction with a negligible performance impact (when already doing white balancing that is). libv4l-0.5.97 - * As the version number shows this is a beta release of the 0.6.x series. The big change here is the addition of video processing to libv4l; currently this only does whitebalance and normalizing (which turns out to be useless for most cams), but the basic framework for doing video processing, and being able to control it through fake v4l2 controls using for example v4l2ucp, is there. The initial version of this code was written by 3 of my computer science students: Elmar Kleijn, Sjoerd Piepenbrink and Radjnies Bhansingh * Currently whitebalancing gets enabled based on USB IDs and it only gets enabled for Pixart webcams. You can force it being enabled with other webcams by setting the environment variable LIBV4LCONTROL_CONTROLS; this sets a bitmask enabling certain v4l2 controls which control the video processing. Set it to 15 to enable both whitebalancing and normalize. 
You can then change the settings using a v4l2 control panel like v4l2ucp * Only report / allow supported destination formats in enum_fmt / try_fmt / g_fmt / s_fmt when processing, rotating or flipping. * Some applications / libs (*cough* gstreamer *cough*) will not work correctly with planar YUV formats when the width is not a multiple of 8, so crop widths which are not a multiple of 8 to the nearest multiple of 8 when converting to planar YUV * Add dependency generation to libv4l, by: Gilles Gigan gilles.gi...@gmail.com * Add support to use orientation from VIDIOC_ENUMINPUT, by: Adam Baker li...@baker-net.org.uk * sn9c20x cams have occasional bad jpeg frames, drop these to avoid the flickering effect they cause, by: Brian Johnson brij...@gmail.com * adjust libv4l's upside down cam detection to also work with devices which have the usb interface as parent instead of the usb device * fix libv4l upside down detection for the new v4l minor numbering scheme * fix reading outside of the source memory when doing yuv420->rgb conversion Get it here:
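The per-component look-up-table plan mentioned above (fold whitebalance, normalize and, later, gamma into one table per color component, applied in a single pass) can be sketched as follows. The function name and buffer layout are illustrative assumptions, not libv4l code:

```c
/* Apply one 256-entry lookup table per component to an RGB24 buffer.
 * The occasional analyse pass would fill lut_r/g/b; the per-frame cost
 * is then one table lookup per byte, regardless of how many effects
 * (whitebalance, normalize, gamma, ...) were folded into the tables. */
static void apply_luts_rgb24(unsigned char *buf, int npixels,
                             const unsigned char lut_r[256],
                             const unsigned char lut_g[256],
                             const unsigned char lut_b[256])
{
    int i;

    for (i = 0; i < npixels; i++) {
        buf[0] = lut_r[buf[0]];
        buf[1] = lut_g[buf[1]];
        buf[2] = lut_b[buf[2]];
        buf += 3;
    }
}
```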
Re: libv4l: Possibility of changing the current pixelformat on the fly
On 04/05/2009 06:53 PM, Theodore Kilgore wrote: On Sun, 5 Apr 2009, Hans de Goede wrote: On 04/05/2009 01:26 PM, Erik Andrén wrote: -BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Hans de Goede wrote: On 04/04/2009 10:22 PM, Erik Andrén wrote: -BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Hi, While trying to get hflip and vflip working for the stv06xx webcam bridge coupled to the vv6410 sensor I've come across the following problem. When flipping the image horizontally, vertically or both, the sensor pixel ordering changes. In the m5602 driver I was able to compensate for this in the bridge code. In the stv06xx I don't have this option. One way of solving this problem is by changing the pixelformat on the fly, i. e V4L2_PIX_FMT_SGRB8 is the normal format. When a vertical flip is required, change the format to V4L2_SBGGR8. My current understanding of libv4l is that it probes the pixelformat upon device open. In order for this to work we would need either poll the current pixelformat regularly or implement some kind of notification mechanism upon a flipping request. What do you think is this the right way to go or is there another alternative. The changing of the pixelformat only happens when you flip the data before conversion. If you look at the current upside down handling code you will see it does the rotate 180 degrees after conversion. This is how the vflip / hflip should be handled too. We only have 4 (2 really since we don't care about r versus b / u versus v while flippiing) destination formats for which we then need to write flipping code. Otherwise we need to write flipping code for *all* supported input formats, not to mention flipping some input formats is close to impossible (JPEG for example). So you mean we should do the vflip/hflip in software, just exposing one native format? Erm, yes that is what I was saying, but that is because I was confusing things with the sq905 driver some other people are working on. Now I understand what you were trying to ask. 
So the problem is that the vv6410 sensor can do flipping in hardware, and then the order in which it sends out the pixels changes from gbgbgb (for example) to bgbgbg, for the lines which have blue, effectively changing the pixelformat, right? You mention the sq905 cameras, and the general problem of image flipping. You comment that the order of the data changes if any kind of flipping is done, with the result that the image format (Bayer tiling) changes. One difference I do see here is that the vv6410 sensor can do flipping in hardware, which the sq905 cameras obviously can not. However, the fact that the Bayer tiling of the image must change in accordance with the flipping is equally present. And I do not see how that problem could be avoided, on any occasion when flipping is needed. This brings up an interesting question of what would be the most efficient way actually to do the required image flipping: If the flipping is done before the finished image is produced, then the Bayer tiling of the image has changed. Therefore a different treatment is needed. If the flipping is done after the finished image is produced, then there is three times as much data, and the flipping might take longer (or might not if it were done exactly right?). True, still doing the flipping after the conversion is done is what we are currently doing for the rotate 180 case (so h-flip + v-flip) and is what I think we should also do for the regular h-flip and v-flip. Why? Simplicity! We support 4 different destination formats, which can be simplified to 3 for the flipping case (we do not need to care about uv order). So that means writing vflip + hflip + rotate 180 = 3 x 3 = 9 flipping routines. At the moment we support 23 different source formats, so doing the flipping at that level requires 3 x 23 = 69 flipping routines, of which we can shave off quite a bit by being smart here and there, but then we are still left with quite a large number. 
But the most important reason for me not to want to do this at the source format level is that I do not want to make it harder to add new source formats. Currently for a new source format, conversion routines for all 4 dest formats must be written, so that is 4 new conversion routines at worst. I do not want to make adding new formats harder. Regards, Hans
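The "flip after conversion" argument above can be made concrete: once the frame is a finished RGB24 image, a 180-degree rotation is just reversing the pixel order, with no knowledge of the source (bayer/JPEG) format needed. A minimal in-place sketch (the function name is illustrative, not the libv4l flip code):

```c
#include <string.h>

/* Rotate a finished RGB24 frame 180 degrees in place by swapping
 * 3-byte pixels from both ends toward the middle. One such routine
 * per destination format covers every source format at once. */
static void rotate180_rgb24(unsigned char *buf, int width, int height)
{
    unsigned char *a = buf;
    unsigned char *b = buf + (long)width * height * 3 - 3;
    unsigned char tmp[3];

    while (a < b) {
        memcpy(tmp, a, 3);
        memcpy(a, b, 3);
        memcpy(b, tmp, 3);
        a += 3;
        b -= 3;
    }
}
```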
Re: libv4l: Possibility of changing the current pixelformat on the fly
On 04/04/2009 10:22 PM, Erik Andrén wrote: -BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Hi, While trying to get hflip and vflip working for the stv06xx webcam bridge coupled to the vv6410 sensor I've come across the following problem. When flipping the image horizontally, vertically or both, the sensor pixel ordering changes. In the m5602 driver I was able to compensate for this in the bridge code. In the stv06xx I don't have this option. One way of solving this problem is by changing the pixelformat on the fly, i.e. V4L2_PIX_FMT_SGRBG8 is the normal format; when a vertical flip is required, change the format to V4L2_PIX_FMT_SBGGR8. My current understanding of libv4l is that it probes the pixelformat upon device open. In order for this to work we would need to either poll the current pixelformat regularly or implement some kind of notification mechanism upon a flipping request. What do you think, is this the right way to go, or is there another alternative? The changing of the pixelformat only happens when you flip the data before conversion. If you look at the current upside down handling code you will see it does the rotate 180 degrees after conversion. This is how the vflip / hflip should be handled too. We only have 4 (2 really, since we don't care about r versus b / u versus v while flipping) destination formats for which we then need to write flipping code. Otherwise we need to write flipping code for *all* supported input formats, not to mention flipping some input formats is close to impossible (JPEG for example). Regards, Hans p.s. One problem with this approach is that if an app asks for a native cam format which is not one which we can also convert to, the flipping won't work. I think this is best solved by simply not listing the native formats in the enum_fmt output when the cam needs flipping. 
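The pixelformat change Erik describes follows directly from the bayer grid: flipping a GRBG grid vertically makes the first line start with B, i.e. the frame becomes BGGR. A small table capturing all four cases (a sketch for illustration, not driver code):

```c
/* The four 8-bit bayer orders, named by their top-left 2x2 block. */
enum bayer_order { GRBG, RGGB, BGGR, GBRG };

/* What order the data has after a hardware flip: hflip swaps the two
 * columns of the 2x2 block, vflip swaps its two rows (assumes even
 * frame dimensions, as is the case for these sensors). */
static enum bayer_order flipped_order(enum bayer_order fmt,
                                      int hflip, int vflip)
{
    static const enum bayer_order hmap[4] = { RGGB, GRBG, GBRG, BGGR };
    static const enum bayer_order vmap[4] = { BGGR, GBRG, GRBG, RGGB };

    if (hflip)
        fmt = hmap[fmt];
    if (vflip)
        fmt = vmap[fmt];
    return fmt;
}
```

Note that hflip + vflip together give the rotate-180 order, which is why the existing upside-down handling never had to change the advertised format: it flips after conversion instead.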
Re: libv4l: Possibility of changing the current pixelformat on the fly
On 04/05/2009 01:26 PM, Erik Andrén wrote: -BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Hans de Goede wrote: On 04/04/2009 10:22 PM, Erik Andrén wrote: -BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Hi, While trying to get hflip and vflip working for the stv06xx webcam bridge coupled to the vv6410 sensor I've come across the following problem. When flipping the image horizontally, vertically or both, the sensor pixel ordering changes. In the m5602 driver I was able to compensate for this in the bridge code. In the stv06xx I don't have this option. One way of solving this problem is by changing the pixelformat on the fly, i. e V4L2_PIX_FMT_SGRB8 is the normal format. When a vertical flip is required, change the format to V4L2_SBGGR8. My current understanding of libv4l is that it probes the pixelformat upon device open. In order for this to work we would need either poll the current pixelformat regularly or implement some kind of notification mechanism upon a flipping request. What do you think is this the right way to go or is there another alternative. The changing of the pixelformat only happens when you flip the data before conversion. If you look at the current upside down handling code you will see it does the rotate 180 degrees after conversion. This is how the vflip / hflip should be handled too. We only have 4 (2 really since we don't care about r versus b / u versus v while flippiing) destination formats for which we then need to write flipping code. Otherwise we need to write flipping code for *all* supported input formats, not to mention flipping some input formats is close to impossible (JPEG for example). So you mean we should do the vflip/hflip in software, just exposing one native format? Erm, yes that is what I was saying, but that is because I was confusing things with the sq905 driver some other people are working on. Now I understand what you were trying to ask. 
So the problem is that the vv6410 sensor can do flipping in hardware, and then the order in which it sends out the pixels changes from gbgbgb (for example) to bgbgbg, for the lines which have blue, effectively changing the pixelformat, right? In that case I think the only solution is to simply return -EBUSY when the vflip / hflip controls are changed while a stream is active. As for the race window with one application querying the format (or even setting it) and then another app changing the flip before the application which just set the format starts the stream, we already have that wrt 2 applications doing this: app a: setformat x app b: setformat y app a: start stream (thinking format is x) Which is something which normally (luckily) never happens. Does that sound like a plan ? Note that this is a solution at the driver level, which IMHO is the only way as we cannot assume libv4l is always being used. Regards, Hans -- To unsubscribe from this list: send the line unsubscribe linux-media in the body of a message to majord...@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html
Re: libv4l: Possibility of changing the current pixelformat on the fly
On 04/05/2009 02:58 PM, Erik Andrén wrote: -BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Hans de Goede wrote: On 04/05/2009 01:26 PM, Erik Andrén wrote: -BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Hans de Goede wrote: On 04/04/2009 10:22 PM, Erik Andrén wrote: -BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Hi, While trying to get hflip and vflip working for the stv06xx webcam bridge coupled to the vv6410 sensor I've come across the following problem. When flipping the image horizontally, vertically or both, the sensor pixel ordering changes. In the m5602 driver I was able to compensate for this in the bridge code. In the stv06xx I don't have this option. One way of solving this problem is by changing the pixelformat on the fly, i. e V4L2_PIX_FMT_SGRB8 is the normal format. When a vertical flip is required, change the format to V4L2_SBGGR8. My current understanding of libv4l is that it probes the pixelformat upon device open. In order for this to work we would need either poll the current pixelformat regularly or implement some kind of notification mechanism upon a flipping request. What do you think is this the right way to go or is there another alternative. The changing of the pixelformat only happens when you flip the data before conversion. If you look at the current upside down handling code you will see it does the rotate 180 degrees after conversion. This is how the vflip / hflip should be handled too. We only have 4 (2 really since we don't care about r versus b / u versus v while flippiing) destination formats for which we then need to write flipping code. Otherwise we need to write flipping code for *all* supported input formats, not to mention flipping some input formats is close to impossible (JPEG for example). So you mean we should do the vflip/hflip in software, just exposing one native format? Erm, yes that is what I was saying, but that is because I was confusing things with the sq905 driver some other people are working on. 
Glad that you were confused as I couldn't really make out the meaning of your answer. :) Now I understand what you were trying to ask. So the problem is that the vv6410 sensor can do flipping in hardware, and then the order in which it sends out the pixels changes from gbgbgb (for example) to bgbgbg, for the lines which have blue, effectively changing the pixelformat, right? Correct. In that case I think the only solution is to simply return -EBUSY when the vflip / hflip controls are changed while a stream is active. So this effectively forces a reprobe of the hardware? No, it just disallows changing the flipping (and thus the format) while an app is actively streaming. so one can use something like v4l2ucp, and change the flipping while the cam is not being used otherwise, but if for example skype or cheese or whatever is streaming data from the cam then the flip will be disallowed (and a device busy error will be returned). This may not seem very user friendly but normally one does not want to mess with the flip controls anyways unless the cam is mounted upside down or something like that. As for the race window with one application querying the format (or even setting it) and then another app changing the flip before the application which just set the format starts the stream, we already have that wrt 2 applications doing this: app a: setformat x app b: setformat y app a: start stream (thinking format is x) Which is something which normally (luckily) never happens. If this ever becomes a real issue, shouldn't some kind of user-space master program aquire the device and then handle format requests for all client programs? Yes, one of these days we need to think (very hard) about a much better way for sharing v4l devices. Does that sound like a plan ? Yes, I'll try this ASAP. Note that this is a solution at the driver level, which IMHO is the only way as we cannot assume libv4l is always being used. If it works this sounds like the most clean and elegant solution. 
The weak point being that the application must try to renegotiate the format upon the -EBUSY signal. No, the idea is that the flipping will simply not change (hence the error, signaling that the setting of the flip control failed). Regards, Hans
Re: [PATCH v2 4/4] Add support to libv4l to use orientation from VIDIOC_ENUMINPUT
On 03/30/2009 12:28 AM, Adam Baker wrote: Add check to libv4l of the sensor orientation as reported by VIDIOC_ENUMINPUT Signed-off-by: Adam Baker li...@baker-net.org.uk Looks good, thanks. I'll apply this to my libv4l tree as soon as it's certain that the matching kernel changes will go into the kernel without any API changes. Thanks Regards, Hans
---
diff -r a647c2dfa989 v4l2-apps/lib/libv4l/libv4lconvert/libv4lconvert.c
--- a/v4l2-apps/lib/libv4l/libv4lconvert/libv4lconvert.c	Tue Jan 20 11:25:54 2009 +0100
+++ b/v4l2-apps/lib/libv4l/libv4lconvert/libv4lconvert.c	Sun Mar 29 22:59:56 2009 +0100
@@ -29,6 +29,11 @@
 #define MIN(a,b) (((a)<(b))?(a):(b))
 #define ARRAY_SIZE(x) ((int)sizeof(x)/(int)sizeof((x)[0]))
 
+/* Workaround this potentially being missing from videodev2.h */
+#ifndef V4L2_IN_ST_VFLIP
+#define V4L2_IN_ST_VFLIP 0x0020 /* Output is flipped vertically */
+#endif
+
 /* Note for proper functioning of v4lconvert_enum_fmt the first entries in
    supported_src_pixfmts must match with the entries in supported_dst_pixfmts */
 #define SUPPORTED_DST_PIXFMTS \
@@ -134,6 +139,7 @@
 	int i, j;
 	struct v4lconvert_data *data = calloc(1, sizeof(struct v4lconvert_data));
 	struct v4l2_capability cap;
+	struct v4l2_input input;
 
 	if (!data)
 		return NULL;
@@ -161,6 +167,13 @@
 	/* Check if this cam has any special flags */
 	data->flags = v4lconvert_get_flags(data->fd);
+	if ((syscall(SYS_ioctl, fd, VIDIOC_G_INPUT, &input.index) == 0) &&
+	    (syscall(SYS_ioctl, fd, VIDIOC_ENUMINPUT, &input) == 0)) {
+		/* Don't yet support independent HFLIP and VFLIP so getting
+		 * image the right way up is highest priority. */
+		if (input.status & V4L2_IN_ST_VFLIP)
+			data->flags |= V4LCONVERT_ROTATE_180;
+	}
 	if (syscall(SYS_ioctl, fd, VIDIOC_QUERYCAP, &cap) == 0) {
 		if (!strcmp((char *)cap.driver, "uvcvideo"))
 			data->flags |= V4LCONVERT_IS_UVC;
[PATCH]: gspca: use usb interface as parent
Hi all, As discussed in the: v4l parent for usb device interface or device? thread, here is a patch for gspca to make it use the usb interface as its parent device, instead of the usb device. Regards, Hans p.s. I'll also push a patch to my libv4l repo, with matching libv4l changes so that libv4l's upside down cam detection stays working with this change. Note: this libv4l patch also fixes libv4l upside down detection for the new device numbering style.
diff -r c28651a2c2c3 linux/drivers/media/video/gspca/gspca.c
--- a/linux/drivers/media/video/gspca/gspca.c	Thu Mar 26 09:44:15 2009 +0100
+++ b/linux/drivers/media/video/gspca/gspca.c	Fri Mar 27 10:32:24 2009 +0100
@@ -1958,7 +1958,7 @@
 	/* init video stuff */
 	memcpy(&gspca_dev->vdev, &gspca_template, sizeof gspca_template);
-	gspca_dev->vdev.parent = &dev->dev;
+	gspca_dev->vdev.parent = &intf->dev;
 	gspca_dev->module = module;
 	gspca_dev->present = 1;
 	ret = video_register_device(&gspca_dev->vdev,
Re: v4l parent for usb device interface or device?
On 03/25/2009 03:58 PM, Hans Verkuil wrote: Hi Hans, On Wednesday 25 March 2009 11:18:31 Hans de Goede wrote: take 2, this time to the new list, hoping it gets some more attention Hi, Today it came to my attention (through a libv4l bugreport) that the uvc driver and the gspca driver handle the setting of the v4l parent for usb webcams differently. The probe function for a usb driver gets passed in a struct usb_interface *intf parameter. uvc sets parent to: vdev->parent = &intf->dev; gspca uses: struct usb_device *dev = interface_to_usbdev(intf); vdev.parent = &dev->dev; Looking at what for example the usb mass-storage driver does (with my multi function printer/scanner with cardreader), which matches UVC, and thinking about how this is supposed to work with multifunction devices in general, I believe the uvc driver behaviour is correct, but before writing a patch for gspca, I thought it would be good to first discuss this on the list. So what do you think? I obviously agree with you :-) USB class drivers bind to interfaces instead of devices to support composite (multifunction) devices. While drivers for vendor-specific USB devices can bind to the device, in which case the parent could be a USB device, we need to have some consistency in the sysfs symlinks. Using a USB interface as the video device parent regardless of the device type makes sense. If the parent should indeed become the usb_interface, then we should make all v4l usb drivers consistent. And update v4l2-framework.txt. I've noticed before that it seems to be random what is used as the parent. I'm no USB expert, so I'm relying on your input. I believe that what uvc is doing is the right thing. USB explicitly allows for multi-function devices, where each function has a separate interface. So for example a still camera with a webcam mode could have 2 interfaces: a mass storage interface for the sdcard which stores the pictures, and a foo interface for the webcam mode. 
Clearly the right parent for the webcam v4l device then is the foo interface, and not the entire device (just like the mass storage driver will use the other interface as its parent). I think writing some docs about this and making all drivers consistent wrt this is a good plan. I will write a patch to make gspca do the right thing. Regards, Hans
Re: [RFC][PATCH 0/2] Sensor orientation reporting
Adam Baker wrote: On Monday 16 March 2009, Hans de Goede wrote: Both patches look good to me. A complaint about lack of documentation wouldn't have gone amiss. Er, good point. Regards, Hans Unfortunately, having just remembered that I should have done that, I'm struggling to get the current docbook to compile (so far I've suffered Ubuntu not packaging an old enough docbook, missing character set definition files, and the Makefile depending on bash but not explicitly requesting it, so getting dash). It looks like it now builds the docs, so I'm ready to start updating them. Adam
Re: [PATCH] libv4lconvert support for SQ905C decompression (revised)
kilg...@banach.math.auburn.edu wrote: Hans, From an abundance of caution, I thought I had better run the v4lconvert patch which supports the SQ905C compressed format through the checkpatch.pl process, too. The result of that process appears here, below the signed-off-by line. Theodore Kilgore Thanks, applied to my tree: http://linuxtv.org/hg/~hgoede/libv4l I expect to release libv4l-0.5.9 with this in, soon. Regards, Hans -- Forwarded message -- Date: Sun, 1 Mar 2009 17:45:32 -0600 (CST) From: kilg...@banach.math.auburn.edu To: Hans de Goede hdego...@redhat.com, linux-media@vger.kernel.org Subject: [PATCH] libv4lconvert support for SQ905C decompression Hans, Below is a patch for libv4lconvert, to support the decompression used by the SQ905C cameras (0x2770:0x905C) and some other related cameras. There is at the moment no support module for these cameras in streaming mode, but I intend to submit one. This contribution was created in whole by me, based upon code in libgphoto2 which was created in whole by me, and which was licensed for libgphoto2 under the LGPL license. 
Signed-off-by: Theodore Kilgore kilg...@auburn.edu
---
diff -uprN libv4lconvert-old/Makefile libv4lconvert-new/Makefile
--- libv4lconvert-old/Makefile	2009-03-01 15:37:38.0 -0600
+++ libv4lconvert-new/Makefile	2009-03-04 16:22:52.0 -0600
@@ -12,7 +12,7 @@ endif
 CONVERT_OBJS = libv4lconvert.o tinyjpeg.o sn9c10x.o sn9c20x.o pac207.o \
 	mr97310a.o flip.o crop.o jidctflt.o spca561-decompress.o \
-	rgbyuv.o spca501.o bayer.o
+	rgbyuv.o spca501.o sq905c.o bayer.o
 
 TARGETS = $(CONVERT_LIB) libv4lconvert.pc
 INCLUDES = ../include/libv4lconvert.h
diff -uprN libv4lconvert-old/libv4lconvert-priv.h libv4lconvert-new/libv4lconvert-priv.h
--- libv4lconvert-old/libv4lconvert-priv.h	2009-03-01 15:37:38.0 -0600
+++ libv4lconvert-new/libv4lconvert-priv.h	2009-03-04 16:22:52.0 -0600
@@ -47,6 +47,10 @@
 #define V4L2_PIX_FMT_MR97310A v4l2_fourcc('M','3','1','0')
 #endif
 
+#ifndef V4L2_PIX_FMT_SQ905C
+#define V4L2_PIX_FMT_SQ905C v4l2_fourcc('9', '0', '5', 'C')
+#endif
+
 #ifndef V4L2_PIX_FMT_PJPG
 #define V4L2_PIX_FMT_PJPG v4l2_fourcc('P', 'J', 'P', 'G')
 #endif
@@ -180,6 +184,9 @@ void v4lconvert_decode_pac207(const unsi
 void v4lconvert_decode_mr97310a(const unsigned char *src, unsigned char *dst,
 	int width, int height);
 
+void v4lconvert_decode_sq905c(const unsigned char *src, unsigned char *dst,
+	int width, int height);
+
 void v4lconvert_bayer_to_rgb24(const unsigned char *bayer,
 	unsigned char *rgb, int width, int height, unsigned int pixfmt);
diff -uprN libv4lconvert-old/libv4lconvert.c libv4lconvert-new/libv4lconvert.c
--- libv4lconvert-old/libv4lconvert.c	2009-03-01 15:37:38.0 -0600
+++ libv4lconvert-new/libv4lconvert.c	2009-03-04 16:22:52.0 -0600
@@ -61,6 +61,7 @@ static const struct v4lconvert_pixfmt su
 	{ V4L2_PIX_FMT_SN9C10X, V4LCONVERT_COMPRESSED },
 	{ V4L2_PIX_FMT_PAC207, V4LCONVERT_COMPRESSED },
 	{ V4L2_PIX_FMT_MR97310A, V4LCONVERT_COMPRESSED },
+	{ V4L2_PIX_FMT_SQ905C, V4LCONVERT_COMPRESSED },
 	{ V4L2_PIX_FMT_PJPG, V4LCONVERT_COMPRESSED },
 };
 
@@ -608,6 +609,7 @@ static int 
v4lconvert_convert_pixfmt(str
 	case V4L2_PIX_FMT_SN9C10X:
 	case V4L2_PIX_FMT_PAC207:
 	case V4L2_PIX_FMT_MR97310A:
+	case V4L2_PIX_FMT_SQ905C:
 	{
 		unsigned char *tmpbuf;
@@ -633,6 +635,10 @@ static int v4lconvert_convert_pixfmt(str
 		v4lconvert_decode_mr97310a(src, tmpbuf, width, height);
 		src_pix_fmt = V4L2_PIX_FMT_SBGGR8;
 		break;
+	case V4L2_PIX_FMT_SQ905C:
+		v4lconvert_decode_sq905c(src, tmpbuf, width, height);
+		src_pix_fmt = V4L2_PIX_FMT_SRGGB8;
+		break;
 	}
 	src = tmpbuf;
 	/* Deliberate fall through to raw bayer fmt code! */
diff -uprN libv4lconvert-old/sq905c.c libv4lconvert-new/sq905c.c
--- libv4lconvert-old/sq905c.c	1969-12-31 18:00:00.0 -0600
+++ libv4lconvert-new/sq905c.c	2009-03-04 16:27:17.0 -0600
@@ -0,0 +1,217 @@
+/*
+ * sq905c.c
+ *
+ * Here is the decompression function for the SQ905C cameras. The functions
+ * used are adapted from the libgphoto2 functions for the same cameras,
+ * which was
+ * Copyright (c) 2005 and 2007 Theodore Kilgore kilg...@auburn.edu
+ * This version for libv4lconvert is
+ * Copyright (c) 2009 Theodore Kilgore kilg...@auburn.edu
+ *
+ * This program is free software; you can redistribute it and/or modify
+ * it under the terms of the GNU Lesser General Public License as published by
+ * the Free Software Foundation; either version 2.1 of the License, or
+ * (at your option) any later version.
+ *
+ * This program is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS
Re: RFC on proposed patches to mr97310a.c for gspca and v4l
kilg...@banach.math.auburn.edu wrote: On Thu, 5 Mar 2009, Hans de Goede wrote: kilg...@banach.math.auburn.edu wrote: On Thu, 5 Mar 2009, Hans de Goede wrote: Kyle Guinn wrote: On Wednesday 04 March 2009 22:34:13 kilg...@banach.math.auburn.edu wrote: On Wed, 4 Mar 2009, Kyle Guinn wrote: On Tuesday 03 March 2009 18:12:33 kilg...@banach.math.auburn.edu wrote: contents of file mr97310a.patch follow, for gspca/mr97310a.c
--- mr97310a.c.old	2009-02-23 23:59:07.0 -0600
+++ mr97310a.c.new	2009-03-03 17:19:06.0 -0600
@@ -302,21 +302,9 @@ static void sd_pkt_scan(struct gspca_dev
 			data, n);
 		sd->header_read = 0;
 		gspca_frame_add(gspca_dev, FIRST_PACKET, frame, NULL, 0);
-		len -= sof - data;
-		data = sof;
-	}
-	if (sd->header_read < 7) {
-		int needed;
-
-		/* skip the rest of the header */
-		needed = 7 - sd->header_read;
-		if (len <= needed) {
-			sd->header_read += len;
-			return;
-		}
-		data += needed;
-		len -= needed;
-		sd->header_read = 7;
+		/* keep the header, including sof marker, for coming frame */
+		len -= n;
+		data = sof - sizeof pac_sof_marker;;
 	}
 	gspca_frame_add(gspca_dev, INTER_PACKET, frame, data, len);
A few notes: 1. There is an extra semicolon on that last added line. Oops. My bifocals. 2. sd->header_read no longer seems necessary. This is very likely true. 3. If the SOF marker is split over two transfers then everything falls apart. Are you sure about that? Simple example: One transfer ends with FF FF 00 and the next begins with FF 96 64. pac_find_sof() returns a pointer to 64, n is set to 0, len stays the same, data now points at 3 bytes _before_ the transfer buffer, and we will most likely get undefined behavior when trying to copy the data out of the transfer buffer. Not only that, but the FF FF 00 portion of the SOF won't get copied to the frame buffer. Good point, since we will always pass frames to userspace which start with the sof, maybe we should just only pass the variable part of the header to userspace? That sure feels like the easiest solution to me. 
Regards, Hans

Hans, that would not solve the problem. In fact, it appears to me that this problem was already inherent in the driver code before I proposed any patches at all.

Erm, if I understood correctly (I haven't looked yet) the driver is working with the sof detection from pac_common, which does work with a SOF split over multiple frames.

That is not my impression of what the code in pac_common is doing. That code, as I understand it, is totally neutral about such things. What it does is to parse some data and search for an SOF marker, and if it finds such a thing then it declares the next byte after it to be what it calls sof. Specifically, there is the function

static unsigned char *pac_find_sof(struct gspca_dev *gspca_dev,
				   unsigned char *m, int len)

and what it does is search through unsigned char *m up to the extent declared in int len, looking for an SOF marker. If it finds one, then it returns the location of the next byte after the SOF marker has been successfully read.

Check that function again, more carefully: if it fails, but finds part of the sof at the end of m, it remembers how much of the sof it has already seen and continues where it left off the next time it is called.

Regards,

Hans

--
To unsubscribe from this list: send the line "unsubscribe linux-media" in the body of a message to majord...@vger.kernel.org
More majordomo info at http://vger.kernel.org/majordomo-info.html
Re: RFC on proposed patches to mr97310a.c for gspca and v4l
kilg...@banach.math.auburn.edu wrote:

Hans, Jean-Francois, and Kyle,

The proposed patches are not very long, so I will give each of them, with my comments after each, to explain why I believe that these changes are a good idea. First, the patch to libv4lconvert is short and sweet:

contents of file mr97310av4l.patch follow --

--- mr97310a.c.old	2009-03-01 15:37:38.000000000 -0600
+++ mr97310a.c.new	2009-02-18 22:39:48.000000000 -0600
@@ -102,6 +102,9 @@ void v4lconvert_decode_mr97310a(const un
 	if (!decoder_initialized)
 		init_mr97310a_decoder();
 
+	/* remove the header */
+	inp += 12;
+
 	bitpos = 0;
 
 	/* main decoding loop */

- here ends the v4lconvert patch --

The reason I want to do this should be obvious. It is to preserve the entire header of each frame over in the gspca driver, and to throw it away over here. The SOF marker FF FF 00 FF 96 is also kept. The reason why all of this should be kept is that it makes it possible to look at a raw output and to know whether it is exactly aligned or not. Furthermore, the next byte after the 96 is a code for the compression algorithm used, and the bytes after that in the header might be useful in the future for better image processing. In other words, these headers contain information which might be useful in the future and they should not be jettisoned in the kernel module.

+1

Now, the kernel module ought to keep and send along the header and SOF marker instead of throwing them away. This is the topic of the next patch. It also has the virtue of simplifying and shortening the code in the module at the same time, because one is not going through contortions to skip over and throw away some data which ought to be kept anyway.
+1

contents of file mr97310a.patch follow, for gspca/mr97310a.c

--- mr97310a.c.old	2009-02-23 23:59:07.000000000 -0600
+++ mr97310a.c.new	2009-03-03 17:19:06.000000000 -0600
@@ -302,21 +302,9 @@ static void sd_pkt_scan(struct gspca_dev
 				data, n);
 			sd->header_read = 0;
 			gspca_frame_add(gspca_dev, FIRST_PACKET, frame, NULL, 0);
-			len -= sof - data;
-			data = sof;
-		}
-		if (sd->header_read < 7) {
-			int needed;
-
-			/* skip the rest of the header */
-			needed = 7 - sd->header_read;
-			if (len <= needed) {
-				sd->header_read += len;
-				return;
-			}
-			data += needed;
-			len -= needed;
-			sd->header_read = 7;
+			/* keep the header, including sof marker, for coming frame */
+			len -= n;
+			data = sof - sizeof pac_sof_marker;;
 		}
 	gspca_frame_add(gspca_dev, INTER_PACKET, frame, data, len);
@@ -337,6 +325,7 @@ static const struct sd_desc sd_desc = {
 /* -- module initialisation -- */
 static const __devinitdata struct usb_device_id device_table[] = {
 	{USB_DEVICE(0x08ca, 0x0111)},
+	{USB_DEVICE(0x093a, 0x010f)},
 	{}
 };
 MODULE_DEVICE_TABLE(usb, device_table);

end of mr97310a.patch -

You will also notice that I have added a USB ID. As I have mentioned, I have four cameras with this ID. The story with them is that two of them will not work at all. The module will not initialize the camera. As far as the other two of them are concerned, the module and the accompanying change in libv4lconvert work very well. I have mentioned this previously, and I did not get any comment about what is good to do. So now I decided to submit the ID number in the patch.

Adding the USB-ID sounds like the right thing to do.

Regards,

Hans
Re: RFC on proposed patches to mr97310a.c for gspca and v4l
Kyle Guinn wrote:
On Tuesday 03 March 2009 18:12:33 kilg...@banach.math.auburn.edu wrote:

snip

Just a random thought, but maybe the pac207 driver can benefit from such a change as well?

It could, but it is too late for that; the pac207 driver and corresponding libv4l functionality have been out there for 2 kernel releases now, so we cannot change that.

Which also makes me wonder about the same change for the mr97310a: is that cam already supported in a released kernel? If not, we MUST make sure we get this change in before 2.6.29 final; if it is, I'm afraid we cannot make these changes. In that case, if we ever need the header data we need to define a new PIXFMT for mr97310a with the header data, and deprecate the old one.

Regards,

Hans
Re: RFC on proposed patches to mr97310a.c for gspca and v4l
Kyle Guinn wrote:
On Wednesday 04 March 2009 22:34:13 kilg...@banach.math.auburn.edu wrote:
On Wed, 4 Mar 2009, Kyle Guinn wrote:
On Tuesday 03 March 2009 18:12:33 kilg...@banach.math.auburn.edu wrote:

contents of file mr97310a.patch follow, for gspca/mr97310a.c

--- mr97310a.c.old	2009-02-23 23:59:07.000000000 -0600
+++ mr97310a.c.new	2009-03-03 17:19:06.000000000 -0600
@@ -302,21 +302,9 @@ static void sd_pkt_scan(struct gspca_dev
 				data, n);
 			sd->header_read = 0;
 			gspca_frame_add(gspca_dev, FIRST_PACKET, frame, NULL, 0);
-			len -= sof - data;
-			data = sof;
-		}
-		if (sd->header_read < 7) {
-			int needed;
-
-			/* skip the rest of the header */
-			needed = 7 - sd->header_read;
-			if (len <= needed) {
-				sd->header_read += len;
-				return;
-			}
-			data += needed;
-			len -= needed;
-			sd->header_read = 7;
+			/* keep the header, including sof marker, for coming frame */
+			len -= n;
+			data = sof - sizeof pac_sof_marker;;
 		}
 	gspca_frame_add(gspca_dev, INTER_PACKET, frame, data, len);

A few notes:

1. There is an extra semicolon on that last added line.

Oops. My bifocals.

2. sd->header_read no longer seems necessary.

This is very likely true.

3. If the SOF marker is split over two transfers then everything falls apart.

Are you sure about that?

Simple example: One transfer ends with FF FF 00 and the next begins with FF 96 64. pac_find_sof() returns a pointer to 64, n is set to 0, len stays the same, data now points at 3 bytes _before_ the transfer buffer, and we will most likely get undefined behavior when trying to copy the data out of the transfer buffer. Not only that, but the FF FF 00 portion of the SOF won't get copied to the frame buffer.

Good point. Since we will always pass frames to userspace which start with the sof, maybe we should just pass only the variable part of the header to userspace? That sure feels like the easiest solution to me.
Regards, Hans
Re: RFC on proposed patches to mr97310a.c for gspca and v4l
Kyle Guinn wrote:
On Wednesday 04 March 2009 02:39:11 Hans de Goede wrote:

Which also makes me wonder about the same change for the mr97310a: is that cam already supported in a released kernel? If not, we MUST make sure we get this change in before 2.6.29 final; if it is, I'm afraid we cannot make these changes. In that case, if we ever need the header data we need to define a new PIXFMT for mr97310a with the header data, and deprecate the old one.

I don't believe the driver has made it to any kernel yet. Even if it has, the user would need to have an unreleased version of libv4l. I think this change would inconvenience me and Theodore at most. Let's change it now.

+1

Regards,

Hans
Re: [RFC] How to pass camera Orientation to userspace
really big snip

So, what do these two deep questions, which confound the assembled wisdom of an entire list of Linux video developers, have to do with tables in userspace? None that I can see, unless someone wants to provide a mechanism for the information, having been collected in the module, to be available to the table in userspace. I'm not saying that userspace tables would solve all problems. I'm just saying that this should be part of the solution.

For sure we need to have a way of retrieving this information for devices like the sq905 cameras, where the information can't currently be determined by userspace.

In the case of sq905, this information is static, right? If so, IMO, the better approach is to use a flag in the v4l2_input, as already discussed in this thread.

Yes, we all seem to agree on this. Adam, since you started this thread, can you write a small RFC with that solution worked out, with proposed videodev2.h changes?

Note that I'm only talking about input flags for the orientation problem and not the pivoting problem. I think the pivoting problem may need some more discussion. But since we all seem to be in agreement wrt the orientation problem and specifically the sq905 problem, let's do one more RFC, then everyone does a +1 to that and we move forward with this as a solution for the orientation problem.

Regards,

Hans

p.s. For the pivoting problem I'm tending towards a special control class which contains read-only controls which are really camera properties. This will allow us to cope with any granularity of pivoting sensors. This could then also be used for, for example, aperture. But let's start a new thread for that.
Re: [RFC] How to pass camera Orientation to userspace
Trent Piepho wrote:
On Mon, 23 Feb 2009, Hans de Goede wrote:
Trent Piepho wrote:
On Sun, 22 Feb 2009, Hans de Goede wrote:
Trent Piepho wrote:
On Sun, 22 Feb 2009, Hans de Goede wrote:

Yes, that is what we are talking about: the camera having a gravity switch (usually nothing as advanced as a gyroscope). Also, the bits we are talking about are in a struct which communicates information one way, from the camera to userspace, so there is no way to clear the bits to make the camera do something.

First, I'd like to say I agree with most that the installed orientation of the camera sensor really is a different concept than the current value of a gravity sensor. It's not necessary, and maybe not even desirable, to handle them in the same way.

I do not see the advantage of using reserved bits instead of controls. There are a limited number of reserved bits. In some structures there are only a few left. They will run out. Then what? Packing non-standard sensor attributes and camera sensor meta-data into a few reserved bits is not a sustainable policy. Controls on the other hand are not limited and won't run out.

Yes, but these things are *not* controls, end of discussion. The control API is for controls, not to stuff all kinds of cruft in.

All kinds of cruft belong in the reserved bits of whatever field they can be stuffed in?

Not whatever field; these are input properties which happen to also be pretty binary, so putting them in the input flags field makes plenty of sense.

What is the difference? Why does it matter? Performance? Maintenance? Is there something that's not possible? I do not find "end of discussion" to be a very convincing argument.

Well, they are not controls, that is the difference; the control interface is for controls (and only for controls, end of discussion if you ask me). These are not controls but properties: they do not have a default, min and max value,

Camera pivot sensor ranges from 0 to 270. How is that not a min and max?
they have only one *unchanging* value, there is nothing the application can

Camera sensors don't have an unchanging value. And who says scan order can't change? Suppose the camera returns raw bayer format data top to bottom, but if you request yuv then an image processing section needs to kick in and that returns the data bottom to top.

Yes, because hardware designers like throwing away lots of transistors on memory, so they are going to put memory in the controller to buffer an entire frame and then scan out the memory buffer in a different order than the sensor gave them the data; so they cannot do FIFO, so they will actually need 2 frames of memory.

If the sensor is soldered upside down on the PCB, that is very much an unchanging value, and an input property if you ask me.

So, new proposal:

Use 2 bits in the input flags to indicate if the input is hardwired vflipped and/or hflipped.

Create a new class of controls for querying possibly changing camera properties like pivoting and aperture.

Regards,

Hans
Re: [RFC] How to pass camera Orientation to userspace
kilg...@banach.math.auburn.edu wrote:
On Sun, 22 Feb 2009, Hans de Goede wrote:
kilg...@banach.math.auburn.edu wrote:

snip

Hans and Adam,

I am not sure how it fits into the above discussion, but perhaps it is relevant to point out that flags can be toggled. Here is what I mean:

Suppose that we have two flags 01 and 10 (i.e. 2), and 01 signifies VFLIP and 10 signifies HFLIP. Then for an ordinary camera in ordinary position these are initialized as 00. If the ordinary camera is turned in some funny way (and it is possible to know that) then one or both of these flags gets turned on. But if it is a funny camera like some of the SQ905s, the initial values are 1 and 1, because the sensor is in fact mounted upside down.

Now, suppose that there is some camera in the future which, just like this, has the sensor upside down, and suppose that said hypothetical camera also has the ability to know that it has been turned over so what was upside down is now right side up. Well, all that one has to do is to flip the two bits from whatever they were to have instead the opposite values!

Observe that this would take care of the orientation problem both for cameras which had the sensor mounted right in the first place, and for cameras which had the sensor mounted wrong in the first place. Just use the same two bits to describe the sensor orientation, and if there is any reason (based upon some ability to know that the camera orientation is now different) that the orientation should change, then just flip the bits as appropriate. Then it would be the job of the support module to provide proper initial values only for these bits, and everything else could be done later on, in userspace.
Theodore Kilgore

Theodore,

We want to be able to differentiate between a cam which has its sensor mounted upside down, and a cam which can be pivoted and happens to be upside down at the moment. In the case of an upside-down mounted sensor we will always want to compensate; in the case of a pivoting camera, whether we compensate or not could be a user preference.

So in your example of an upside-down mounted sensor in a pivoting encasing, with the encasing pivoted 180 degrees, we would have the hflip and vflip bits set for sensor orientation and we would have the pivoted-180-degrees bit set. If the user has chosen to compensate for pivoting (the default), we would do nothing. But it is important to be able to differentiate between the 2.

Hans,

I am not sure if we are talking past each other, or what. But I was pointing out that the initial values of two bits can indicate the default orientation of the sensor, and this can be done permanently in the module, which transmits the initial setting of those two bits to anything up the line which is interested or curious to know those initial values. The information in those two bits will definitely tell whether the sensor is mounted upside down in the camera. For example, if it is mounted upside down, then they are both set in the module to on and exported therefrom. But if the sensor is mounted correctly, then both of them are set to off and similarly exported.

Now if any application for any reason (such as knowing that the camera is upside down or is pointing in the opposite direction, or into a mirror) wants to change the defaults, all it has to do is to toggle the bits. But, hmmm. Perhaps there is the question about how the app knows that the camera is upside down or is pointed in another direction. If the camera has a gyroscope inside, for example, then it could be the camera which needs to tell the app about the current orientation, or else the app would not have any way to know ... Is this the problem, then?
For that kind of thing, one might need more than two bits in order to pass the needed information.

Yes, that is what we are talking about: the camera having a gravity switch (usually nothing as advanced as a gyroscope). Also, the bits we are talking about are in a struct which communicates information one way, from the camera to userspace, so there is no way to clear the bits to make the camera do something.

Regards,

Hans
Re: [RFC] How to pass camera Orientation to userspace
DongSoo(Nathaniel) Kim wrote:

Hello Adam,

I've been thinking about exactly the same issue, not for usb but for SoC based cameras. I have no idea how usb cameras work, but I am quite curious whether it is really possible to get the proper orientation by only querying the camera driver. In the case of SoC based camera devices, many camera ISPs support VFLIP and HFLIP registers of their own, and we can read the current orientation by reading those registers. But the problem is that the ISP's registers say not flipped at all even though the module is physically mounted upside down, because the H/W vendor has packed the camera module upside down. (it sounds ridiculous but happens sometimes)

That happens a lot with webcams too. Given that these SoC systems will come with some board specific config anyway, all that is needed is to pass some boardconfig in to the camera driver (through platform data for example) which tells the camera driver that on this board the sensor is mounted upside down.

So in that case when we query the orientation of the camera, it returns not flipped vertically or horizontally at all, but it actually turns out to be upside down. Actually we are setting the camera device to be flipped by default in that case.

Ack, but the right thing to do is not to set the vflip and hflip video4linux2 controls on by default, but to invert their meaning. So when the sensor is upside down, the hflip and vflip controls as seen by the application through the v4l2 API will report no flipping, but the hw controls will actually be set to flipping, and when an app enables flipping at the v4l2 API level it will actually get disabled at the HW level. This way the upside-downness is 100% hidden from userspace.

So your problem does not need any of the new API we are working on. The new API is for when the hardware cannot flip and we need to tell userspace to correct for this in software.
Regards, Hans
Re: Adding a control for Sensor Orientation
kilg...@banach.math.auburn.edu wrote:

huge snip

Therefore,

1. Everyone seems to agree that the kernel module itself is not going to do things like rotate or flip data even if a given supported device always needs that done. However, this decision has a consequence:

2. Therefore, the module must send the information about what is needed out of the module, to whatever place is going to deal with it. Information which is known to the module but unknown anywhere else must be transmitted somehow. Now there is a further consequence:

3. In view of (1) and (2) there has to be an agreed-upon way for the module to pass the relevant information onward.

It is precisely on item 3 that we are stuck right now. There is an immediate need, not a theoretical need but an immediate need. However, there is no agreed-upon method or convention for communication.

We are no longer stuck here; the general agreement is adding 2 new buffer flags, one to indicate the driver knows the data in the buffer is vflipped and one for hflip. Then we can handle v-flipped, h-flipped and 180 degree cameras.

This is agreed upon. Trent is arguing we may need more flags in the future, but that is something for the future; all we need now is these 2 flags, and Hans Verkuil, who AFAIK was the only one objecting to doing this with buffer flags, has agreed this is the best solution.

So Adam, kilgota, please ignore the rest of this thread and move forward with the driver; just add the necessary buffer flags to videodev2.h as part of your patch. (It is usual to submit new API stuff in the same patch which introduces the first users of this API.) I welcome libv4l patches to use these flags.

Regards,

Hans
Re: Adding a control for Sensor Orientation
Hans Verkuil wrote:
On Monday 16 February 2009 05:04:40 Trent Piepho wrote:
On Sun, 15 Feb 2009, Mauro Carvalho Chehab wrote:
On Sun, 15 Feb 2009 10:29:03 +0100 Hans de Goede hdego...@redhat.com wrote:

I think we should also be able to detect 90 and 270 degree rotations. Or at the very least prepare for it. It's a safe bet to assume that webcams will arrive that can detect portrait vs landscape orientation.

Handling those (esp on the fly) will be rather hard as width and height then get swapped. So let's worry about those when we need to. We will need an additional flag for those cases anyway.

The camera rotation is something that is already needed, at least on some embedded devices, like those cellular phones whose display changes when you rotate the device. By looking at the v4l2_buffer struct, we currently have 4 reserved bytes. It also has one flags field, with several bits not used. I can see 2 possibilities to extend the API:

1) adding V4L2_BUF_FLAG_HFLIP and V4L2_BUF_FLAG_VFLIP flags. This would work for 90, 180 and 270 rotation;

HFLIP and VFLIP are only good for 0 and 180 degrees. 90 and 270 isn't the same as flipping. The problem I'm seeing here is that as people are using v4l2 for digital cameras instead of tv capture there is more and more meta-data available. Things like shutter speed, aperture, focus distance, and so on. Just look at all the EXIF data a digital camera provides. Four bytes and two flags are going to run out very quickly at this rate. It's a shame there are not 8 bytes left, as then they could be used for a pointer to an extended meta-data structure.

I think we have to distinguish between two separate types of data: fixed ('the sensor is mounted upside-down', or 'the sensor always requires a hflip/vflip') and dynamic ('the user pivoted the camera 270 degrees'). The first is static data and I think we can just reuse the existing HFLIP/VFLIP controls: just make them READONLY to tell libv4l that libv4l needs to do the flipping.
The second is dynamic data and should be passed through v4l2_buffer since this can change on a per-frame basis. In this case add two bits to the v4l2_buffer's flags field:

V4L2_BUF_FLAG_ROTATION_MSK	0x0c00
V4L2_BUF_FLAG_ROTATION_0	0x0000
V4L2_BUF_FLAG_ROTATION_90	0x0400
V4L2_BUF_FLAG_ROTATION_180	0x0800
V4L2_BUF_FLAG_ROTATION_270	0x0c00

No need to use the reserved field. This makes a lot more sense to me: static (or rarely changing) data does not belong in v4l2_buffer, that's what controls are for. And something dynamic like pivoting belongs in v4l2_buffer. This seems like a much cleaner API to me.

I agree that we have static and dynamic camera properties, and that we may want to have 2 API's for them. I disagree that the control API is the proper API to expose static properties; many existing applications will not handle this well.

Moreover, in the case we are discussing now, we have one type of data (sensor orientation) which can be both static and dynamic depending on the camera; having 2 API's for this is just plain silly and unnecessarily complicates things. If some camera property can be static in some cases and dynamic in others then we should just always treat it as dynamic. This way we will only have one code path to deal with instead of two (with very interesting interactions too: what if both API's say rotate 180 degrees, should we then not rotate at all?). This way lies madness.

My conclusion:
1) Since rotation can be dynamic, store it in the buffer flags
2) In the future we will most likely need an API to be able to query camera properties

Regards,

Hans
Re: Adding a control for Sensor Orientation
Hans Verkuil wrote:
Hans Verkuil wrote:
On Monday 16 February 2009 05:04:40 Trent Piepho wrote:
On Sun, 15 Feb 2009, Mauro Carvalho Chehab wrote:
On Sun, 15 Feb 2009 10:29:03 +0100 Hans de Goede hdego...@redhat.com wrote:

I think we should also be able to detect 90 and 270 degree rotations. Or at the very least prepare for it. It's a safe bet to assume that webcams will arrive that can detect portrait vs landscape orientation.

Handling those (esp on the fly) will be rather hard as width and height then get swapped. So let's worry about those when we need to. We will need an additional flag for those cases anyway.

The camera rotation is something that is already needed, at least on some embedded devices, like those cellular phones whose display changes when you rotate the device. By looking at the v4l2_buffer struct, we currently have 4 reserved bytes. It also has one flags field, with several bits not used. I can see 2 possibilities to extend the API:

1) adding V4L2_BUF_FLAG_HFLIP and V4L2_BUF_FLAG_VFLIP flags. This would work for 90, 180 and 270 rotation;

HFLIP and VFLIP are only good for 0 and 180 degrees. 90 and 270 isn't the same as flipping. The problem I'm seeing here is that as people are using v4l2 for digital cameras instead of tv capture there is more and more meta-data available. Things like shutter speed, aperture, focus distance, and so on. Just look at all the EXIF data a digital camera provides. Four bytes and two flags are going to run out very quickly at this rate. It's a shame there are not 8 bytes left, as then they could be used for a pointer to an extended meta-data structure.

I think we have to distinguish between two separate types of data: fixed ('the sensor is mounted upside-down', or 'the sensor always requires a hflip/vflip') and dynamic ('the user pivoted the camera 270 degrees').
The first is static data and I think we can just reuse the existing HFLIP/VFLIP controls: just make them READONLY to tell libv4l that libv4l needs to do the flipping. The second is dynamic data and should be passed through v4l2_buffer since this can change on a per-frame basis. In this case add two bits to the v4l2_buffer's flags field:

V4L2_BUF_FLAG_ROTATION_MSK	0x0c00
V4L2_BUF_FLAG_ROTATION_0	0x0000
V4L2_BUF_FLAG_ROTATION_90	0x0400
V4L2_BUF_FLAG_ROTATION_180	0x0800
V4L2_BUF_FLAG_ROTATION_270	0x0c00

No need to use the reserved field. This makes a lot more sense to me: static (or rarely changing) data does not belong in v4l2_buffer, that's what controls are for. And something dynamic like pivoting belongs in v4l2_buffer. This seems like a much cleaner API to me.

I agree that we have static and dynamic camera properties, and that we may want to have 2 API's for them. I disagree that the control API is the proper API to expose static properties; many existing applications will not handle this well.

??? And they will when exposed through v4l2_buffer? It's all new functionality, so that is a non-argument.

The point is that libv4l has to be able to detect and handle oddly mounted sensors. It can do that easily through the already existing HFLIP/VFLIP controls. It's a one-time check when the device is opened (does it have read-only H/VFLIP controls? If so, then libv4l knows it has to correct). Completely independent from that is the camera pivot: this is dynamic, and while by default libv4l may be called upon to handle this, it should also be possible for the application to disable this in libv4l.

You should definitely not mix pivoting information with sensor mount information: e.g. if you see the hflip and vflip bits set, does that mean that the sensor is mounted upside down? Or that the camera is pivoted 180 degrees? Those are two different things.
Moreover, in the case we are discussing now, we have one type of data (sensor orientation) which can be both static and dynamic depending on the camera; having 2 API's for this is just plain silly and unnecessarily complicates things. If some camera property can be static in some cases and dynamic in others then we should just always treat it as dynamic. This way we will only have one code path to deal with instead of two (with very interesting interactions too: what if both API's say rotate 180 degrees, should we then not rotate at all?). This way lies madness.

I strongly disagree. Yes, if both sensor mount info and pivot info are handled completely inside libv4l, then indeed it doesn't have to rotate at all. But the application probably still wants to know that the user rotated the camera 180 degrees, if only to be able to report this situation. And this is of course even more important for the 90 and 270 degree rotations (think handhelds).

My conclusion:
1) Since rotation can be dynamic, store it in the buffer flags

Ack. But rotation != sensor mount position.

2) In the future we will most likely need an API to be able to query camera properties

For sensor mount position we have them in the form
Re: Adding a control for Sensor Orientation
Mauro Carvalho Chehab wrote:
On Mon, 16 Feb 2009 10:44:03 +0100 Hans de Goede hdego...@redhat.com wrote:

I've discussed this with Laurent Pinchart (and other webcam driver authors) and the conclusion was that having a table of USB-IDs + DMI strings in the driver, designing an API to tell userspace the sensor is upside down, and having code for all this both in the driver and in userspace makes no sense. Esp. since such a table will probably be easier to update in userspace too. So the conclusion was to just put the entire table of cams with known upside-down mounted sensors in userspace. This is currently in libv4l and making many Philips webcam users happy (Philips has a tendency to mount the sensor upside down).

Are you saying that you have a table in libv4l for which cameras have sensors flipped?

Yes.

This is really ugly and proves that the api is broken. No userspace application or library should need to do any special hack based on usb id, driver name or querycap names.

Well, libv4l is already pretty full of cam specific knowledge in the form of decompression algorithms etc. Quirk tables like this are best kept in userspace, esp. when userspace is the only consumer of the information; why store information in the kernel if the kernel never uses it at all? Take a look at HAL quirks for suspend/resume, wireless on/off buttons, etc. for example.

In the case of flipping, the kernel should provide this info to userspace, at least for the cameras it knows are flipped (based on USB ID or any other method). In the case of DMI, it seems ok to let userspace use the kernel DMI support to read this info and detect whether the sensor was mounted flipped on a notebook, but for those cams where such info is known based on USB ID, we need to have an interface to read this information.
I can see some ways of doing it: 1) via a VIDIOC_QUERYCAP capabilities flag; 2) via VIDIOC_*CTRL read-only interfaces; 3) another ioctl for querying the webcam capabilities; 4) some info via a sysfs interface. IMO, the easier and more adequate way for this case is creating an enumerated control. Something like:

#define V4L2_CID_MOUNTED_ANGLE (V4L2_CID_CAMERA_CLASS_BASE + 17)
enum v4l2_mounted_angle {
	V4L2_CID_MOUNTED_ANGLE_0_DEGREES   = 0,
	V4L2_CID_MOUNTED_ANGLE_90_DEGREES  = 1,
	V4L2_CID_MOUNTED_ANGLE_180_DEGREES = 2,
	V4L2_CID_MOUNTED_ANGLE_270_DEGREES = 3,
	V4L2_CID_MOUNTED_ANGLE_VIA_DMI     = 4,
};

Here you are making things nice and inconsistent, so the information is in the kernel, except where it is not (the DMI case). If we move this into the kernel, we should move it *completely* into the kernel. I've discussed this with Laurent Pinchart, and it really makes the most sense to do this in userspace.

Userspace approach:
1. the table is in userspace, libv4l reads it directly, done.

Kernelspace approach:
1. add a (smaller) table to *each* driver (which the driver has zero use for);
2. add code to *each* driver to export this info;
3. add code to libv4l to read this.

You've just created a kernel round trip for no good reason at all, and added a significant amount of code to the kernel which could live in userspace just as well. The userspace approach is the KISS way. Also, it is far easier for people to upgrade libv4l than it is to upgrade a kernel. Given that this table will most likely change regularly, the ease of updating is another argument for doing this in userspace.

Also, can we please STOP coming up with new and novel ways of abusing the control API? The control API's purpose is for userspace to control v4l device settings. It is way overkill for things like communicating a few simple flags to userspace (and is a pain to use for things like that, both on the kernel and the userspace side).
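To make the userspace side of Hans's argument concrete, here is a minimal sketch of the kind of quirk table libv4l keeps. The struct, entries, and function name are illustrative, not the actual libv4lcontrol code; the example USB ID is hypothetical:

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative userspace quirk table: USB IDs of cams with known
 * sensor mount quirks, consulted by the library rather than the kernel. */
struct mount_quirk {
    uint16_t vendor;
    uint16_t product;
    int hflip;   /* sensor mounted mirrored */
    int vflip;   /* sensor mounted upside down */
};

static const struct mount_quirk mount_quirks[] = {
    /* hypothetical example entry: a cam with a 180-degree mounted sensor */
    { 0x0471, 0x0325, 1, 1 },
};

/* Returns 1 and fills *hflip / *vflip when the camera is in the table,
 * 0 otherwise. Updating this only requires a library (or config) update,
 * never a kernel upgrade. */
static int lookup_mount_quirk(uint16_t vendor, uint16_t product,
                              int *hflip, int *vflip)
{
    size_t i;

    for (i = 0; i < sizeof(mount_quirks) / sizeof(mount_quirks[0]); i++) {
        if (mount_quirks[i].vendor == vendor &&
            mount_quirks[i].product == product) {
            *hflip = mount_quirks[i].hflip;
            *vflip = mount_quirks[i].vflip;
            return 1;
        }
    }
    return 0;
}
```

Everything the kernel would gain from such a table is already available here with no round trip, which is the crux of the KISS argument above.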
Regards, Hans -- To unsubscribe from this list: send the line unsubscribe linux-media in the body of a message to majord...@vger.kernel.org More majordomo info at http://vger.kernel.org/majordomo-info.html
Re: Adding a control for Sensor Orientation
Mauro Carvalho Chehab wrote: On Mon, 16 Feb 2009 13:19:47 +0100 Hans de Goede hdego...@redhat.com wrote: Hans, Mauro Carvalho Chehab wrote: On Mon, 16 Feb 2009 10:44:03 +0100 Hans de Goede hdego...@redhat.com wrote:

I've discussed this with Laurent Pinchart (and other webcam driver authors) and the conclusion was that having a table of USB IDs + DMI strings in the driver, designing an API to tell userspace the sensor is upside down, and having code for all this both in the driver and in userspace makes no sense. Especially since such a table will probably be easier to update in userspace too. So the conclusion was to just put the entire table of cams with known upside-down mounted sensors in userspace. This is currently in libv4l and making many Philips webcam users happy (Philips has a tendency to mount the sensor upside down).

Are you saying that you have a table in libv4l of which cameras have their sensors flipped?

Yes.

This is really ugly and proves that the API is broken. No userspace application or library should need to do any special hack based on USB ID, driver name or querycap names.

Well, libv4l is already pretty full of cam-specific knowledge in the form of decompression algorithms etc.

That's bad. Not all applications use libv4l, unfortunately. It will take time to port all userspace apps to use it,

Quite a bit of work has been done there in F-10; all applications are ported except for kopete, and kopete is being worked on.

and maybe some driver authors will never accept libv4l.

So far all I've had contact with have been cheering about it. They are all very happy there is a solution for moving decompression out of kernelspace. Remember, this all started with moving decompression out of the kernel.

We are too late with the userspace library. This should have been released together with the first V4L2 API, in order to have broad acceptance.
As we do more and more stuff in libv4l, applications will have to use libv4l. For example, to work with a lot of the webcams supported by the new gspca, they need to either reimplement the decompression code for pac207, pac7311, spca501, spca505, spca507, spca561 and others I forget, or use libv4l. Also, I currently have patches pending to add support for software white balance correction for cams which need this.

Due to that, instead of having the info in just one place (the kernel), we will split this info across other places. This will lead to inconsistent support, depending on what app you're using.

Apps not using libv4l will in general not work with many, many cams. For example, quite a few cheaper UVC cams produce YUYV packed pixel format data; many apps cannot handle this.

Since this info is about the hardware characteristics, IMO the kernel driver should provide it.

That is something I agree with, and it was my first way of looking at this too, but Laurent managed to convince me that there is little use in having a table in kernelspace if all that is done with it is exporting it to userspace while kernelspace does nothing with it. That's just obfuscation and added (kernel!) code for no good reason. snip

I've discussed this with Laurent Pinchart, and it really makes the most sense to do this in userspace.

Userspace approach:
1. the table is in userspace, libv4l reads it directly, done.

Kernelspace approach:
1. add a (smaller) table to *each* driver (which the driver has zero use for);
2. add code to *each* driver to export this info;
3. add code to libv4l to read this.

You've just created a kernel round trip for no good reason at all, and added a significant amount of code to the kernel which could live in userspace just as well. The userspace approach is the KISS way. Also, it is far easier for people to upgrade libv4l than it is to upgrade a kernel.
Given that this table will most likely change regularly, the ease of updating is another argument for doing this in userspace.

I don't agree. Having a userspace library so closely bound to its kernelspace counterpart just increases overall support troubles. For example, consider adding support for camera FOO, whose sensor is mounted rotated 180 degrees, in the trivial case where the new cam is just a new USB ID for an existing driver/chipset.

The trivial case nowadays is that it is a UVC cam, which works by USB class, so no changes to the kernel are needed at all, unless the table of upside-down devices lives in the kernel. See why it is a bad idea to have this in the kernel?

With a combined userspace/kernelspace approach, you will need to upgrade both kernelspace AND userspace in order to properly support this cam.

Nope, only libv4l will need to be updated. Now one can argue that the table should move from libv4l to a config file; then only a config file would need to be updated.

This also means more work for distros, since libv4l would depend on the kernel version, and it would need to check, at runtime, for each driver-specific version, complaining
Re: libv4l2 library problem
Hans Verkuil wrote: Hi Hans, On Friday 13 February 2009 13:57:45 Hans Verkuil wrote: Hi Hans,

I've developed a converter for the HM12 format (produced by Conexant MPEG encoders as used in the ivtv and cx18 drivers). But libv4l2 has a problem in its implementation of v4l2_read: it assumes that the driver can always do streaming. However, that is not the case for some drivers, including cx18 and ivtv. These drivers only implement read() functionality and no streaming. Can you as a minimum modify libv4l2 so that it will check for this case? The best solution would be for libv4l2 to read HM12 from the driver and convert it on the fly. But currently it tries to convert HM12 by starting to stream, and that produces an error. This bug needs to be fixed first before I can contribute my HM12 converter.

My sincere apologies: I looked at the libv4l2 code again and it was clear that it did in fact test for this case. I retested my own code and everything seems to work as it should. So libv4l2 is fine, and I will prepare a tree tomorrow containing the HM12 support for libv4lconvert. Ok, sorry about this.

No problem, I didn't have time to look into this yet :)

Regards, Hans
Re: Adding a control for Sensor Orientation
kilg...@banach.math.auburn.edu wrote: On Sat, 14 Feb 2009, Hans Verkuil wrote: On Saturday 14 February 2009 22:55:39 Hans de Goede wrote: Adam Baker wrote: Hi all,

Hans Verkuil put forward a convincing argument that sensor orientation shouldn't be part of the buffer flags, as then it would be unavailable to clients that use read().

Yes, and this is a bogus argument; clients using read() also do not get things like timestamps, and vital information like which field is in the read buffer when dealing with interleaved sources. read() is a simple interface for simple applications.

Given that the only user of these flags will likely be libv4l, I *strongly* object to having this info in some control. It is not a control; it is per-frame (on some cams) information about how to interpret that frame. The buffer flags are a very logical place, *the* logical place even, for this! The fact that there is no way to transport metadata about a frame (flags, but also timestamp and field!) is a problem with the read() interface in general; in other words, read() is broken with regard to this. If people care, add some ioctl or something which users of read() can use to get the buffer metadata for the last read() buffer. Stuffing buffer metadata in a control (barf) because of read() brokenness is a very *bad* idea, and won't work in general due to synchronization problems.

Doing this as a control will be a pain to implement both at the driver level (see the discussion this is causing) and in libv4l. For libv4l this will basically mean polling the control. And hello, polling is lame and something from the 1980s. Please just make this a buffer flag.

OK, make it a buffer flag. I've got to agree that it makes more sense to do it that way.

Regards, Hans -- Hans Verkuil - video4linux developer - sponsored by TANDBERG

Let me take a moment to remind everyone what the problem is that brought this discussion up. Adam Baker and I are working on a driver for a certain camera.
Or, better stated, for a set of various cameras which all have the same USB Vendor:Product number. Various cameras which all have this ID have different capabilities and need different treatment of the frame data. The most particular problem is that some of the cameras require byte reversal of the frame data string, which would rotate the image 180 degrees around its center. Others of these cameras require reversal of the horizontal lines in the image (a vertical 180-degree flip of the image across a horizontal axis). The point is, one cannot tell from the Vendor:Product number which of these actions is required. However, one *is* able to tell immediately after the camera is initialized which of these actions is required; namely, one reads and parses the response to the first USB command sent to the camera.

So, for us (Adam and me) the question is simply how everyone will agree that the information about the image orientation can be sent from the module to V4L. When this issue is resolved, we can finish writing the sq905 camera driver. From this rather narrow point of view, the issue is not which method ought to be adopted; rather, the issue is that no method has been adopted. It is rather difficult to write module code which obeys a non-existent standard.

Ack, but the problem was later extended by the fact that it turns out some cams have a rotation detection (gravity direction) switch, which means you can flip the cam on its socket while streaming, and then the cam will tell you its rotation has changed. That makes this a per-frame property rather than a static property of the cam. This led to this discussion, but we (the two Hanses) agree now that using the flags field in the buffer struct is the best way forward. So there is a standard now: simply add 2 buffer flags to videodev2.h, one for content is h-flipped and one for content is v-flipped, and you are done.
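The agreed approach might look roughly like the sketch below. The flag names and bit values are hypothetical; nothing had been merged into videodev2.h at the time of this thread:

```c
#include <stdint.h>

/* Hypothetical flag bits for the proposal in the mail above; the real
 * names and values would be assigned in videodev2.h once merged. */
#define V4L2_BUF_FLAG_HFLIPPED 0x00010000u
#define V4L2_BUF_FLAG_VFLIPPED 0x00020000u

/* A consumer such as libv4l would inspect v4l2_buffer.flags after
 * VIDIOC_DQBUF; h-flip plus v-flip together equal a 180 degree rotation. */
static int frame_rotated_180(uint32_t buf_flags)
{
    return (buf_flags & V4L2_BUF_FLAG_HFLIPPED) &&
           (buf_flags & V4L2_BUF_FLAG_VFLIPPED) ? 1 : 0;
}
```

Because the flags travel with each dequeued buffer, a cam with a gravity switch can change them mid-stream without any extra API, which is exactly why the buffer flags won out over a control here.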
Regards, Hans
Re: Adding a control for Sensor Orientation
Trent Piepho wrote: On Sun, 15 Feb 2009, Hans de Goede wrote: Hans Verkuil wrote: On Sunday 15 February 2009 10:08:04 Hans de Goede wrote: kilg...@banach.math.auburn.edu wrote: On Sat, 14 Feb 2009, Hans Verkuil wrote: On Saturday 14 February 2009 22:55:39 Hans de Goede wrote: Adam Baker wrote:

OK, make it a buffer flag. I've got to agree that it makes more sense to do it that way.

The most particular problem is that some of the cameras require byte reversal of the frame data string, which would rotate the image 180 degrees around its center. Others of these cameras require reversal of the horizontal lines in the image (a vertical 180-degree flip of the image across a horizontal axis). The point is, one cannot tell from the Vendor:Product number which of these actions is required. However, one *is* able to tell immediately after the camera is initialized which of these actions is required; namely, one reads and parses the response to the first USB command sent to the camera.

Ack, but the problem was later extended by the fact that it turns out some cams have a rotation detection (gravity direction) switch, which means you can flip the cam on its socket while streaming, and then the cam will tell you its rotation has changed. That makes this a per-frame property rather than a static property of the cam. This led to this discussion, but we (the two Hanses) agree now that using the flags field in the buffer struct is the best way forward. So there is a standard now: simply add 2 buffer flags to videodev2.h, one for content is h-flipped and one for content is v-flipped, and you are done.

I think we should also be able to detect 90 and 270 degree rotations, or at the very least prepare for it. It's a safe bet to assume that webcams will arrive that can detect portrait vs landscape orientation.

Handling those (especially on the fly) will be rather hard, as width and height then get swapped. So let's worry about those when we need to.
We will need an additional flag for those cases anyway.

Why would you need to worry about width and height getting swapped? Metadata about the frame would indicate it's now in portrait mode vs landscape mode, but the dimensions would be unchanged.

Yes, unless of course you want to display a proper picture and not one on its side when the camera is rotated 90 degrees. So somewhere you need to rotate the picture 90 degrees, and the lower down in the stack you do that, the bigger the chance you do not need to duplicate the rotation code in every single app. However, the app will most likely become unhappy when you start pushing out frames with a changed width/height.

Regards, Hans
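The width/height complication Hans describes for 90 and 270 degree rotations can be shown in a few lines (a sketch for illustration; this helper does not come from any driver):

```c
/* For 90/270 degree rotations the frame dimensions swap, which is why
 * handling them on the fly is much harder than the 180 degree case:
 * the negotiated format itself changes under the application. */
static void rotated_dimensions(int width, int height, int degrees,
                               int *out_w, int *out_h)
{
    if (degrees == 90 || degrees == 270) {
        *out_w = height;  /* portrait <-> landscape swap */
        *out_h = width;
    } else {              /* 0 or 180 degrees: dimensions unchanged */
        *out_w = width;
        *out_h = height;
    }
}
```

A 640x480 stream becomes 480x640 the moment the cam reports a 90 degree rotation, so any app that cached the negotiated format would be handed frames it cannot interpret, which is the "unhappy app" problem above.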
Re: [PULL] http://www.linuxtv.org/hg/~hverkuil/v4l-dvb-lib
Hans Verkuil wrote: Hi Mauro, Hans, Pending an Ack from Hans de Goede, can you please pull from http://www.linuxtv.org/hg/~hverkuil/v4l-dvb-lib for the following:

- libv4l2util: rename from libv4l2 to prevent a clash with the libv4l2 conversion library
- v4l2-apps: move libraries around to make the directory tree flatter
- v4l2-apps: rename v4l2_sysfs_path to v4l2-sysfs-path
- v4l2-apps: rename capture_example to capture-example
- v4l-dvb: fix up .hgignore

These changes are fine by me.

Regards, Hans