Hi Yulia,

I, too, am looking at HDR measurements of artificial light sources in the hope of measuring glare. The results I've been getting are rather inconsistent, so I've been doing some googling and reading.
> I was able to figure out luminance values for a single LED, which can be
> compared to the ones from HDR images. But I have a couple of
> questions/concerns on HDRI technique and Photosphere.
> At first, I used a "regular" scene to retrieve the response curve of the
> camera (large smooth gradients with very dark and bright areas, and
> reflectance standards for the absolute calibration).
>
> Camera: EOS T1i Rebel with 28-105mm lens, at 28mm
> Calibrated at the grey reflectance sample: 186.45 cd/m2
> CF = 0.957
>
> Then I use this RC to analyze the HDRI of a captured LED. The value is
> 230,000 cd/m2 for a single LED, which is low (it has to be around
> 7*10^6 cd/m2). So it underestimates the luminance.

McCann and Rizzi have done some quite comprehensive research into the dynamic range of cameras and how it is limited by veiling glare. They have published an HDR book, "The Art and Science of HDR Imaging" (Wiley, 2011), and many of their papers are available at
http://web.mac.com/mccanns/HDR/Glare_Limits_HDRI.html

To whet your appetite, I recommend
http://web.mac.com/mccanns/HDR/Glare_Limits_HDRI_files/07EI%206492-41_1.pdf

Their conclusion is that accurate HDR luminance measurements are not actually possible, because veiling glare generated within the lens limits the dynamic range of the optical system. Apparently, there is actually an ISO standard (9358:1994) that comes to the same conclusion: the higher the dynamic range of the scene, the more inaccurate the HDR measurement.

You will be aware of the 'flare removal' option in hdrgen (the -f switch). I'm not entirely sure where 'flare' sits between 'point spread function' and 'veiling glare', but I believe it to be closer to the former. The PSF is a function of any optical system that causes the image to become 'smudged' out.
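To make the effect concrete, here is a toy numpy sketch of my own (all numbers invented, nothing to do with any real lens or with hdrgen's -f correction): an ideal scene is blurred by a narrow PSF, and a tiny uniform "veiling glare" pedestal spreads a fraction of the total flux across the whole frame. The pedestal alone is enough to lift the black level and cap the measurable dynamic range well below the true one.

```python
import numpy as np

# Toy 1-D "scene": dark background at 1 cd/m^2 with one bright
# source at 10^7 cd/m^2 (values invented for illustration).
scene = np.full(101, 1.0)
scene[50] = 1e7

# Idealised PSF: narrow Gaussian kernel, normalised to sum to 1.
x = np.arange(-50, 51)
psf = np.exp(-x**2 / 2.0)
psf /= psf.sum()

# Veiling glare modelled (crudely) as a scene-dependent pedestal:
# a small fraction of ALL the light is scattered evenly everywhere.
glare_fraction = 1e-4   # 0.01 % of total flux, an invented figure
image = np.convolve(scene, psf, mode="same")
image = (1 - glare_fraction) * image + glare_fraction * scene.sum() / scene.size

true_dr = scene.max() / scene.min()   # 10^7 : 1
meas_dr = image.max() / image.min()   # far smaller: the glare floor dominates
print(f"true dynamic range:     {true_dr:.1e}")
print(f"measured dynamic range: {meas_dr:.1e}")
```

Note that the pedestal depends on scene.sum(): put a brighter source in the scene and the dark pixels read higher, which is exactly why veiling glare cannot be calibrated away the way a fixed PSF can.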
Back a few years ago, when the megapixel race was in full swing, many observers correctly stated that the lenses on cheap digital cameras can't actually provide a resolution that would justify, say, 12 MP on a digital snapshot camera. This is the PSF they were talking about--it's how a pixel affects the neighbouring pixels. While the PSF can be estimated (or even calculated, given enough information about the lenses and their optical properties? Not sure...), veiling glare, on the other hand, cannot, because it depends on the scene. Every 'pixel' of the scene affects every pixel of the image. It's even worse than that: even scene objects outside the field of view of the optical system have an impact on the sensor image.

Hoefflinger (Ed.), "High-Dynamic-Range (HDR) Vision" (Springer, 2007), has an entire section dedicated to HDR lenses. While true HDR low-res (video) cameras are actually becoming commercially available (they have a logarithmic response, with a dynamic range far exceeding that of human vision), the problem is that they require special HDR lenses that have to be carefully designed to minimise veiling glare. Digital camera lenses (even pro-level DSLR ones) are not optimised for this.

So there is nothing wrong with the camera calibration that you carried out with an LDR scene. This is how it should be done. The problem you're facing is not specific to Photosphere or the Mitsunaga RSP recovery algorithm. The RSP is not compressed at the upper end--it's just physics that you're up against.

> It seems like the calibration point is critical here. I've decided to try
> to capture a different scene for deriving the RC with a wider range. It
> would make sense that the camera has to see higher luminance values in
> order to accurately measure them later. The dynamic range has to cover the
> measured values.
>
> 1. How does Photosphere deal with/approximate/calculate the upper end of
> the curve? I assume it gives more weight to mid-tone values?
> But what happens with high luminance values?
>
> 2. I assumed that when the CF is applied, it does not change all values
> equally, but proportionally to the RC (since it is not linear). Why does
> it do it equally for the whole range?
>
> Lsun = 80*10^6 cd/m2, and of course the CF is very big: 391.
>
> 3. Does Photosphere compress the response curve, so that at the upper end
> all values above a certain threshold will have the same number?
>
> 4. Any additional suggestions on properly obtaining and calibrating HDRI
> for this purpose?

I'm afraid you have to lower your expectations with regard to the achievable accuracy when it comes to HDR scenes that include bright light sources.

Light modulation ('flicker') is another problem with HDR measurements of electric light sources. Unless you are certain that your light source is driven by an HF driver or ballast, I recommend you actually measure the modulation of the light source. If the LEDs are mains-driven, they will flicker at 100 or 120 Hz, depending on your mains frequency. If the modulation factor is high, e.g. if the LEDs effectively switch on and off at this frequency, HDR measurements at short exposure times will be unpredictable.

You can test this by taking a number of photographs of the same scene (with the light source in it) at short exposure times. There is no need to go HDR. If all images have the same overall 'brightness', you're all right. If the images are noticeably different, you've got yet another problem.

Cheers

Axel

_______________________________________________
HDRI mailing list
[email protected]
http://www.radiance-online.org/mailman/listinfo/hdri
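P.S. The burst test above can be automated once the frames are loaded as pixel arrays. This is a sketch of my own (the flicker_check helper, the 5 % tolerance, and the synthetic frames are all invented, not part of Photosphere or hdrgen): it compares the per-frame mean brightness across a burst of identical short exposures, which should be constant for a steady source and scattered for a modulated one.

```python
import numpy as np

def flicker_check(frames, tolerance=0.05):
    """Given a burst of short-exposure frames (2-D arrays of pixel
    values), return the relative spread of the per-frame mean
    brightness and whether it exceeds `tolerance`.  A large spread
    suggests the source is modulated faster than the exposure time."""
    means = np.array([f.mean() for f in frames])
    spread = (means.max() - means.min()) / means.mean()
    return spread, spread > tolerance

# Synthetic illustration: a steady source vs. a 100 % modulated one
# whose short exposures alternately catch the on and off half-cycle.
steady  = [np.full((4, 4), 100.0) for _ in range(10)]
flicker = [np.full((4, 4), 200.0 * (i % 2)) for i in range(10)]

print(flicker_check(steady))   # tiny spread  -> source is fine
print(flicker_check(flicker))  # large spread -> flicker problem
```

In practice you would fill `frames` from your actual photographs (e.g. decoded raw or TIFF data); the comparison only needs relative brightness, so an uncalibrated camera is sufficient for this check.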
