Nikos Alexandris wrote:

> Attached herein is a slightly updated diff (from the last diff attached in
> #774) for i.rgb.his.
>
> I need time to learn/understand things (in C). Yet, I tested the "new"
> version and it works for me.
>
> I would like to push this version into SVN myself. Thus, I'd like to ask
> (on the grass-psc list) for commit access. Alternatively, I can just submit
> another updated diff in ticket #774 for someone to take over.

Moritz Lennert:

Please do that. Core svn access takes time, and in any case it is better
to have everything documented in the trac ticket.

Absolutely.


When I apply this patch I get some compiler warnings that need to be
fixed first.

Also, could you please document the bit_depth parameter in i.rgb.his.html?

One thing I don't understand about the results:

Using the NC Landsat images and running

i.rgb.his r=lsat7_2002_30 g=lsat7_2002_20 bl=lsat7_2002_10 h=h8 i=i8 s=s8 \
    bit_depth=8

and

i.rgb.his r=lsat7_2002_30 g=lsat7_2002_20 bl=lsat7_2002_10 h=h16 i=i16 s=s16 \
    bit_depth=16

I then get for intensity:

> r.info -r i8
min=0.08431373
max=1

> r.info -r i16
min=0.000328069
max=0.003891051

Is this expected, or is there an issue with scaling the 16-bit version
between 0 and 1?

Yes, expected: with bit_depth=16 the module divides by 65535 instead of
255, so feeding it 8-bit input scales the intensities down by a factor
of 255/65535 (0.08431373 * 255/65535 = 0.000328069 and
1 * 255/65535 = 0.003891051). The correct way to compare is:

# raw input is 8-bit
for BAND in 10 20 30; do
    echo "Band ${BAND}:" `r.info -r lsat7_2002_${BAND}`
done

Band 10: min=42 max=255
Band 20: min=28 max=255
Band 30: min=1 max=255

# thus, rescale to 16-bit
for BAND in 10 20 30; do
    r.rescale lsat7_2002_${BAND} from=0,255 \
        output=lsat7_2002_${BAND}_16 to=0,65535
done

# then convert
i.rgb.his r=lsat7_2002_30_16 g=lsat7_2002_20_16 bl=lsat7_2002_10_16 \
    h=h16 i=i16 s=s16 bit_depth=16

# now "input" will be
for BAND in 10 20 30; do echo "Band ${BAND}:" `r.info -r lsat7_2002_${BAND}_16` 
;done

Band 10: min=0 max=65535
Band 20: min=0 max=65535
Band 30: min=0 max=65535

# maybe the above is wrong -- should r.rescale use the real min instead of 0?

# then compare
for DIMENSION in h s i; do
    echo "${DIMENSION}  8:" `r.info -r ${DIMENSION}8`
    echo "${DIMENSION} 16:" `r.info -r ${DIMENSION}16`
done

h 8: min=0 max=359.434
h 16: min=0 max=359.434
s 8: min=0 max=1
s 16: min=0 max=1
i 8: min=0.08431373 max=1
i 16: min=0.08431373 max=1


Several points to discuss/answer:

- Autodetect the input's bit depth? (See the first sketch after this
 list.)

- Still let the user control the input bit depth -- we could even allow
 `bit_depth=11` or `bit_depth=12`. Makes sense?

- Converting should fail, and report why, if the user tries to "fake" an
 8-bit (or any bitness!) image as 16-bit, i.e. declares a bit_depth
 higher than what the input really holds

- A test for the half-open range, i.e. [0, 2^bit_depth), is required
 too: inputs whose values reach 2^bit_depth (the closed range
 [0, 2^bit_depth]) should fail. Makes sense? (Also in the first sketch
 after this list.)

- Still to do: set hue to NULL if chroma = 0 (see the second sketch
 after this list)

- More?
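
For the autodetection and the range/bitness checks, here is what I have
in mind -- a minimal sketch only, assuming the GRASS 7 raster API
(G_find_raster2, Rast_read_range, Rast_get_range_min_max) and integer
(CELL) inputs; check_bit_depth is a hypothetical helper, not code from
the attached diff, and whether the "declared bit_depth too high" case
should warn or fail is exactly the open question:

#include <grass/gis.h>
#include <grass/raster.h>
#include <grass/glocale.h>

static void check_bit_depth(const char *name, int bit_depth)
{
    const char *mapset = G_find_raster2(name, "");
    struct Range range;
    CELL min, max, limit = ((CELL)1 << bit_depth) - 1;
    int detected = 1;

    if (!mapset)
        G_fatal_error(_("Raster map <%s> not found"), name);
    if (Rast_read_range(name, mapset, &range) < 0)
        G_fatal_error(_("Unable to read range of raster map <%s>"), name);
    Rast_get_range_min_max(&range, &min, &max);

    /* values must stay inside the half-open range [0, 2^bit_depth) */
    if (min < 0 || max > limit)
        G_fatal_error(_("Raster map <%s> ranges in [%d, %d], "
                        "outside [0, %d]"), name, min, max, limit);

    /* "autodetection": smallest depth that still holds the maximum */
    while (max >= ((CELL)1 << detected))
        detected++;

    /* e.g. 8-bit data declared as bit_depth=16; a genuinely dark
     * image would also trip this, hence a warning, not an error */
    if (detected < bit_depth)
        G_warning(_("Raster map <%s> fits in %d bits; bit_depth=%d "
                    "looks too high"), name, detected, bit_depth);
}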
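
And for the chroma = 0 case, the per-pixel conversion could write NULL
before computing hue -- again just a sketch with illustrative names,
not the actual rgb2his code:

#include <grass/raster.h>

/* If chroma (max - min of the RGB triplet) is zero, the pixel is grey
 * and hue is undefined: write NULL instead of an arbitrary value. */
static void hue_or_null(DCELL r, DCELL g, DCELL b, DCELL *hue)
{
    DCELL max = r, min = r;

    if (g > max) max = g;
    if (b > max) max = b;
    if (g < min) min = g;
    if (b < min) min = b;

    if (max == min) {
        Rast_set_d_null_value(hue, 1);
        return;
    }

    /* ... otherwise compute hue in degrees as usual ... */
}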

Nikos