Hello Jim,
you have provided so much valuable information here that I doubt I have ever 
found it all in one place before. THANK YOU.

Let me come to some specifics:
> What is the image format you are using in the TIFF file and what are the
>parameters of the *Models it uses to represent it in memory? (Before
> you start converting it.)
I am actually using a TIFF file (Group 4 fax encoding, 1 bpp) that I saved with 
ImageIO.write in the first place.
Reading this file back (with ImageIO.read) produces the following image: 
"bufferedim...@19f1bac: type = 12 IndexColorModel: #pixelBits = 1 numComponents 
= 3 color space = java.awt.color.icc_colorsp...@d1e832 transparency = 1 
transIndex   = -1 has alpha = false isAlphaPre = false BytePackedRaster: width 
= 2528 height = 3188 #channels 1 xOff = 0 yOff = 0"
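For reference, the "type = 12" in that dump is BufferedImage.TYPE_BYTE_BINARY. A quick in-memory check (just a sketch, not tied to your TIFF file) reproduces the same layout: a 1-bit IndexColorModel with three color components and no alpha:

```java
import java.awt.image.BufferedImage;
import java.awt.image.IndexColorModel;

public class BinaryImageInfo {
    public static void main(String[] args) {
        // TYPE_BYTE_BINARY (type = 12) is what a 1bpp Group 4 TIFF typically
        // decodes to: a byte-packed raster behind a 2-entry IndexColorModel.
        BufferedImage img =
                new BufferedImage(2528, 3188, BufferedImage.TYPE_BYTE_BINARY);
        System.out.println("type = " + img.getType());                // type = 12
        System.out.println(img.getColorModel() instanceof IndexColorModel); // true
        System.out.println("pixelBits = "
                + img.getColorModel().getPixelSize());                // pixelBits = 1
    }
}
```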


Concerning the code optimization you suggested:
You were absolutely correct: the fastest and most efficient way is to use 
getPixels() on the source raster and setDataElements() on the destination 
raster (with a new BufferedImage of TYPE_INT_RGB). This directly produces a 
manageable image very quickly. I read and write a single line at a time, so I 
reuse the same int[] buffer for both reading and writing (better still, the 
black pixels are copied as they are; I don't even have to change those buffer 
values). The drawImage method always takes about 50% longer.
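For anyone following along, here is a minimal sketch of that row-by-row approach. The class and method names (BinaryToRgb, toRgb) are illustrative, not any official API; it assumes the usual TYPE_BYTE_BINARY palette where index 0 is black:

```java
import java.awt.image.BufferedImage;
import java.awt.image.Raster;
import java.awt.image.WritableRaster;

public class BinaryToRgb {
    // Convert a 1bpp binary image to TYPE_INT_RGB one scanline at a time,
    // reusing a single int[] buffer for both the read and the write.
    public static BufferedImage toRgb(BufferedImage src) {
        int w = src.getWidth(), h = src.getHeight();
        BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Raster srcRaster = src.getRaster();
        WritableRaster dstRaster = dst.getRaster();
        int[] row = new int[w]; // one scanline, reused for every row
        for (int y = 0; y < h; y++) {
            srcRaster.getPixels(0, y, w, 1, row); // yields 0 or 1 per pixel
            for (int x = 0; x < w; x++) {
                // black (0) is already the packed RGB for black, so it is
                // copied untouched; only white (1) must become 0xFFFFFF
                if (row[x] != 0) row[x] = 0xFFFFFF;
            }
            dstRaster.setDataElements(0, y, w, 1, row);
        }
        return dst;
    }

    public static void main(String[] args) {
        BufferedImage src =
                new BufferedImage(4, 2, BufferedImage.TYPE_BYTE_BINARY);
        src.setRGB(1, 0, 0xFFFFFF); // one white pixel
        BufferedImage out = toRgb(src);
        System.out.println(Integer.toHexString(out.getRGB(1, 0) & 0xFFFFFF));
        System.out.println(Integer.toHexString(out.getRGB(0, 0) & 0xFFFFFF));
    }
}
```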

> - let us know what image formats you are seeing in the loaded image and
> which Reader is loading it so we can consider adding loops that deal
> with that format more directly (or modifying our "format detection" code
> to recognize it if there is something amiss with our classifications).
> We do already have support for a fair number of "binary" image formats
> so I'm curious as to what the exact specifics of the format are and why
> they fail to trigger our existing optimized binary format loops. This
> should probably be done through the bug tracking system to keep records
> of the issue...
If you bear with me a little more, I would need your advice and clarification.
Please read this scenario:
What I do in my app is grab images from a scanner (mostly binary images, but 
also gray and RGB), save them in a TIFF file, and then display them one by 
one. Since scanning is slow (well, 50 pages per minute maximum for now), I do 
have time while the scanner is doing its job to do many things (e.g. 
pre-optimize images for later display). Your comment above seems to indicate 
that you have optimized loops for reading binary images directly into 
compatible BufferedImages. Is that correct? 
If yes, then my single most important question would be:
What is the method to read a binary image from a file quickly and end up with 
a compatible BufferedImage in memory? My tests indicate that saving an RGB 
BufferedImage in the BMP format will do the trick, but here is the catch:
Suppose a binary image of 3000x3000 pixels: in a TIFF file it will take 50K on 
disk and 40ms to read back (plus another 200ms to convert to a compatible 
image in memory). If the same binary image is pre-converted to RGB and saved 
as a BMP, it takes about 30MB on disk and about 500ms to reload into an 
already compatible BufferedImage, so the whole thing ends up no faster (due to 
the heavy disk I/O).
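(Those disk-size figures follow directly from the pixel formats: 1 bit per pixel packed, before Group 4 compression even kicks in, versus 3 bytes per pixel in an uncompressed 24-bit BMP. A back-of-the-envelope check with the numbers above:)

```java
public class SizeMath {
    public static void main(String[] args) {
        long w = 3000, h = 3000;
        // 1bpp packed raster: 1 bit per pixel, before G4 compression
        long packed1bpp = w * h / 8;
        // 24-bit RGB BMP payload: 3 bytes per pixel, plus a small header
        long rgb24bpp = w * h * 3;
        System.out.println(packed1bpp); // 1125000  (~1.1 MB raw; G4 gets it to ~50K)
        System.out.println(rgb24bpp);   // 27000000 (~27 MB, i.e. "about 30MB" on disk)
    }
}
```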

So, instead of trying to find the best way to convert, it would be better to 
know what is already optimized and how to use it! I understand that in the 
general case the average JDK developer does not need to know what you optimize 
and what you don't, but it would be a great idea to publicize this info (even 
unofficially; I understand that you wouldn't want to write it in stone).

Costas
[Message sent by forum member 'csterg' (csterg)]

http://forums.java.net/jive/thread.jspa?messageID=349916

===========================================================================
To unsubscribe, send email to lists...@java.sun.com and include in the body
of the message "signoff JAVA2D-INTEREST".  For general help, send email to
lists...@java.sun.com and include in the body of the message "help".
