have you tried turning on gzip compression?  that should produce similar bandwidth
savings to stripping out extra carriage returns and double spaces.
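
if you're on a coyote http/1.1 connector, tomcat can also do the gzip for you
in server.xml - rough sketch only, the attribute names below are from the
tomcat 5 connector docs, so double check them against your version:

<Connector port="8080"
           compression="on"
           compressionMinSize="2048"
           compressableMimeType="text/html,text/css,text/javascript"/>

browsers that don't send an Accept-Encoding: gzip header still get the
uncompressed page.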
 
you could always use the jasper plugin architecture to strip out excess stuff
 
peter lin


John Sidney-Woollett <[EMAIL PROTECTED]> wrote:
Hi

We want to achieve a 10-15% data reduction of the HTML being served by our
webserver (generated by JSP pages). This would reduce our bandwidth charges
from our ISP...

We can achieve this by simply removing all the "\n\r" and "\t" characters
and replacing repeated occurrences of "  " (double space) with " " (single
space). But we don't want to do this in our source JSP files as they would
become unmaintainable/unreadable.

e.g.


Column 1
Column 2



(69 characters)

becomes

Column 1Column 2

(57 characters), that's a 17% saving for that text block...
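
For what it's worth, the transformation we have in mind is just two
String.replaceAll calls (JDK 1.4) - a rough sketch, and the table markup
below is only a stand-in:

public class TrimDemo {
    public static void main(String[] args) {
        String html = "<tr>\r\n  <td>Column 1</td>\r\n  <td>Column 2</td>\r\n</tr>\r\n";
        String trimmed = html.replaceAll("[\\r\\n\\t]", "")  // drop CR/LF/tab
                             .replaceAll(" {2,}", " ");      // collapse runs of spaces
        System.out.println(html.length() + " -> " + trimmed.length());
    }
}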

I know that we could:

i) write/implement a filter to process the output stream (a rough filter
sketch follows this list) - BUT we use OSCache (www.opensymphony.com) to
cache (included) JSP pages, and we don't want to reprocess cached data using
another filter.

ii) use a script to transform or preprocess our JSP pages before they are
deployed - simple, but it may break code (between the dev and live systems)
or have maintenance implications?

iii) create a tag library to process a text block (or another JSP), BUT
we've heard a rumour that taglibs can be inefficient (is that true?) - a
minimal tag sketch also follows this list.
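
Here's roughly what we had in mind for (i) - an untested sketch, the class
and filter names are made up, servlet 2.3 style:

import java.io.CharArrayWriter;
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpServletResponseWrapper;

// buffers the response, collapses whitespace, then writes the shorter
// version to the real response
public class WhitespaceFilter implements Filter {

    public void init(FilterConfig config) {}
    public void destroy() {}

    public void doFilter(ServletRequest request, ServletResponse response,
                         FilterChain chain) throws IOException, ServletException {
        CharResponseWrapper wrapper =
            new CharResponseWrapper((HttpServletResponse) response);
        chain.doFilter(request, wrapper);

        // drop CR/LF/tab and collapse runs of spaces
        // (note: this also mangles <pre> blocks and inline javascript)
        String html = wrapper.toString();
        String trimmed = html.replaceAll("[\\r\\n\\t]", "")
                             .replaceAll(" {2,}", " ");

        // character count == byte count only for single-byte encodings
        response.setContentLength(trimmed.length());
        PrintWriter out = response.getWriter();
        out.write(trimmed);
        out.flush();
    }

    // captures everything the JSP writes via getWriter()
    static class CharResponseWrapper extends HttpServletResponseWrapper {
        private final CharArrayWriter buffer = new CharArrayWriter();
        private final PrintWriter writer = new PrintWriter(buffer);

        CharResponseWrapper(HttpServletResponse response) {
            super(response);
        }

        public PrintWriter getWriter() {
            return writer;
        }

        public String toString() {
            writer.flush();
            return buffer.toString();
        }
    }
}

It would be mapped in web.xml to the *.jsp url-pattern (or just the pages we
want trimmed), which is why we're worried about it re-processing pages that
OSCache has already cached.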
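
And for (iii), the tag class itself would only be a few lines - again an
untested sketch, the tag/class names are made up, and it would still need a
TLD entry plus a taglib directive in the JSP:

import java.io.IOException;
import javax.servlet.jsp.JspException;
import javax.servlet.jsp.tagext.BodyTagSupport;

// evaluates its body, then writes it back out with CR/LF/tab removed
// and runs of spaces collapsed
public class TrimTag extends BodyTagSupport {

    public int doAfterBody() throws JspException {
        String body = getBodyContent().getString();
        String trimmed = body.replaceAll("[\\r\\n\\t]", "")
                             .replaceAll(" {2,}", " ");
        try {
            getBodyContent().getEnclosingWriter().write(trimmed);
        } catch (IOException e) {
            throw new JspException(e.getMessage());
        }
        return SKIP_BODY;
    }
}

Whether the per-invocation overhead matters probably depends on how many
blocks per page end up wrapped in the tag.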

Question: is it possible to use a directive in a JSP page to force the
compiler to remove these characters to achieve our desired data reduction?

Are there any other techniques or solutions that anyone else is using?

Thanks

John Sidney-Woollett
