Thanks Rod,

I tried a few compression utilities and this one seemed the best behaved (ahem)
and most flexible, as I have to draw up to 800 files into the one zip. Any
alternatives?

I/O exception: do you mean it may be tripping up on gathering and writing
the files? Nothing appears in any of the OS syslogs.
...

How do I catch the stack dump?
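One way to catch it, sketched below under the assumption the zip work runs on the JVM: wrap the compression call in a try/catch and convert the exception's stack trace to a string so it can be written to a log. The class and helper names here are hypothetical, and the IOException is simulated just to show the capture.

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class StackDumpDemo {
    // Hypothetical helper: render any exception's stack trace as a string
    // so it can be appended to an application log file.
    static String stackTraceToString(Throwable t) {
        StringWriter sw = new StringWriter();
        t.printStackTrace(new PrintWriter(sw, true));
        return sw.toString();
    }

    public static void main(String[] args) {
        try {
            // Stand-in for the real compression call that is failing.
            throw new java.io.IOException("simulated zip failure");
        } catch (java.io.IOException e) {
            String dump = stackTraceToString(e);
            // The captured dump starts with the exception type and message.
            System.out.println(dump.startsWith("java.io.IOException"));
        }
    }
}
```

In practice you would write the captured string to a log file rather than stdout, so the dump survives even if the server restarts.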

Since I have 2 GB of physical memory hanging around, and the major task for
this server is this compression function, can/should I allocate more to this
'stream buffer'?
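For what the 'stream buffer' knob actually looks like, here is a minimal sketch using the standard java.util.zip API, assuming the utility works roughly this way internally. The 1 MB buffer size and file names are illustrative assumptions, not recommendations.

```java
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class BufferedZipSketch {
    public static void main(String[] args) throws IOException {
        File zipFile = File.createTempFile("demo", ".zip");
        byte[] payload = new byte[35 * 1024]; // stand-in for one ~35 KB source file

        // The second argument to BufferedOutputStream is the buffer size --
        // this is the kind of setting being asked about (1 MB here, assumed).
        try (ZipOutputStream zos = new ZipOutputStream(
                new BufferedOutputStream(new FileOutputStream(zipFile), 1024 * 1024))) {
            zos.putNextEntry(new ZipEntry("file0001.dat"));
            zos.write(payload);
            zos.closeEntry();
        }

        // The archive was written and is non-empty.
        System.out.println(zipFile.length() > 0);
        zipFile.delete();
    }
}
```

Note the buffer only smooths disk writes; it would not normally cause or fix an unhandled I/O exception, which is why the stack dump matters more.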

Here is the cfstat output, if it sheds any light:

The two ~7k times are from each time the compression function is called to zip
750 files into a 20 MB zip; each time I tried it, of course, it didn't fail.
Pretty quiet out there on the site, huh?

Pg/Sec  DB/Sec  CP/Sec  Reqs  Reqs  Reqs  AvgQ   AvgReq AvgDB  Bytes  Bytes
Now Hi  Now Hi  Now Hi  Q'ed  Run'g TO'ed Time   Time   Time   In/Sec Out/Sec
0   0   0   0   -1  -1  0     0     0     0      89     3      0      0
0   0   0   0   -1  -1  0     0     0     0      89     3      0      0
0   0   0   0   -1  -1  0     0     0     0      43     5      202    1346
0   0   0   0   -1  -1  0     0     0     0      54     16     112    4899
0   0   0   0   -1  -1  0     0     0     0      54     16     0      0
0   0   0   0   -1  -1  0     0     0     0      44     10     202    957
0   0   0   0   -1  -1  0     0     0     0      24     10     100    0
0   0   0   0   -1  -1  0     1     0     0      16     83     228    51
0   0   0   0   -1  -1  0     1     0     0      16     83     0      141
0   0   0   0   -1  -1  0     0     0     0      7969   83     0      157
0   0   0   0   -1  -1  0     0     0     0      7969   83     0      0
0   0   0   0   -1  -1  0     0     0     0      7969   83     0      0
0   0   0   0   -1  -1  0     0     0     0      7969   83     0      0
0   0   0   0   -1  -1  0     0     0     0      7969   83     0      0
0   0   0   0   -1  -1  0     0     0     0      7969   83     0      0
0   0   0   0   -1  -1  0     0     0     0      7969   83     0      0
0   0   0   0   -1  -1  0     0     0     0      3990   42     0      0
0   0   0   0   -1  -1  0     0     0     0      3990   42     0      0
0   0   0   0   -1  -1  0     1     0     0      3990   48     129    116
0   0   0   0   -1  -1  0     0     0     0      7951   48     0      233
0   0   0   0   -1  -1  0     0     0     0      7951   48     0      0
0   0   0   0   -1  -1  0     0     0     0      7951   48     0      0

Cheers
Sean


"Rod Higgins" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
>
> > "er Error
> > The server encountered a server error and was unable to complete your re
> > your." (sic)
> > Nothing except the restart is recorded in the logs.
> >
> > The jcompress sometimes has to zip 750 x 35 KB files, which can exceed 60
> > seconds, but request time-out is not set on.
> > The server is a Sunfire V210 with 2 GB of memory on Solaris 9
> > JVM mem max heap 512k
> >
> > Where do I start?
>
> Change to a compression utility that handles exceptions more elegantly. I
> would be surprised if it is a memory size issue; the compression code
> would be using buffered streams set to a certain size (the default 512K
> most likely). Seems more like an IO exception not being handled correctly.
> Any chance of a stack dump on the error?
>
>



---
You are currently subscribed to cfaussie as: [email protected]
To unsubscribe send a blank email to [EMAIL PROTECTED]
Aussie Macromedia Developers: http://lists.daemon.com.au/