Re: Memory consumption

2003-09-19 Thread J.Pietschmann
Ben Galbraith wrote:
If you like, I can start 
gathering metrics from the JVM itself.  I don't own a commercial 
profiling utility, otherwise I'd run that.
Try
 http://simpleprofiler.sourceforge.net/DrMem/
There's still the possibility of a leak in the Java RTL itself.
Which JDK are you using?
J.Pietschmann


Re: Memory consumption

2003-09-19 Thread Ben Galbraith
J.Pietschmann wrote:
Try
 http://simpleprofiler.sourceforge.net/DrMem/
There's still the possibility of a leak in the Java RTL itself.
Which JDK are you using?
I'm using JDK 1.4.2 (or 1.4.2_01; I'll have to check).  I just did a 
heap dump (-Xhprof) and found that two byte[] arrays are killing my 
heap, both created by JAI.  I have to run now, but tomorrow I'll have 
the full results of my investigation.

Thanks for the link; I'll check it out.
Ben
J.Pietschmann


RE: Memory consumption

2003-09-19 Thread Andreas L. Delmelle
 -Original Message-
 From: Ben Galbraith [mailto:[EMAIL PROTECTED]

 I'm using JDK 1.4.2 (or 1.4.2_01; I'll have to check).  I just did a
 heap dump (-Xhprof) and found that two byte[] arrays are killing my
 heap, both created by JAI.  I have to run now, but tomorrow I'll have
 the full results of my investigation.


Guys,

Since I had a bit of spare time, I have been using it to do a little
profiling. I must admit, though, that I had never used a profiler before (-but
since you're convinced this isn't that hard, I thought I might as well have
a go :) Sure could do with the exercise. Downloaded some tool called JProfiler...
Pretty neat, apart from the fluffy-XP-WinToys look -- Aqua forever!! :) )
So, although most of what I'm seeing and reading makes perfect sense to me,
I'm a little stuck interpreting these results.

So, at the risk of uttering a few silly remarks, a few observations (as
revealed by a heap snapshot taken _after_:
- the rendering of a single document - the 'extensive.fo' example that comes
with FOP, since SVG was mentioned and this example contains an embedded SVG;
- tests with the embedded SVG example, as well as with slight modifications
of the ExampleFO2PDF program performing the same rendering (embedded.svg)
15 times and 150 times - I put the loop in the main function; a rough sketch
of that loop follows right after this list )
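
( For concreteness, a rough sketch of the kind of test loop just described:
the same FO document rendered N times through the FOP 0.20.x Driver API quoted
elsewhere in this thread, with the used heap printed before and after an
explicit GC. The class name and file paths are illustrative, not the actual
modified ExampleFO2PDF. )

  import java.io.File;
  import java.io.FileInputStream;
  import java.io.FileOutputStream;
  import org.apache.fop.apps.Driver;
  import org.xml.sax.InputSource;

  public class RepeatRenderTest {
      public static void main(String[] args) throws Exception {
          int runs = Integer.parseInt(args[0]);             // e.g. 15 or 150
          File foFile = new File(args[1]);                  // e.g. embedded.fo
          for (int i = 0; i < runs; i++) {
              FileInputStream in = new FileInputStream(foFile);
              FileOutputStream out = new FileOutputStream("run-" + i + ".pdf");
              Driver driver = new Driver(new InputSource(in), out);
              driver.setRenderer(Driver.RENDER_PDF);
              driver.run();
              in.close();
              out.close();
          }
          printUsedHeap("before explicit GC");
          System.gc();
          printUsedHeap("after explicit GC");
      }

      private static void printUsedHeap(String label) {
          Runtime rt = Runtime.getRuntime();
          long usedKB = (rt.totalMemory() - rt.freeMemory()) / 1024;
          System.out.println(label + ": " + usedKB + " KB used");
      }
  }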

- the charts:

1 Run Before GC - 37671 objects in 1096 classes - 26442 arrays
15 Runs Before GC - 44340 objects in 1109 classes - 29029 arrays
150 Runs Before GC - 43149 objects in 1109 classes - 28155 arrays

1 Run After GC - 29962 objects in 993 classes - 21542 arrays

(results after gc are similar with 15 runs or 150 runs, in absolute terms)

In all cases (1 run / 15 runs / 150 runs ):

TOP 5 in instances are char[], java.lang.String, int[],
java.util.HashMap$Entry and org.apache.xerces.xni.QName ( the first two are
logically tied together because a String is actually an object itself with a
value field of type char[], and the fifth because it has two fields - rawname
and localpart - that are of type String )

TOP 5 in size are char[], byte[] and Class[], java.lang.String and short[].
The remnants of byte[] on the heap show incoming references mainly from
java.util.zip, java.util.jar and the class arrays.
The class arrays, I believe, *might* be indications of such a hanging
List/Map/HashMap/Hashtable ( correct? )

Following the number of object instances created during execution ( 15 or
150 runs ), you can see the Garbage Collector at work... char[] grows, but
never exceeds 2.6 MB ( total heap never exceeds 5.5 to 6 MB during 150
runs )

A snapshot taken after performing a garbage collect indicates about 600K of
memory being released (2 MB after 15 runs - which seems to indicate that there
is indeed an accumulation of some sort taking place, but it remains 2 MB for
150 runs - I guess this has more to do with the automatic GC being performed
only when it becomes necessary), so it does have some effect.
After GC, char[] still takes up about 600K ( the same amount in all three
test cases )

In short, it seems to me that there's nothing really anomalous. ( The
class arrays, being referenced by java.util.zip and java.util.jar - which
actually comes down to the same thing, since jar subclasses zip - seem to
have to do with the way the VM treats the jars on the classpath... Checking
the instance data reveals lots of these to be referenced by
java.util.jar.Manifest ).

Then again, I may have overlooked something... Feel free to correct me :)

Enjoy!


Andreas Delmelle





Re: Memory consumption

2003-09-18 Thread J.Pietschmann
Timo Haberkern wrote:
i have big problems. I use FOP embedded into a java application to 
create approx. 250 PDF Files. In each of the files are 5 or more images 
(JPGs or SVGs). The problem is that FOP consumes more and more memory. 
This seems to indicate a memory leak, however, I wasn't able
to reproduce this. I have a server running FOP rendering dozens
of PDF files per day up for months, without any sign of a memory
leak in FOP.
There are a few suspects in your code:
   Options options = new Options(userConfigFile);
Unless your user config file changes, do this only once.
   InputHandler inputHandler = new XSLTInputHandler(
Try to use a Transformer object instead, as shown in some other
examples.
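( A minimal sketch of what this could look like, modeled on the FOP 0.20.x
embedding examples: the Options and the Transformer are created once and
reused, and the XSLT result is fed to FOP as SAX events instead of going
through XSLTInputHandler. Driver.setOutputStream() and Driver.getContentHandler()
are assumed from that API; adjust names and paths to your own setup. )

  // needs: javax.xml.transform.*, javax.xml.transform.sax.SAXResult,
  //        javax.xml.transform.stream.StreamSource

  // Create these once, outside the rendering loop:
  Options options = new Options(userConfigFile);        // parse the user config only once
  Transformer transformer = TransformerFactory.newInstance()
          .newTransformer(new StreamSource(xsltFile));

  // Per document:
  FileOutputStream out = new FileOutputStream(pdfFile);
  Driver driver = new Driver();
  driver.setRenderer(Driver.RENDER_PDF);
  driver.setOutputStream(out);
  // Pipe the transformation result straight into FOP:
  transformer.transform(new StreamSource(xmlFile),
                        new SAXResult(driver.getContentHandler()));
  out.close();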
Nothing helped? Is there no possibility to get rid of this reserved
memory?
If you could get a memory profiler and track down which kind of
object locks down the memory, you'll probably earn the immeasurable
gratitude of the community.
Oh well, FOP before 0.20.4 *does* have memory leaks. Which version
are you using? Which XML parser? Any DTD/schema reference in your
XML? Which XSLT processor? Which JDK? Loong stretches of text
in your transformation result (may cause some static buffers to grow
unreasonably large)?
J.Pietschmann



Re: Memory consumption

2003-09-18 Thread J.Pietschmann
Ben Galbraith wrote:
If you haven't figured it out yet, let me inform you: FOP suffers from 
large memory leaks.
Where? Class names?
FOP often uses a huge amount of memory during rendering, but once the
driver object is released, most of it is eligible for GC. Memory locked
across rendering runs must be referenced by static variables, and a
careful inspection of the code shows there aren't many such things.
Most notably:
- Image cache. No issue if it is reset.
- A few string buffers used as, well, buffers in various places. They
 should grow on the order of text strings in the input, at most on the
 order of the total text content in the largest block. A complete analysis
 is still pending.
- Configuration data.
This does not rule out hidden leaks elsewhere, perhaps
introduced through the Java run time library. If anybody experiencing
rapid growth of locked-down memory after several runs of FOP could
run experiments to see which kind of FO constructs is likely responsible for
this, or preferably run a memory profiler, this would be of great help
in tracking down the problem. My experiments so far failed to get a hold
on this.
J.Pietschmann



Re: Memory consumption

2003-09-18 Thread Ben Galbraith
J.Pietschmann wrote:
 If you haven't figured it out yet, let me inform you: FOP suffers 
from large memory leaks.


 Where? Class names?

I haven't tracked it down that far yet; if I had, I'd submit a patch to 
Bugzilla.

I render hundreds of 4-page, image-intensive PDFs over and over again.
As FOP runs, the amount of heap memory it consumes rises linearly,
over and over again, until it hits the limit specified by
-Xmx, at which point it throws an OutOfMemoryError.

I conclude that the issue is with FOP, as it is the only process 
involved in the loop that can consume an entire 1 GB heap.  Each 
individual process is relatively small, but when I generate so many 
back-to-back, it all adds up.

I have the following FOP code in my loop (Exception handling removed for 
readability):

  FopImageFactory.resetCache();
  File configDir = new File(fopHome, "conf");
  File fopConfigFile = new File(configDir, "userconfig.xml");
  Options options = new Options(fopConfigFile);
  FileInputStream in = new FileInputStream(fopFile);
  FileOutputStream out = new FileOutputStream(pdfFile);
  Driver driver = new Driver(new InputSource(in), out);
  driver.setRenderer(Driver.RENDER_PDF);
  driver.run();
  in.close();
  out.close();
If I'm doing something wrong, please point it out.  Otherwise, it seems 
pretty obvious to me that FOP is the culprit.  If you like, I can start 
gathering metrics from the JVM itself.  I don't own a commercial 
profiling utility, otherwise I'd run that.

Ben


Re: Memory consumption

2003-09-17 Thread Ben Galbraith
Robert,
Robert C. Leif wrote:
Most memory leaks are the result of using pointers. If a language provides a
construct that can replace the use of pointers, then this problem can be
minimized. Java has significant overhead because it checks its dispatching
at run-time rather than at compile time. 
As you may know, there are two types of memory leaks: physical leaks 
resulting from bad pointer handling, and logical leaks which occur as 
described below -- when a program holds on to data longer than it 
should.  The classic Java example is the Map-based cache that grows forever.
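
( For illustration, a minimal sketch of that kind of logical leak - a static
cache that is only ever added to. The class and method names here are made up
for the example, not taken from FOP. )

  import java.util.HashMap;
  import java.util.Map;

  public class LeakyCache {
      // Static, so it survives every rendering run:
      private static final Map CACHE = new HashMap();

      public static Object get(String key) {
          Object value = CACHE.get(key);
          if (value == null) {
              value = expensiveLoad(key);
              CACHE.put(key, value);   // entries are added but never removed,
          }                            // so the map and everything it references
          return value;                // can never be garbage collected
      }

      private static Object expensiveLoad(String key) {
          return new byte[1024];       // stand-in for the real expensive object
      }
  }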

Java's GC automatically handles the classic pointer-related memory 
leaks; the logical leaks are nastier beasts to track down and require 
some sort of memory heap snapshot or real-time profiler.  Ada, C, C++, 
C#, VB, Perl, Python, etc. are all susceptible to this type of memory 
leak, just as each of those languages is capable of storing information 
in arrays or array-like structures.

Sadly, our FOP also suffers from one or more logical memory leaks. 
Tracking it down really shouldn't be that hard; someone just needs to 
use the JDK profiling tools, a third-party tool, or their eyes to track 
down where FOP isn't dereferencing the appropriate objects.  It may very 
well be that a List or Map is hanging around someplace.

Ben

Bob Leif
Robert C. Leif, Ph.D.
Email [EMAIL PROTECTED]
-Original Message-
From: Ben Galbraith [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 8:22 AM
To: [EMAIL PROTECTED]
Subject: Re: Memory consumption

Robert C. Leif wrote:
This memory leak probably would not have occurred if you had used Ada.
Many
If *I* had used Ada?  :-)  I've contributed 0.1% of the FOP code (a 
measly patch for CMYK images); don't look @ me!

Let's not get into debates about superior languages; I think time has 
shown the topic to be a morass of flame wars and pointless arguments.

As far as memory leaks go, the notion of a program continuing to 
reference memory unnecessarily is fairly language agnostic.

Ben

of the items that Java internally represents as pointers can be coded as
Ada
generics (templates), which incidentally can be combined with tagged types
(classes)to provide a very flexible form of inheritance. Java is still not
portable, since it requires its own environment and is not an ISO
standard.
Ada is an ISO standard. Even C# would have been a better choice; at least
it
is an ECMA standard and has a decent execution speed.
Bob Leif
Robert C. Leif, Ph.D.
Email [EMAIL PROTECTED]
-Original Message-
From: Ben Galbraith [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 7:28 AM
To: [EMAIL PROTECTED]
Subject: Re: Memory consumption

Folks,
If you haven't figured it out yet, let me inform you: FOP suffers from 
large memory leaks.  A memory leak in Java is nothing mysterious; it 
occurs because a program never dereferences objects, which prevents the 
Java garbage collector thread from reclaiming them.  Thus, no matter how 
many times you try to tell the GC thread to collect (with System.gc() 
and other nonsense) the memory will never be reclaimed.

There are only two solutions:
1. Split up FOP generation into discrete jobs, and spawn a new JVM to 
generate each job.  You can get fancy and create a system that uses a 
spawned JVM until it runs out of memory -- use the Runtime objects 
memory methods to check.

2. Fix FOP's memory leak problem.
I've had this on my to-do list to patch in maintenance for some time, 
but frankly, for me it was much cheaper to distribute FOP jobs across 
our network in parallel jobs running on multiple JVMs.  Parallel 
computing, baby.

Ben
Ganesh wrote:

If you can afford the gc time consumption then there is a sure way of
garbage collection. This method will ensure that garbage is collected
for sure...Use the Sizeof class as given in the java world article
below. This is a sure way of garbage collection, but then it slows down
the system !
http://www.javaworld.com/javaworld/javatips/jw-javatip130.html


-Original Message-
From: Dennis Myrén [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 7:12 PM
To: [EMAIL PROTECTED]
Subject: RE: Memory consumption

I am not an expert in java memory handling either,
But I suggest you release all handles after each run in the loop, And
then perform a garbage collect.
Regards,
dennis.myren
-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 15:38
To: [EMAIL PROTECTED]



snip/
I'm not really the expert on JVM Memory Management, but AFAICT these 
declarations belong outside the 'for'-loop. (Not sure whether this is 
causing memory problems, but it just seems ... more elegant. If they 
really do not depend on the variables changing in the loop, that is... 
If behaviour would be what I'm guessing, then these would consume 
memory - the total of which would only be released on completion of the

loop...)
snip/
No :-( That doesn't help anything...
Any other

RE: Memory consumption

2003-09-16 Thread Andreas L. Delmelle
 -Original Message-
 From: Timo Haberkern [mailto:[EMAIL PROTECTED]

 Hello list,

 i have big problems. I use FOP embedded into a java application to
 create approx. 250 PDF Files. In each of the files are 5 or more images
 (JPGs or SVGs). The problem is that FOP consumes more and more memory. I
 run FOP like this:

 for ()
 {
 org.apache.fop.apps.Driver driver = new org.apache.fop.apps.Driver();
 driver.setRenderer(org.apache.fop.apps.Driver.RENDER_PDF);

 String strTemp = System.getProperty("java.io.tmpdir") + ELKPUBLISH;


 org.apache.fop.configuration.Configuration.put("baseDir", "../publish");
 File userConfigFile = new File("../publish/config.xml");
 Options options = new Options(userConfigFile);



snip/

I'm not really the expert on JVM Memory Management, but AFAICT these
declarations belong outside the 'for'-loop. (Not sure whether this is
causing memory problems, but it just seems ... more elegant. If they really
do not depend on the variables changing in the loop, that is... If behaviour
would be what I'm guessing, then these would consume memory - the total of
which would only be released on completion of the loop...)

snip/

 driver = null;

You won't be needing this. Just resetting the Driver should be ok.
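
( A small sketch of what that reuse could look like, assuming the 0.20.x
Driver methods referenced in this thread plus setInputSource()/setOutputStream();
untested, so treat it as an outline rather than a drop-in fix. )

  Driver driver = new Driver();
  driver.setRenderer(Driver.RENDER_PDF);
  // foFiles / pdfFiles: your arrays of input and output files
  for (int i = 0; i < foFiles.length; i++) {
      driver.reset();                          // clear per-document state instead of new Driver()
      FileInputStream in = new FileInputStream(foFiles[i]);
      FileOutputStream out = new FileOutputStream(pdfFiles[i]);
      driver.setInputSource(new InputSource(in));
      driver.setOutputStream(out);
      driver.run();
      in.close();
      out.close();
  }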

I also notice you have read this (?)
http://xml.apache.org/fop/running.html#memory

Have you tried the multiple page-sequences tip?


Hope it helps.


Greetz,

Andreas Delmelle





RE: Memory consumption

2003-09-16 Thread Ganesh


org.apache.fop.image.FopImageFactory.resetCache();

I am no expert in memory mgmt either. Just wanted to add to the remark
already made.
If you reset the cache, the cached images have to be recreated on every
invocation. This is time-consuming, particularly if the same set of images
is used in all 250 PDFs. Also try increasing the heap size of your JVM,
and see if that mitigates your problem!

Ganesh

-Original Message-
From: Andreas L. Delmelle [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 6:14 PM
To: [EMAIL PROTECTED]
Subject: RE: Memory consumption


 -Original Message-
 From: Timo Haberkern [mailto:[EMAIL PROTECTED]

 Hello list,

 i have big problems. I use FOP embedded into a java application to 
 create approx. 250 PDF Files. In each of the files are 5 or more 
 images (JPGs or SVGs). The problem is that FOP consumes more and more 
 memory. I run FOP like this:

 for ()
 {
 org.apache.fop.apps.Driver driver = new
org.apache.fop.apps.Driver();
 driver.setRenderer(org.apache.fop.apps.Driver.RENDER_PDF);

 String strTemp = System.getProperty("java.io.tmpdir") + ELKPUBLISH;



 org.apache.fop.configuration.Configuration.put("baseDir", "../publish");
 File userConfigFile = new File("../publish/config.xml");
 Options options = new Options(userConfigFile);



snip/

I'm not really the expert on JVM Memory Management, but AFAICT these
declarations belong outside the 'for'-loop. (Not sure whether this is
causing memory problems, but it just seems ... more elegant. If they
really do not depend on the variables changing in the loop, that is...
If behaviour would be what I'm guessing, then these would consume memory
- the total of which would only be released on completion of the
loop...)

snip/

 driver = null;

You won't be needing this. Just resetting the Driver should be ok.

I also notice you have read this (?)
http://xml.apache.org/fop/running.html#memory

Have you tried the multiple page-sequences tip?


Hope it helps.


Greetz,

Andreas Delmelle





Re: Memory consumption

2003-09-16 Thread Timo Haberkern

snip/
I'm not really the expert on JVM Memory Management, but AFAICT these
declarations belong outside the 'for'-loop. (Not sure whether this is
causing memory problems, but it just seems ... more elegant. If they really
do not depend on the variables changing in the loop, that is... If behaviour
would be what I'm guessing, then these would consume memory - the total of
which would only be released on completion of the loop...)
snip/
No :-( That doesn't help anything...
Any other ideas?
   driver = null;
   

You won't be needing this. Just resetting the Driver should be ok.
 

mhmm, I tried that! I thought it might help a little bit, but it
doesn't. The memory remains anyhow...

I also notice you have read this (?)
http://xml.apache.org/fop/running.html#memory
Have you tried the multiple page-sequences tip?
Every PDF File is only 2 pages long. And the memory is consumed for PDFs 
with big images...

My problem is that I can't get the memory back down after rendering one PDF
and before the next rendering...

regds
Timo


RE: Memory consumption

2003-09-16 Thread Dennis Myrén
I am not an expert in Java memory handling either,
but I suggest you release all handles after each run in the loop,
and then perform a garbage collect.


Regards,
dennis.myren

-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 15:38
To: [EMAIL PROTECTED]


snip/

I'm not really the expert on JVM Memory Management, but AFAICT these
declarations belong outside the 'for'-loop. (Not sure whether this is
causing memory problems, but it just seems ... more elegant. If they really
do not depend on the variables changing in the loop, that is... If behaviour
would be what I'm guessing, then these would consume memory - the total of
which would only be released on completion of the loop...)

snip/

No :-( That doesn't help anything...

Any other ideas?

driver = null;



You won't be needing this. Just resetting the Driver should be ok.
  

mhmm, that was i try! I thought that it maybe helps a little bit but it 
doesn't. But it remains anyhow...

I also notice you have read this (?)
http://xml.apache.org/fop/running.html#memory

Have you tried the multiple page-sequences tip?

Every PDF File is only 2 pages long. And the memory is consumed for PDFs 
with big images...

My problem is that i can't get down the memory after rendering one PDF 
and before the next rendering...

regds

Timo





Re: Memory consumption

2003-09-16 Thread Timo Haberkern

org.apache.fop.image.FopImageFactory.resetCache();
   

I am a no expert in memory mgmt too. Just wanna add to the remark
already made.
If you reset the cache, the memory has to be recreated for every
invocation. This is time consuming particularly if same set of images
needs to be used in all the 250 PDF. Also try increasing your heap size
of JVM, if that mitigates your problem !
I have done that, but the current value is 512 MB and I still have the
problem. You are right, resetting the image cache slows down the
rendering, but that is not that critical in my case. More critical is
the memory.

I'm thinking of moving the start of the rendering into a separate process
(i.e. in a batch file). I think that would slow down the rendering,
but it would help with memory...

Timo


Re: Memory consumption

2003-09-16 Thread Timo Haberkern
Aha, and how do I release the handles? I thought a driver.reset() would be
exactly that - the release of all resources?

rgds
Timo
Dennis Myrén wrote:
I am not an expert in java memory handling either,
But I suggest you release all handles after each run in the loop,
And then perform a garbage collect.
Regards,
dennis.myren
-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 15:38
To: [EMAIL PROTECTED]

 

snip/
I'm not really the expert on JVM Memory Management, but AFAICT these
declarations belong outside the 'for'-loop. (Not sure whether this is
causing memory problems, but it just seems ... more elegant. If they really
do not depend on the variables changing in the loop, that is... If behaviour
would be what I'm guessing, then these would consume memory - the total of
which would only be released on completion of the loop...)
snip/
   

No :-( That doesn't help anything...
Any other ideas?
 

  driver = null;
  

 

You won't be needing this. Just resetting the Driver should be ok.
   

mhmm, that was i try! I thought that it maybe helps a little bit but it 
doesn't. But it remains anyhow...

 

I also notice you have read this (?)
http://xml.apache.org/fop/running.html#memory
Have you tried the multiple page-sequences tip?
   

Every PDF File is only 2 pages long. And the memory is consumed for PDFs 
with big images...

My problem is that i can't get down the memory after rendering one PDF 
and before the next rendering...

regds
Timo


RE: Memory consumption

2003-09-16 Thread Ganesh


If you can afford the GC time consumption then there is a sure way of
garbage collection. This method will ensure that garbage is collected
for sure... Use the Sizeof class as given in the JavaWorld article
below. This is a sure way of garbage collection, but then it slows down
the system!

http://www.javaworld.com/javaworld/javatips/jw-javatip130.html
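
( The gist of that tip, roughly sketched: repeatedly run finalization and GC
until the used heap stops shrinking. This is not the article's actual Sizeof
class, just the idiom it describes. )

  static void forceGC() {
      Runtime rt = Runtime.getRuntime();
      long used = rt.totalMemory() - rt.freeMemory();
      long previous = Long.MAX_VALUE;
      while (used < previous) {            // stop once a pass no longer frees anything
          rt.runFinalization();
          rt.gc();
          try { Thread.sleep(100); } catch (InterruptedException ignored) { }
          previous = used;
          used = rt.totalMemory() - rt.freeMemory();
      }
  }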

 



-Original Message-
From: Dennis Myrén [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 7:12 PM
To: [EMAIL PROTECTED]
Subject: RE: Memory consumption


I am not an expert in java memory handling either,
But I suggest you release all handles after each run in the loop, And
then perform a garbage collect.


Regards,
dennis.myren

-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 15:38
To: [EMAIL PROTECTED]


snip/

I'm not really the expert on JVM Memory Management, but AFAICT these 
declarations belong outside the 'for'-loop. (Not sure whether this is 
causing memory problems, but it just seems ... more elegant. If they 
really do not depend on the variables changing in the loop, that is... 
If behaviour would be what I'm guessing, then these would consume 
memory - the total of which would only be released on completion of the

loop...)

snip/

No :-( That doesn't help anything...

Any other ideas?

driver = null;



You won't be needing this. Just resetting the Driver should be ok.
  

mhmm, that was i try! I thought that it maybe helps a little bit but it 
doesn't. But it remains anyhow...

I also notice you have read this (?) 
http://xml.apache.org/fop/running.html#memory

Have you tried the multiple page-sequences tip?

Every PDF File is only 2 pages long. And the memory is consumed for PDFs

with big images...

My problem is that i can't get down the memory after rendering one PDF 
and before the next rendering...

regds

Timo





RE: Memory consumption

2003-09-16 Thread Dennis Myrén
I mean by assigning null to the instance references and then calling System.gc().

Regards dennis
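
( In code terms, the suggestion amounts to something like the following -
purely illustrative, and it only helps if nothing else still references
those objects. )

  driver.run();
  in.close();
  out.close();
  driver = null;        // drop the handles...
  in = null;
  out = null;
  System.gc();          // ...then ask for a collection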

-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 15:49
To: [EMAIL PROTECTED]

Aha, and how to release the handles!?! I thought a driver.reset would be 
exactly the release of all resources?

rgds
Timo

Dennis Myrén wrote:

I am not an expert in java memory handling either,
But I suggest you release all handles after each run in the loop,
And then perform a garbage collect.


Regards,
dennis.myren

-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 15:38
To: [EMAIL PROTECTED]


  

snip/

I'm not really the expert on JVM Memory Management, but AFAICT these
declarations belong outside the 'for'-loop. (Not sure whether this is
causing memory problems, but it just seems ... more elegant. If they really
do not depend on the variables changing in the loop, that is... If behaviour
would be what I'm guessing, then these would consume memory - the total of
which would only be released on completion of the loop...)

snip/



No :-( That doesn't help anything...

Any other ideas?

  

   driver = null;
   

  

You won't be needing this. Just resetting the Driver should be ok.
 



mhmm, that was i try! I thought that it maybe helps a little bit but it 
doesn't. But it remains anyhow...

  

I also notice you have read this (?)
http://xml.apache.org/fop/running.html#memory

Have you tried the multiple page-sequences tip?



Every PDF File is only 2 pages long. And the memory is consumed for PDFs 
with big images...

My problem is that i can't get down the memory after rendering one PDF 
and before the next rendering...

regds

Timo





Re: Memory consumption

2003-09-16 Thread Timo Haberkern
That doesn't help either :-( The FOP classes don't come down from their high
memory consumption

Timo
Ganesh wrote:
If you can afford the gc time consumption then there is a sure way of
garbage collection. This method will ensure that garbage is collected
for sure...Use the Sizeof class as given in the java world article
below. This is a sure way of garbage collection, but then it slows down
the system !
http://www.javaworld.com/javaworld/javatips/jw-javatip130.html
 




Re: Memory consumption

2003-09-16 Thread Timo Haberkern
But I have done that, as you can see in my initial code snippet. Or am I
wrong?

Timo
Dennis Myrén wrote:
I mean by assigning null values to the instances and then to the System.gc()
Regards dennis
-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 15:49
To: [EMAIL PROTECTED]

Aha, and how to release the handles!?! I thought a driver.reset would be 
exactly the release of all resources?

rgds
Timo
Dennis Myrén wrote:
 




RE: Memory consumption

2003-09-16 Thread Ganesh


Try using smaller images; you might think of an alternative which uses
high-compression-ratio JPG files!

Can you let the group know what exactly you mean by high memory
consumption (in tangible quantities)? Probably someone can validate
whether it is acceptably high or not.
 

-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 7:49 PM
To: [EMAIL PROTECTED]
Subject: Re: Memory consumption


That doesn't help too :-( The FOP classes don't come down of their high 
memory consumption

Timo

Ganesh wrote:

If you can afford the gc time consumption then there is a sure way of 
garbage collection. This method will ensure that garbage is collected 
for sure...Use the Sizeof class as given in the java world article 
below. This is a sure way of garbage collection, but then it slows down

the system !

http://www.javaworld.com/javaworld/javatips/jw-javatip130.html
  






Re: Memory consumption

2003-09-16 Thread Ben Galbraith
Folks,
If you haven't figured it out yet, let me inform you: FOP suffers from 
large memory leaks.  A memory leak in Java is nothing mysterious; it 
occurs because a program never dereferences objects, which prevents the 
Java garbage collector thread from reclaiming them.  Thus, no matter how 
many times you try to tell the GC thread to collect (with System.gc() 
and other nonsense) the memory will never be reclaimed.

There are only two solutions:
1. Split up FOP generation into discrete jobs, and spawn a new JVM to 
generate each job.  You can get fancy and create a system that uses a 
spawned JVM until it runs out of memory -- use the Runtime object's 
memory methods to check.
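
( A rough sketch of that Runtime check: hand the next job to a freshly
spawned JVM once the current one gets close to its -Xmx limit. The 80%
threshold and the retireThisWorker() hand-off are placeholders, not part
of FOP. )

  Runtime rt = Runtime.getRuntime();
  long used = rt.totalMemory() - rt.freeMemory();
  long max = rt.maxMemory();                   // available since JDK 1.4
  if (used > 0.8 * max) {
      // This worker is nearly full: stop accepting jobs and let a freshly
      // spawned JVM (e.g. started from a batch script) take over.
      retireThisWorker();                      // placeholder for your own hand-off logic
  }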

2. Fix FOP's memory leak problem.
I've had this on my to-do list to patch in maintenance for some time, 
but frankly, for me it was much cheaper to distribute FOP jobs across 
our network in parallel jobs running on multiple JVMs.  Parallel 
computing, baby.

Ben
Ganesh wrote:
If you can afford the gc time consumption then there is a sure way of
garbage collection. This method will ensure that garbage is collected
for sure...Use the Sizeof class as given in the java world article
below. This is a sure way of garbage collection, but then it slows down
the system !
http://www.javaworld.com/javaworld/javatips/jw-javatip130.html
 


-Original Message-
From: Dennis Myrén [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 7:12 PM
To: [EMAIL PROTECTED]
Subject: RE: Memory consumption

I am not an expert in java memory handling either,
But I suggest you release all handles after each run in the loop, And
then perform a garbage collect.
Regards,
dennis.myren
-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 15:38
To: [EMAIL PROTECTED]


snip/
I'm not really the expert on JVM Memory Management, but AFAICT these 
declarations belong outside the 'for'-loop. (Not sure whether this is 
causing memory problems, but it just seems ... more elegant. If they 
really do not depend on the variables changing in the loop, that is... 
If behaviour would be what I'm guessing, then these would consume 
memory - the total of which would only be released on completion of the

loop...)
snip/
No :-( That doesn't help anything...
Any other ideas?

  driver = null;
  

You won't be needing this. Just resetting the Driver should be ok.

mhmm, that was i try! I thought that it maybe helps a little bit but it 
doesn't. But it remains anyhow...


I also notice you have read this (?) 
http://xml.apache.org/fop/running.html#memory

Have you tried the multiple page-sequences tip?
Every PDF File is only 2 pages long. And the memory is consumed for PDFs
with big images...
My problem is that i can't get down the memory after rendering one PDF 
and before the next rendering...

regds
Timo


RE: Memory consumption

2003-09-16 Thread Dennis Myrén
Sorry, I didn't see that.
In that case I have no other advice, except maybe to use the command line tools
instead.
I have myself set up a Windows computer (256 MB RAM, 1 GHz CPU) to generate
50 PDF documents at a time using the command line tools that come with FOP,
often with more than 10,000 lines in each file. Works great.

-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 16:20
To: [EMAIL PROTECTED]

But i have done that as you can see in my initial code snippet. Or am i 
wrong?

Timo

Dennis Myrén wrote:

I mean by assigning null values to the instances and then to the System.gc()

Regards dennis

-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 15:49
To: [EMAIL PROTECTED]

Aha, and how to release the handles!?! I thought a driver.reset would be 
exactly the release of all resources?

rgds
Timo

Dennis Myrén wrote:

  







RE: Memory consumption

2003-09-16 Thread Robert C. Leif
This memory leak probably would not have occurred if you had used Ada. Many
of the items that Java internally represents as pointers can be coded as Ada
generics (templates), which incidentally can be combined with tagged types
(classes) to provide a very flexible form of inheritance. Java is still not
portable, since it requires its own environment and is not an ISO standard.
Ada is an ISO standard. Even C# would have been a better choice; at least it
is an ECMA standard and has a decent execution speed.
Bob Leif
Robert C. Leif, Ph.D.
Email [EMAIL PROTECTED]

-Original Message-
From: Ben Galbraith [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 7:28 AM
To: [EMAIL PROTECTED]
Subject: Re: Memory consumption

Folks,

If you haven't figured it out yet, let me inform you: FOP suffers from 
large memory leaks.  A memory leak in Java is nothing mysterious; it 
occurs because a program never dereferences objects, which prevents the 
Java garbage collector thread from reclaiming them.  Thus, no matter how 
many times you try to tell the GC thread to collect (with System.gc() 
and other nonsense) the memory will never be reclaimed.

There are only two solutions:

1. Split up FOP generation into discrete jobs, and spawn a new JVM to 
generate each job.  You can get fancy and create a system that uses a 
spawned JVM until it runs out of memory -- use the Runtime objects 
memory methods to check.

2. Fix FOP's memory leak problem.

I've had this on my to-do list to patch in maintenance for some time, 
but frankly, for me it was much cheaper to distribute FOP jobs across 
our network in parallel jobs running on multiple JVMs.  Parallel 
computing, baby.

Ben

Ganesh wrote:
 
 If you can afford the gc time consumption then there is a sure way of
 garbage collection. This method will ensure that garbage is collected
 for sure...Use the Sizeof class as given in the java world article
 below. This is a sure way of garbage collection, but then it slows down
 the system !
 
 http://www.javaworld.com/javaworld/javatips/jw-javatip130.html
 
  
 
 
 
 -Original Message-
 From: Dennis Myrén [mailto:[EMAIL PROTECTED] 
 Sent: Tuesday, September 16, 2003 7:12 PM
 To: [EMAIL PROTECTED]
 Subject: RE: Memory consumption
 
 
 I am not an expert in java memory handling either,
 But I suggest you release all handles after each run in the loop, And
 then perform a garbage collect.
 
 
 Regards,
 dennis.myren
 
 -Original Message-
 From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
 Sent: 16. september 2003 15:38
 To: [EMAIL PROTECTED]
 
 
 
snip/

I'm not really the expert on JVM Memory Management, but AFAICT these 
declarations belong outside the 'for'-loop. (Not sure whether this is 
causing memory problems, but it just seems ... more elegant. If they 
really do not depend on the variables changing in the loop, that is... 
If behaviour would be what I'm guessing, then these would consume 
memory - the total of which would only be released on completion of the
 
 
loop...)

snip/

 No :-( That doesn't help anything...
 
 Any other ideas?
 
 
   driver = null;
   


You won't be needing this. Just resetting the Driver should be ok.
 

 
 mhmm, that was i try! I thought that it maybe helps a little bit but it 
 doesn't. But it remains anyhow...
 
 
I also notice you have read this (?) 
http://xml.apache.org/fop/running.html#memory

Have you tried the multiple page-sequences tip?

 
 Every PDF File is only 2 pages long. And the memory is consumed for PDFs
 
 with big images...
 
 My problem is that i can't get down the memory after rendering one PDF 
 and before the next rendering...
 
 regds
 
 Timo
 
 

Re: Memory consumption

2003-09-16 Thread Ben Galbraith
Robert C. Leif wrote:
This memory leak probably would not have occurred if you had used Ada. Many
If *I* had used Ada?  :-)  I've contributed 0.1% of the FOP code (a 
measly patch for CMYK images); don't look @ me!

Let's not get into debates about superior languages; I think time has 
shown the topic to be a morass of flame wars and pointless arguments.

As far as memory leaks go, the notion of a program continuing to 
reference memory unnecessarily is fairly language agnostic.

Ben
of the items that Java internally represents as pointers can be coded as Ada
generics (templates), which incidentally can be combined with tagged types
(classes)to provide a very flexible form of inheritance. Java is still not
portable, since it requires its own environment and is not an ISO standard.
Ada is an ISO standard. Even C# would have been a better choice; at least it
is an ECMA standard and has a decent execution speed.
Bob Leif
Robert C. Leif, Ph.D.
Email [EMAIL PROTECTED]
-Original Message-
From: Ben Galbraith [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 7:28 AM
To: [EMAIL PROTECTED]
Subject: Re: Memory consumption

Folks,
If you haven't figured it out yet, let me inform you: FOP suffers from 
large memory leaks.  A memory leak in Java is nothing mysterious; it 
occurs because a program never dereferences objects, which prevents the 
Java garbage collector thread from reclaiming them.  Thus, no matter how 
many times you try to tell the GC thread to collect (with System.gc() 
and other nonsense) the memory will never be reclaimed.

There are only two solutions:
1. Split up FOP generation into discrete jobs, and spawn a new JVM to 
generate each job.  You can get fancy and create a system that uses a 
spawned JVM until it runs out of memory -- use the Runtime objects 
memory methods to check.

2. Fix FOP's memory leak problem.
I've had this on my to-do list to patch in maintenance for some time, 
but frankly, for me it was much cheaper to distribute FOP jobs across 
our network in parallel jobs running on multiple JVMs.  Parallel 
computing, baby.

Ben
Ganesh wrote:
If you can afford the gc time consumption then there is a sure way of
garbage collection. This method will ensure that garbage is collected
for sure...Use the Sizeof class as given in the java world article
below. This is a sure way of garbage collection, but then it slows down
the system !
http://www.javaworld.com/javaworld/javatips/jw-javatip130.html


-Original Message-
From: Dennis Myrén [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 7:12 PM
To: [EMAIL PROTECTED]
Subject: RE: Memory consumption

I am not an expert in java memory handling either,
But I suggest you release all handles after each run in the loop, And
then perform a garbage collect.
Regards,
dennis.myren
-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 15:38
To: [EMAIL PROTECTED]



snip/
I'm not really the expert on JVM Memory Management, but AFAICT these 
declarations belong outside the 'for'-loop. (Not sure whether this is 
causing memory problems, but it just seems ... more elegant. If they 
really do not depend on the variables changing in the loop, that is... 
If behaviour would be what I'm guessing, then these would consume 
memory - the total of which would only be released on completion of the

loop...)
snip/
No :-( That doesn't help anything...
Any other ideas?

 driver = null;
 

You won't be needing this. Just resetting the Driver should be ok.

mhmm, that was i try! I thought that it maybe helps a little bit but it 
doesn't. But it remains anyhow...


I also notice you have read this (?) 
http://xml.apache.org/fop/running.html#memory

Have you tried the multiple page-sequences tip?
Every PDF File is only 2 pages long. And the memory is consumed for PDFs
with big images...
My problem is that i can't get down the memory after rendering one PDF 
and before the next rendering...

regds
Timo

RE: Memory consumption

2003-09-16 Thread Robert C. Leif
Most memory leaks are the result of using pointers. If a language provides a
construct that can replace the use of pointers, then this problem can be
minimized. Java has significant overhead because it checks its dispatching
at run-time rather than at compile time. 
Bob Leif
Robert C. Leif, Ph.D.
Email [EMAIL PROTECTED]

-Original Message-
From: Ben Galbraith [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 8:22 AM
To: [EMAIL PROTECTED]
Subject: Re: Memory consumption

Robert C. Leif wrote:
 This memory leak probably would not have occurred if you had used Ada.
Many

If *I* had used Ada?  :-)  I've contributed 0.1% of the FOP code (a 
measly patch for CMYK images); don't look @ me!

Let's not get into debates about superior languages; I think time has 
shown the topic to be a morass of flame wars and pointless arguments.

As far as memory leaks go, the notion of a program continuing to 
reference memory unnecessarily is fairly language agnostic.

Ben

 of the items that Java internally represents as pointers can be coded as
Ada
 generics (templates), which incidentally can be combined with tagged types
 (classes)to provide a very flexible form of inheritance. Java is still not
 portable, since it requires its own environment and is not an ISO
standard.
 Ada is an ISO standard. Even C# would have been a better choice; at least
it
 is an ECMA standard and has a decent execution speed.
 Bob Leif
 Robert C. Leif, Ph.D.
 Email [EMAIL PROTECTED]
 
 -Original Message-
 From: Ben Galbraith [mailto:[EMAIL PROTECTED] 
 Sent: Tuesday, September 16, 2003 7:28 AM
 To: [EMAIL PROTECTED]
 Subject: Re: Memory consumption
 
 Folks,
 
 If you haven't figured it out yet, let me inform you: FOP suffers from 
 large memory leaks.  A memory leak in Java is nothing mysterious; it 
 occurs because a program never dereferences objects, which prevents the 
 Java garbage collector thread from reclaiming them.  Thus, no matter how 
 many times you try to tell the GC thread to collect (with System.gc() 
 and other nonsense) the memory will never be reclaimed.
 
 There are only two solutions:
 
 1. Split up FOP generation into discrete jobs, and spawn a new JVM to 
 generate each job.  You can get fancy and create a system that uses a 
 spawned JVM until it runs out of memory -- use the Runtime objects 
 memory methods to check.
 
 2. Fix FOP's memory leak problem.
 
 I've had this on my to-do list to patch in maintenance for some time, 
 but frankly, for me it was much cheaper to distribute FOP jobs across 
 our network in parallel jobs running on multiple JVMs.  Parallel 
 computing, baby.
 
 Ben
 
 Ganesh wrote:
 
If you can afford the gc time consumption then there is a sure way of
garbage collection. This method will ensure that garbage is collected
for sure...Use the Sizeof class as given in the java world article
below. This is a sure way of garbage collection, but then it slows down
the system !

http://www.javaworld.com/javaworld/javatips/jw-javatip130.html

 



-Original Message-
From: Dennis Myrén [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, September 16, 2003 7:12 PM
To: [EMAIL PROTECTED]
Subject: RE: Memory consumption


I am not an expert in java memory handling either,
But I suggest you release all handles after each run in the loop, And
then perform a garbage collect.


Regards,
dennis.myren

-Original Message-
From: Timo Haberkern [mailto:[EMAIL PROTECTED] 
Sent: 16. september 2003 15:38
To: [EMAIL PROTECTED]




snip/

I'm not really the expert on JVM Memory Management, but AFAICT these 
declarations belong outside the 'for'-loop. (Not sure whether this is 
causing memory problems, but it just seems ... more elegant. If they 
really do not depend on the variables changing in the loop, that is... 
If behaviour would be what I'm guessing, then these would consume 
memory - the total of which would only be released on completion of the


loop...)

snip/

No :-( That doesn't help anything...

Any other ideas?



  driver = null;
  


You won't be needing this. Just resetting the Driver should be ok.



mhmm, that was i try! I thought that it maybe helps a little bit but it 
doesn't. But it remains anyhow...



I also notice you have read this (?) 
http://xml.apache.org/fop/running.html#memory

Have you tried the multiple page-sequences tip?


Every PDF File is only 2 pages long. And the memory is consumed for PDFs

with big images...

My problem is that i can't get down the memory after rendering one PDF 
and before the next rendering...

regds

Timo



Re: Memory consumption with large images in FOP

2002-05-02 Thread Jeff_Mitchell

J.Pietschmann-

Thanks for your help.  I think I may be running out of memory when viewing
the PDF.  I'll do some playing around on my end, but your explanations
should help a lot.

Thanks again,

-Jeff



  
From: J.Pietschmann [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Subject: Re: Memory consumption with large images in FOP
Date: 05/01/2002 06:00 PM
Please respond to: fop-user
  

  

  




[EMAIL PROTECTED] wrote:
 the resulting PDF has a black
 rectangle, where the image should be.

In PDF, images are rendered at 1/72 inch per pixel, or roughly
3.53 cm per 100 pixels. Higher-resolution images are scaled down
during rendering; you may be seeing a resampling artifact, or simply
a bug (probably in the PDF viewer). Be aware that the PDF viewer
will have to decompress your 4k*5k pixel images at 3 bytes per
pixel, resulting in allocating nearly 60 MB or more.
It is also possible that you are running into an arithmetic overflow
or some similar problem.
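
( A quick check of that figure, assuming an uncompressed RGB raster at
3 bytes per pixel; the pixel counts are the ones from the post above. )

  long bytes = 4000L * 5000L * 3L;   // 60,000,000 bytes, i.e. roughly 57 MB uncompressed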

 Is there a
 limit to the resolution of raster images FOP (or PDFs in general) can
 handle, or this perhaps a problem with memory?

In theory, there is no limit, at least not directly imposed by FOP.
FOP will, however, hold the file in memory, decompressed for some
formats.

 It appears the images contained within a PDF are compressed.  Does anyone
 know if this compression is JPEG,  and if so, does FOP just dump a given
 JPEG into the PDF file, or does it uncompress the original JPEG in
memory,
 and then recompress it to go into the PDF?

For JPEG images, the file is dumped into the PDF basically unchanged.

 Finally, I haven't found a definitive list of the image types allowed in
an
 external-graphic src attribute.  Does anyone know where I might find such
a
 list.


 From the sources: GIF, BMP, EPS and JPEG are supported natively, in
part through standard Java mechanisms. Not all subformats are necessarily
supported (BMP has at least a dozen, some very obscure,
and there are a few exotic JPEG subformats as well).
SVG is supported through Batik.
FOP can take advantage of the Jimi image library and of JAI. Jimi
is no longer distributed with FOP for license reasons; you
can get it yourself (there are instructions in the docs), but for
some odd reason you may have to rebuild
FOP from the sources, even though this is very easy nowadays.
Jimi supports:  AFP (dunno), BMP, CUR (MS Windows cursor?), GIF,
ICO (Windows icons), PCX, PICT (dunno), PNG, PSD (dunno), Sun
raster (used by Sun), TGA, TIFF and XBM (X Windows bitmap).
JAI is an interface specification; FOP will support whatever your
JAI-conformant implementation supports if you drop it in (may
require a rebuild).
This list is not authoritative, apply usual disclaimers.

J.Pietschmann