RE: Out of Memory Error from Java

2009-01-26 Thread Steffanina, Jeff
David,
Thanks for the additional input.   As your excellent article suggests,
the solution is not as easy to implement as the concept implies.

Specifically, my region-body is primarily a single, four-column, table
that is eventually followed by some summary text.  When the table
contains more than 4,500 lines I get the out-of-memory error.
Currently, the entire invoice is one page-sequence.  Your approach
appears to solve that problem and will keep the paging intact.  My
problem is finding a way to break a single table across multiple
page-sequences without ruining the spacing.

I am working on that now.

Thanks. Any additional thoughts/examples would be greatly
appreciated.
Here is my code.

LINE 64 =   <fo:page-sequence master-reference="multi">
...
...
...
...
LINE 335 =
 <fo:table table-layout="fixed" line-height=".10in"
   white-space-collapse="false" width="100%"
   border-collapse="collapse"
   fox:orphan-content-limit="1in">
   <fo:table-column column-width=".02in"/>
   <fo:table-column column-width="1.35in"/>
   <fo:table-column column-width="2.9in"/>
   <fo:table-column column-width=".69in"/>
   <fo:table-column column-width=".80in"/>

   <fo:table-body>
     <xsl:attribute name="white-space-collapse">false</xsl:attribute>
     <xsl:apply-templates select="./detail-line"/>
     <xsl:apply-templates select="./total-line"/>
   </fo:table-body>

 </fo:table>

...
...

LINE 479 = </fo:page-sequence>
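
One direction I am experimenting with is to emit one fo:page-sequence per
chunk of detail lines, repeating the same fo:table-column definitions in
every chunk so the spacing stays identical, and emitting the totals only
after the last chunk.  A very rough, untested XSLT 1.0 sketch; the chunk
size of 500 and the assumption that detail-line and total-line are siblings
under the current node are guesses that would need adjusting:

  <xsl:variable name="chunk-size" select="500"/>
  <xsl:for-each select="./detail-line[position() mod $chunk-size = 1]">
    <fo:page-sequence master-reference="multi">
      <fo:flow flow-name="xsl-region-body">
        <fo:table table-layout="fixed" width="100%" border-collapse="collapse">
          <!-- the same five column definitions as above, repeated verbatim
               so every chunk keeps identical column widths -->
          <fo:table-column column-width=".02in"/>
          <fo:table-column column-width="1.35in"/>
          <fo:table-column column-width="2.9in"/>
          <fo:table-column column-width=".69in"/>
          <fo:table-column column-width=".80in"/>
          <fo:table-body>
            <!-- this chunk = the starting detail line plus the next 499 -->
            <xsl:apply-templates select=". |
              following-sibling::detail-line[position() &lt; $chunk-size]"/>
            <!-- the totals go out once, in the last chunk only -->
            <xsl:if test="position() = last()">
              <xsl:apply-templates select="../total-line"/>
            </xsl:if>
          </fo:table-body>
        </fo:table>
      </fo:flow>
    </fo:page-sequence>
  </xsl:for-each>

The obvious cost is that each chunk starts on a new page, which for an
invoice of this length is presumably acceptable.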

Jeff 


-Original Message-
From: DavidJKelly [mailto:dke...@scriptorium.com] 
Sent: Friday, January 23, 2009 8:24 AM
To: fop-users@xmlgraphics.apache.org
Subject: RE: Out of Memory Error from Java


At the risk of providing a redundant answer, I would also like to point out a
paper I wrote on a specific method for creating multiple page sequences in
the DITA Open Toolkit.  It provides code examples and explanations for some
of the gotchas in using multiple page sequences.  There may be information
in this approach that you or others in this community would find helpful.

http://www.scriptorium.com/whitepapers/xslfo/index.html

Regards,
David Kelly



Steffanina, Jeff wrote:
 
 
 I read the article that you referenced below.   It suggests using
 multiple page sequences.  Currently, I have only one.
 
 I will make that change and test it.
 
 THANKS for the info!
 
 
 
 Jeff 
 
 
 
 
 
 -Original Message-
 From: Chris Bowditch [mailto:bowditch_ch...@hotmail.com] 
 Sent: Thursday, January 22, 2009 4:11 AM
 To: fop-users@xmlgraphics.apache.org
 Subject: Re: Out of Memory Error from Java
 
 Steffanina, Jeff wrote:
 
 I have:
 FOP 0.95
 Linux
 Java 1.5
 My Java starts with memory set as:  -Xmx192m
 
 I can print an unlimited number of individual invoices in a single
 Java 
 process (i.e. the user chooses the option to print).  For example,
 print 
 2,000 invoices where each  invoice contains 100 lines of details.  
 Therefore, print a total of 200,000 lines of detail.
 
 However, I receive an Out of Memory error from Java when I attempt
 to 
 print a single invoice  that contains more than 4,500 lines of
detail.
 
 Have you read:
 http://xmlgraphics.apache.org/fop/0.94/running.html#memory
 
 Specifically the part about breaking up the FO into multiple page
 sequences.
 
 
Other than continuing to increase the amount of memory at startup, is
there something I can do to prevent the error when printing a single,
large invoice?
 
 Regards,
 
 Chris
 
 
 
 -
 To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
 For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org
 
 
 -
 To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
 For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org
 
 
 

-- 
View this message in context:
http://www.nabble.com/Out-of-Memory-Error-from-Java-tp21591957p21624377.html
Sent from the FOP - Users mailing list archive at Nabble.com.


-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org


-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org



RE: Out of Memory Error from Java

2009-01-23 Thread DavidJKelly

At the risk of providing a redundant answer, I would also like to point out a
paper I wrote on a specific method for creating multiple page sequences in
the DITA Open Toolkit.  It provides code examples and explanations for some
of the gotchas in using multiple page sequences.  There may be information
in this approach that you or others in this community would find helpful.

http://www.scriptorium.com/whitepapers/xslfo/index.html

Regards,
David Kelly



Steffanina, Jeff wrote:
 
 
 I read the article that you referenced below.   It suggests using
 multiple page sequences.  Currently, I have only one.
 
 I will make that change and test it.
 
 THANKS for the info!
 
 
 
 Jeff 
 
 
 
 
 
 -Original Message-
 From: Chris Bowditch [mailto:bowditch_ch...@hotmail.com] 
 Sent: Thursday, January 22, 2009 4:11 AM
 To: fop-users@xmlgraphics.apache.org
 Subject: Re: Out of Memory Error from Java
 
 Steffanina, Jeff wrote:
 
 I have:
 FOP 0.95
 Linux
 Java 1.5
 My Java starts with memory set as:  -Xmx192m
 
 I can print an unlimited number of individual invoices in a single
 Java 
 process (i.e. the user chooses the option to print).  For example,
 print 
 2,000 invoices where each  invoice contains 100 lines of details.  
 Therefore, print a total of 200,000 lines of detail.
 
 However, I receive an Out of Memory error from Java when I attempt
 to 
 print a single invoice  that contains more than 4,500 lines of detail.
 
 Have you read:
 http://xmlgraphics.apache.org/fop/0.94/running.html#memory
 
 Specifically the part about breaking up the FO into multiple page
 sequences.
 
 
 Other than continuing to increase the amount of memory at startup, is 
 there something I can do to prevent the error when printing a single, 
 large invoice? 
 
 Regards,
 
 Chris
 
 
 
 -
 To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
 For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org
 
 
 -
 To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
 For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org
 
 
 

-- 
View this message in context: 
http://www.nabble.com/Out-of-Memory-Error-from-Java-tp21591957p21624377.html
Sent from the FOP - Users mailing list archive at Nabble.com.


-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org



Re: Out of Memory Error from Java

2009-01-22 Thread Chris Bowditch

Steffanina, Jeff wrote:


I have:
FOP 0.95
Linux
Java 1.5
My Java starts with memory set as:  -Xmx192m

I can print an unlimited number of individual invoices in a single Java 
process (i.e. the user chooses the option to print).  For example, print 
2,000 invoices where each  invoice contains 100 lines of details.  
Therefore, print a total of 200,000 lines of detail.


However, I receive an Out of Memory error from Java when I attempt to 
print a single invoice  that contains more than 4,500 lines of detail.


Have you read: http://xmlgraphics.apache.org/fop/0.94/running.html#memory

Specifically the part about breaking up the FO into multiple page sequences.



Other than continuing to increase the amount of memory at startup, is 
there something I can do to prevent the error when printing a single, 
large invoice? 


Regards,

Chris



-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org



RE: Out of Memory Error from Java

2009-01-22 Thread Steffanina, Jeff

I read the article that you referenced below.   It suggests using
multiple page sequences.  Currently, I have only one.

I will make that change and test it.

THANKS for the info!



Jeff 





-Original Message-
From: Chris Bowditch [mailto:bowditch_ch...@hotmail.com] 
Sent: Thursday, January 22, 2009 4:11 AM
To: fop-users@xmlgraphics.apache.org
Subject: Re: Out of Memory Error from Java

Steffanina, Jeff wrote:

 I have:
 FOP 0.95
 Linux
 Java 1.5
 My Java starts with memory set as:  -Xmx192m
 
 I can print an unlimited number of individual invoices in a single
Java 
 process (i.e. the user chooses the option to print).  For example,
print 
 2,000 invoices where each  invoice contains 100 lines of details.  
 Therefore, print a total of 200,000 lines of detail.
 
 However, I receive an Out of Memory error from Java when I attempt
to 
 print a single invoice  that contains more than 4,500 lines of detail.

Have you read:
http://xmlgraphics.apache.org/fop/0.94/running.html#memory

Specifically the part about breaking up the FO into multiple page
sequences.

 
 Other than continuing to increase the amount of memory at startup, is 
 there something I can do to prevent the error when printing a single, 
 large invoice? 

Regards,

Chris



-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org


-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org



Re: Out of Memory error

2008-12-17 Thread vsyamala

Thank you Sean and Andreas. I will look into those possibilities.

Vsyamala


Andreas Delmelle-2 wrote:
 
 On 12 Dec 2008, at 15:28, Griffin,Sean wrote:
 
 Hi Sean, Vsyamala,
 
 I was assuming you were embedding FOP into your application and  
 calling directly through Java.  Since it appears you're launching  
 from the command-line, it's up to the FOP command-line program on  
 how it streams the output.  Since you're specifying file names as  
 your I/O, it stands to reason that you'd be reading from/writing to  
 File I/O streams, so the use of a ByteArrayOutputStream in the  
 PDFStream.add() method might be a red herring and perfectly normal.
 
 Indeed. In general, a stack trace is next to useless for an OOMError.  
 It will only tell you at what point in the process the error occurred,  
 but it doesn't really say anything about the actual cause.
 If the error were to occur during a StringBuffer.append(), chances  
 would be very slim that it's actually the StringBuffer that is  
 responsible. It just means that append() needed to allocate a few  
 extra bytes of memory, but other objects/code already used up the  
 available heap space.
 
 In this case, the PDFStream.add() method is FOP-internal, and IIC, no  
 cause for concern. With a FO file of 45 MB, I'm suspecting the issue  
 is caused by the fact that the entire document consists of a single  
 fo:page-sequence, which is a very well-known limitation at the moment.  
 If that is the case, the only workaround would be to introduce more  
 structure in the document, so that you get sequences of 10-15 pages  
 max. Well, even 40-50 pages should work nicely, and with 2GB of heap,  
 I guess this could even be a lot more...
 
 
 Cheers
 
 Andreas
 
 -
 To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
 For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org
 
 
 

-- 
View this message in context: 
http://www.nabble.com/Out-of-Memory-error-tp20962360p21063152.html
Sent from the FOP - Users mailing list archive at Nabble.com.


-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org



RE: Out of Memory error

2008-12-12 Thread vsyamala

Hi Sean,

Thanks for responding. I am using FOP 0.95 version and generating .fo file
from XSL transformation using xsltproc processor:

xsltproc --nonet --novalid --output Part5.fo PDFPublish_External.xsl
PDF_Book_Part5.xml

And then using Fop 0.95 version to convert .fo file to .pdf

call Fop -q -r -c fop.xconf Part5.fo PDF_Part5.pdf

I am not sure whether I am using memory stream or file stream. How do I
check that? How should I use the FileInputStream method or write to a file
instead of disk?

Thanks,
Vsyamala



Griffin,Sean wrote:
 
 Vsyamala,
 There are a variety of factors that could contribute to your OOM error,
 including, most significantly, the size of your page sequences within your
 FO, the types of images that you're embedding, the version of FOP that
 you're using, and whether you're writing to the PDF in memory or into a
 file.  300 images @ 48KB each doesn't sound extreme, but a 45MB FO file is
 pretty big.  It's too hard to say whether a 45MB FO file with 300 images
 at 48KB each should be able to run in 2GB memory...there are just too many
 variables that aren't mentioned.  So I would recommend a
 divide-and-conquer approach.  Remove the image references from the FO and
 try again.  Still run out of memory or come close to it?  If so, it must
 not be the images.  Different versions of FOP handle images in different
 ways.  For example, in v0.20.5 (maybe 0.95 as well), JPG images were
 directly embedded into the PDF whereas PNG images were transcoded into
 bitmaps and compressed into the PDF.  As a consequence, JPG images used a
 lot less processor power but created, generally, larger PDFs.
 
 I just noticed from your stack trace that it appears you're writing to a
 ByteArrayOutputStream.  That could be a big problem right there.  You're
 wanting to store the entire 12+MB PDF in memory in addition to the memory
 needed for FOP to function?  A much better option would be to write that
 file to disk.  Likewise, are you feeding the FO into FOP from a memory
 stream, a file stream, or as the result of an XML/XSLT transformation?  Of
 all options, if you're rendering files this big, you should use the
 FileInputStream method.
 
 Sean
 
 -Original Message-
 From: vsyamala [mailto:vsyam...@gmail.com] 
 Sent: Thursday, December 11, 2008 1:10 PM
 To: fop-users@xmlgraphics.apache.org
 Subject: Out of Memory error
 
 
 Hi,
 
 I am trying to generate a PDF from a 45MB .fo file; FOP produces a PDF of
 about 12MB and then fails with out-of-memory errors. The -Xmx and -Xms
 options are set to 2048MB (that's the max I could go on the machine), and
 the .fo file refers to about 300 images of approx. 48KB each. FOP should be
 able to generate the PDF. I am not sure if the images are causing this
 issue. Does anyone know if the images are the issue? Here is the error:
 
 Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
     at java.util.Arrays.copyOf(Unknown Source)
     at java.io.ByteArrayOutputStream.write(Unknown Source)
     at java.io.OutputStream.write(Unknown Source)
     at org.apache.fop.pdf.PDFStream.add(PDFStream.java:60)
     at org.apache.fop.render.pdf.PDFRenderer.concatenateTransformationMatrix(PDFRenderer.java:839)
     at org.apache.fop.render.AbstractPathOrientedRenderer.renderReferenceArea(AbstractPathOrientedRenderer.java:539)
     at org.apache.fop.render.AbstractRenderer.renderBlock(AbstractRenderer.java:560)
     at org.apache.fop.render.pdf.PDFRenderer.renderBlock(PDFRenderer.java:1329)
     at org.apache.fop.render.AbstractRenderer.renderBlocks(AbstractRenderer.java:526)
     at org.apache.fop.render.AbstractRenderer.renderBlock(AbstractRenderer.java:573)
     at org.apache.fop.render.pdf.PDFRenderer.renderBlock(PDFRenderer.java:1329)
     at org.apache.fop.render.AbstractRenderer.renderBlocks(AbstractRenderer.java:526)
 
 Thanks,
 Vsyamala
 
  
 -- 
 View this message in context:
 http://www.nabble.com/Out-of-Memory-error-tp20962360p20962360.html
 Sent from the FOP - Users mailing list archive at Nabble.com.
 
 
 -
 To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
 For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org
 

RE: Out of Memory error

2008-12-12 Thread Griffin,Sean
Vsyamala,
I was assuming you were embedding FOP into your application and calling 
directly through Java.  Since it appears you're launching from the 
command-line, it's up to the FOP command-line program on how it streams the 
output.  Since you're specifying file names as your I/O, it stands to reason 
that you'd be reading from/writing to File I/O streams, so the use of a 
ByteArrayOutputStream in the PDFStream.add() method might be a red herring and 
perfectly normal.

If you haven't already, you'll want to read this, which may point you in a 
direction to finding the source of your problem: 
http://xmlgraphics.apache.org/fop/0.94/running.html#memory.

Sean


-Original Message-
From: vsyamala [mailto:vsyam...@gmail.com] 
Sent: Friday, December 12, 2008 8:12 AM
To: fop-users@xmlgraphics.apache.org
Subject: RE: Out of Memory error


Hi Sean,

Thanks for responding. I am using FOP 0.95 version and generating .fo file
from XSL transformation using xsltproc processor:

xsltproc --nonet --novalid --output Part5.fo PDFPublish_External.xsl
PDF_Book_Part5.xml

And then using Fop 0.95 version to convert .fo file to .pdf

call Fop -q -r -c fop.xconf Part5.fo PDF_Part5.pdf

I am not sure whether I am using memory stream or file stream. How do I
check that? How should I use the FileInputStream method or write to a file
instead of disk?

Thanks,
Vsyamala


-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org



Re: Out of Memory error

2008-12-12 Thread Andreas Delmelle

On 12 Dec 2008, at 15:28, Griffin,Sean wrote:

Hi Sean, Vsyamala,

I was assuming you were embedding FOP into your application and  
calling directly through Java.  Since it appears you're launching  
from the command-line, it's up to the FOP command-line program on  
how it streams the output.  Since you're specifying file names as  
your I/O, it stands to reason that you'd be reading from/writing to  
File I/O streams, so the use of a ByteArrayOutputStream in the  
PDFStream.add() method might be a red herring and perfectly normal.


Indeed. In general, a stack trace is next to useless for an OOMError.  
It will only tell you at what point in the process the error occurred,  
but it doesn't really say anything about the actual cause.
If the error were to occur during a StringBuffer.append(), chances  
would be very slim that it's actually the StringBuffer that is  
responsible. It just means that append() needed to allocate a few  
extra bytes of memory, but other objects/code already used up the  
available heap space.


In this case, the PDFStream.add() method is FOP-internal, and IIC, no  
cause for concern. With a FO file of 45 MB, I'm suspecting the issue  
is caused by the fact that the entire document consists of a single  
fo:page-sequence, which is a very well-known limitation at the moment.  
If that is the case, the only workaround would be to introduce more  
structure in the document, so that you get sequences of 10-15 pages  
max. Well, even 40-50 pages should work nicely, and with 2GB of heap,  
I guess this could even be a lot more.
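
Put differently, the stylesheet just needs to emit one fo:page-sequence per
logical unit of the source instead of a single sequence for the whole
document. A generic sketch only; 'book' and 'chapter' are placeholders for
whatever the real source XML actually uses:

<xsl:template match="/book">
  <fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
    <fo:layout-master-set>
      <fo:simple-page-master master-name="body"
          page-height="29.7cm" page-width="21cm">
        <fo:region-body/>
      </fo:simple-page-master>
    </fo:layout-master-set>
    <!-- one page-sequence per chapter, so the layout engine can release
         each sequence after rendering it, instead of holding the whole
         document in memory -->
    <xsl:for-each select="chapter">
      <fo:page-sequence master-reference="body">
        <fo:flow flow-name="xsl-region-body">
          <xsl:apply-templates select="."/>
        </fo:flow>
      </fo:page-sequence>
    </xsl:for-each>
  </fo:root>
</xsl:template>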



Cheers

Andreas

-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org



RE: Out of Memory error

2008-12-11 Thread Griffin,Sean
Vsyamala,
There are a variety of factors that could contribute to your OOM error, 
including, most significantly, the size of your page sequences within your FO, 
the types of images that you're embedding, the version of FOP that you're 
using, and whether you're writing to the PDF in memory or into a file.  300 
images @ 48KB each doesn't sound extreme, but a 45MB FO file is pretty big.  
It's too hard to say whether a 45MB FO file with 300 images at 48KB each should 
be able to run in 2GB memory...there are just too many variables that aren't 
mentioned.  So I would recommend a divide-and-conquer approach.  Remove the 
image references from the FO and try again.  Still run out of memory or come 
close to it?  If so, it must not be the images.  Different versions of FOP 
handle images in different ways.  For example, in v0.20.5 (maybe 0.95 as well), 
JPG images were directly embedded into the PDF whereas PNG images were 
transcoded into bitmaps and compressed into the PDF.  As a consequence, JPG 
images used a lot less processor power but created, generally, larger PDFs.

I just noticed from your stack trace that it appears you're writing to a 
ByteArrayOutputStream.  That could be a big problem right there.  You're 
wanting to store the entire 12+MB PDF in memory in addition to the memory 
needed for FOP to function?  A much better option would be to write that file 
to disk.  Likewise, are you feeding the FO into FOP from a memory stream, a 
file stream, or as the result of an XML/XSLT transformation?  Of all options, 
if you're rendering files this big, you should use the FileInputStream method.

Sean

-Original Message-
From: vsyamala [mailto:vsyam...@gmail.com] 
Sent: Thursday, December 11, 2008 1:10 PM
To: fop-users@xmlgraphics.apache.org
Subject: Out of Memory error


Hi,

I am trying to generate a PDF from a 45MB .fo file; FOP produces a PDF of about
12MB and then fails with out-of-memory errors. The -Xmx and -Xms options are
set to 2048MB (that's the max I could go on the machine), and the .fo file
refers to about 300 images of approx. 48KB each. FOP should be able to
generate the PDF. I am not sure if the images are causing this issue. Does
anyone know if the images are the issue? Here is the error:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Unknown Source)
    at java.io.ByteArrayOutputStream.write(Unknown Source)
    at java.io.OutputStream.write(Unknown Source)
    at org.apache.fop.pdf.PDFStream.add(PDFStream.java:60)
    at org.apache.fop.render.pdf.PDFRenderer.concatenateTransformationMatrix(PDFRenderer.java:839)
    at org.apache.fop.render.AbstractPathOrientedRenderer.renderReferenceArea(AbstractPathOrientedRenderer.java:539)
    at org.apache.fop.render.AbstractRenderer.renderBlock(AbstractRenderer.java:560)
    at org.apache.fop.render.pdf.PDFRenderer.renderBlock(PDFRenderer.java:1329)
    at org.apache.fop.render.AbstractRenderer.renderBlocks(AbstractRenderer.java:526)
    at org.apache.fop.render.AbstractRenderer.renderBlock(AbstractRenderer.java:573)
    at org.apache.fop.render.pdf.PDFRenderer.renderBlock(PDFRenderer.java:1329)
    at org.apache.fop.render.AbstractRenderer.renderBlocks(AbstractRenderer.java:526)

Thanks,
Vsyamala

 
-- 
View this message in context: 
http://www.nabble.com/Out-of-Memory-error-tp20962360p20962360.html
Sent from the FOP - Users mailing list archive at Nabble.com.


-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org


-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org



Re: Out Of Memory Error

2007-02-06 Thread Roland Neilands

Check the FAQ:
http://xmlgraphics.apache.org/fop/0.93/running.html#memory

Regards,
Roland 



[EMAIL PROTECTED] wrote:


Hello All,

I am trying to convert a huge XML document (approx. 110 pages long, 777KB)
into a PDF document using an FO transformation. But it gives me the
following error.


Exception in thread "main" java.lang.OutOfMemoryError

Can anybody guide me as to what is going wrong? Is there a limit on
the size of the document that can be transformed? If so, can it be
increased?


Thanks,
Sindhu





-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: Out Of Memory Error

2007-02-06 Thread Jay Bryant
I routinely make much larger files (600+ pages with lots of images, so 6+ MB
filesize). Try the items mentioned in the following link:

http://xmlgraphics.apache.org/fop/0.93/running.html#memory

I have also used FOP to make PDF files of more than 2,000 pages, but those 
didn't have images (they were data catalogs for a data warehouse).

Jay Bryant
Bryant Communication Services
  - Original Message - 
  From: [EMAIL PROTECTED] 
  To: fop-users@xmlgraphics.apache.org 
  Sent: Tuesday, February 06, 2007 4:54 PM
  Subject: Out Of Memory Error


  Hello All, 

  I am trying to convert a huge XML document (approx. 110 pages long, 777KB)
into a PDF document using an FO transformation. But it gives me the
following error.

  Exception in thread "main" java.lang.OutOfMemoryError

  Can anybody guide me as to what is going wrong? Is there a limit on the size
of the document that can be transformed? If so, can it be increased?

  Thanks, 
  Sindhu 






Re: out of memory error

2006-08-31 Thread Jeremias Maerki

On 30.08.2006 17:30:09 Naveen Bandhu wrote:
 
 Currently we are using FOP version 0.20.5 and get an Out of Memory error. I
 have seen the fix in trunk, but we can't use the trunk because of the changes
 we would have to make to our XSL files to make them compatible with the trunk
 version.

That's a strange excuse for not upgrading. The newer FOP releases are
much more conformant to the XSL spec, whereas 0.20.5 did many things
wrong. Usually, a stylesheet is updated for the latest release in a
couple of hours.

 As per the fix in trunk, PageSequence.java has a new method, public void
 releasePageSequence(). I would like to make the same changes to FOP 0.20.5;
 where should I start, and which classes need to be modified? I am trying to
 work it out myself by debugging, but it seems time-consuming and I'm worried
 about side effects.

Frankly, that's the wrong approach. You'd invest a lot of time in a
software that's no longer maintained. The latest releases are coming
from a new line of development where lots of things have changed. The
whole layout engine is new. It might not at all be possible to port the
releasePageSequence() changes to FOP 0.20.5.

Furthermore, you don't tell us anything about the document you're having
problems with (type of document, size, number of images, number of
page-sequences...). We don't know if you've tried all the tips and
work-arounds we published. I would at least give FOP 0.92beta (or FOP
Trunk) a try.

 Thanks, Naveen 
 
  
 
  Leena,
  you don't say what FOP version you're using, but would you please get the
  latest code (FOP Trunk) from SVN and recheck if you get any improvements?
  I've just committed a fix for a memory-leak which might improve your
  situation. Please report back on the results.

  On 03.08.2006 09:06:34 Madala, Leena (GE Healthcare, consultant) wrote:
   Hello team, I'm facing a similar issue. I'm getting an out of memory error
   when trying to generate a report. I have checked the FAQ and tried all the
   options provided there... I tried increasing Java virtual memory to 1024M
   and also removed the page citation in the report generated. I also tried
   with the new beta version of fop.jar. It didn't help. The report which is
   to be generated will generate approximately 70 pages for 40 records in the
   database. It works fine for 40 records. We have to generate a similar
   report for 700 records. The report doesn't contain any images. It has a
   table for every 2 pages. Please help me resolve the issue.
   Thanks for your time. Thanks & Regards, Leena Madala

  Jeremias Maerki



Jeremias Maerki


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: out of memory error

2006-08-10 Thread Louis . Masters

Most of our reports/docs use a single table in a single page sequence. My
Windows boxes run with a max JVM of ~1.5G and are generally OK. Our reports
range from one or two pages to several hundred.

Under JDK 1.5 and/or Tomcat, I have seen several posts about memory issues
with classloading and the perm gen space, so if that sounds like your
environment, you may want to dejanews it.

-Lou
~~
LOG-NET, Inc.
The Logistics Network Management System
~~
230 Half Mile Road
Third Floor
Red Bank, NJ 07701
PH: 732-758-6800
FAX: 732-747-7497
http://www.LOG-NET.com
~~





Andreas L Delmelle [EMAIL PROTECTED] wrote on 08/09/2006 01:07 PM
To: fop-users@xmlgraphics.apache.org
Subject: Re: out of memory error

On Aug 9, 2006, at 14:34, Luis Ferro wrote:

Hi,

 I'm using the latest fop version from svn as of yesterday...

 In the attachment I send the files I use to create a PDF with the index and
 one chapter of the book. The index references are all turned off but it
 still gets out of heap...

 When trying to assemble the PDF in command line (environment in the
 attachment also)... it always gives the same error of lack of heap memory...

Please check whether your FOs contain multiple fo:page-sequences. 
Yes, a memory leak was fixed, but no, it won't matter if you cram a 
large number of tables (or one large table) into one single page- 
sequence...


Cheers,

Andreas

-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]




Re: out of memory error

2006-08-09 Thread Luis Ferro

I'm using the latest fop version from svn as of yesterday...

In the attachment I send the files I use to create a PDF with the index and
one chapter of the book. The index references are all turned off but it
still gets out of heap...

When trying to assemble the PDF in command line (environment in the
attachment also)... it always gives the same error of lack of heap memory...

I've tried with -XX Aggressive (spelling) as someone suggested and with
several options to increase the heap... but after processing the full .fo
file (when it stops to echo errors and I suppose it starts crunching)...

(btw the problem also happens to fop 0.20.5)

The machine has 2gb of ram and is a pentium 4 HT...

I'm completely lost on how to make this work... :(

Thanx in advance,
Luis Ferro
Portugal
http://www.nabble.com/user-files/232/memory_problem_thrunk_svn_20060807.zip
memory_problem_thrunk_svn_20060807.zip 
-- 
View this message in context: 
http://www.nabble.com/out-of-memory-error-tf2044079.html#a5724811
Sent from the FOP - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: out of memory error

2006-08-09 Thread Luis Ferro

A report as such shouldn't pose a problem... 

Edit the fop.bat and add the -Xmx1024m argument before the -cp, as I
did... it worked like a charm with 0.92b/trunk...

To build the report, try using the fop.bat at a command prompt... (it worked
for me there, while through Tomcat it was giving problems, but that is
another issue)

Cheers...
LF
-- 
View this message in context: 
http://www.nabble.com/out-of-memory-error-tf2044079.html#a5728922
Sent from the FOP - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: out of memory error

2006-08-09 Thread Luis Ferro

It has LOADS of page sequences... it's a book with 2500 pages where every 3 or
4 pages the template changes from a one-column page to a two-column page...

Is there a better way of doing this swap of columns?

Right now... with 477 pages, rendering it occupies 1.1GB of memory... (my
machine has a top of 1.5GB addressable to Java...)...

Will try now to simplify everything as best as I can...

How can I test if there are memory leaks somewhere (I'm a programmer but I'm
very, very green in Java)?

Cheers...
LF
-- 
View this message in context: 
http://www.nabble.com/out-of-memory-error-tf2044079.html#a5730306
Sent from the FOP - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: out of memory error

2006-08-09 Thread Andreas L Delmelle

On Aug 9, 2006, at 19:27, Luis Ferro wrote:

Hi,

It has LOADS of page sequences... it's a book with 2500 pages where every 3 or
4 pages the template changes from a one-column page to a two-column page...

Is there a better way of doing this swap of columns?


Not that I'm aware of... (you're using conditional-page-masters, right?)
Can you do us a favor and try if using just one simple-page-master  
makes a difference? Does it also consume a large amount of memory?
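
(For clarity, by conditional-page-masters I mean a layout-master-set of the
general shape below; the master names, page dimensions and conditions are
purely illustrative:)

<fo:layout-master-set>
  <fo:simple-page-master master-name="one-col"
      page-height="29.7cm" page-width="21cm">
    <fo:region-body column-count="1"/>
  </fo:simple-page-master>
  <fo:simple-page-master master-name="two-col"
      page-height="29.7cm" page-width="21cm">
    <fo:region-body column-count="2"/>
  </fo:simple-page-master>
  <fo:page-sequence-master master-name="chapter-pages">
    <fo:repeatable-page-master-alternatives>
      <fo:conditional-page-master-reference master-reference="one-col"
          page-position="first"/>
      <fo:conditional-page-master-reference master-reference="two-col"
          page-position="rest"/>
    </fo:repeatable-page-master-alternatives>
  </fo:page-sequence-master>
</fo:layout-master-set>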


Tinker a bit with the properties on each of the page-masters, see if  
that changes anything... Never mind the output, it's simply to narrow  
down the searching area. If the test succeeds with one simple-page- 
master, or with different settings we'll get a better idea of where  
to start looking.


Thanks in advance!



Right now... with 477 pages, to render it ocupies 1.1Gb memory... (my
machine has a top of 1.5Gb adressable to java...)...

Will try now to simplefy everything as best as i can...


Jeremias recently added a MemoryEater to the trunk with which we can  
test, using one of your FO fragments and copying it a given number of  
times.
Choose one representative page-sequence, and the structure of your  
conditional-page-master-alternatives, post them in a Bugzilla --so  
only those people that are interested need to download it-- and we'll  
have a look.


How can i test if there is memory leaks somewhere (i'm a programmer  
but i'm

very very green in java)?


There is no easy way, I'm afraid. You can use a profiling tool on  
one of the sessions --the JDK comes with some profiling facilities,  
if you're a console-geek ;)-- to have a look at what the reference  
trees in the heap look like at a certain point in the process, but  
you'd still need some basic understanding of the process to figure  
out which active references are totally unnecessary.


If you're willing to invest time in this, of course you'd be welcome  
to do so. If you have any questions or remarks, or need help  
interpreting the results of a profiling session, just direct them to  
fop-dev, or use Bugzilla to track the issue.


Thanks again!


Cheers,

Andreas


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: out of memory error

2006-08-09 Thread Jeremias Maerki
Not sure that playing with the page-masters will really help. I don't
think they have an influence here.

I've just committed a helper XSLT [1] which can split Luis' FO file at
page-sequence boundaries. It reveals that one of the page-sequences
alone makes up 18MB out of the 23MB of the full FO file. I guess that is
what FOP chokes on: Just too much data in one page-sequence and FOP
currently cannot free any objects while inside a page-sequence.

[1] http://svn.apache.org/viewvc?rev=430134&view=rev

I'd try to find a way to further split up that large page-sequence. This
should enable FOP to free memory and handle this file with less heap
space.

On 09.08.2006 20:13:43 Andreas L Delmelle wrote:
 On Aug 9, 2006, at 19:27, Luis Ferro wrote:
 
 Hi,
 
  It has LOADS of page sequences... it's a book with 2500 pages where  
  from 3 or
  4 pages, the template changes from one column page to 2 column  
  page...
 
  Is there a better way of doing this swap of columns?
 
 Not that I'm aware of... (you're using conditional-page-masters, right?)
 Can you do us a favor and try if using just one simple-page-master  
 makes a difference? Does it also consume a large amount of memory?
 
 Tinker a bit with the properties on each of the page-masters, see if  
 that changes anything... Never mind the output, it's simply to narrow  
 down the searching area. If the test succeeds with one simple-page- 
 master, or with different settings we'll get a better idea of where  
 to start looking.
 
 Thanks in advance!
 
 
  Right now... with 477 pages, to render it ocupies 1.1Gb memory... (my
  machine has a top of 1.5Gb adressable to java...)...
 
  Will try now to simplefy everything as best as i can...
 
 Jeremias recently added a MemoryEater to the trunk with which we can  
 test, using one of your FO fragments and copying it a given number of  
 times.
 Choose one representative page-sequence, and the structure of your  
 conditional-page-master-alternatives, post them in a Bugzilla --so  
 only those people that are interested need to download it-- and we'll  
 have a look.
 
  How can i test if there is memory leaks somewhere (i'm a programmer  
  but i'm
  very very green in java)?
 
 There is no easy way, I'm afraid. You can use a profiling tool one  
 one of the sessions --the JDK comes with some profiling facilities,  
 if you're a console-geek ;)-- to have a look at what the reference  
 trees in the heap look like at a certain point in the process, but  
 you'd still need some basic understanding of the process to figure  
 out which active references are totally unnecessary.
 
 If you're willing to invest time in this, of course you'd be welcome  
 to do so. If you have any questions or remarks, or need help  
 interpreting the results of a profiling session, just direct them to  
 fop-dev, or use Bugzilla to track the issue.
 
 Thanks again!
 
 
 Cheers,
 
 Andreas


Jeremias Maerki


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: out of memory error

2006-08-09 Thread Luis Ferro

That sequence should be the chapter A ;)

Chapter C has about the double of A's size...

LF

(I will update the file with a new one with fewer... warnings/errors as soon
as I can - it started as a 0.20.5 system, and still needs some tweaking)
-- 
View this message in context: 
http://www.nabble.com/out-of-memory-error-tf2044079.html#a5738048
Sent from the FOP - Users forum at Nabble.com.


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: out of memory error

2006-08-03 Thread Jeremias Maerki
Leena,

you don't say what FOP version you're using, but would you please get
the latest code (FOP Trunk) from SVN and recheck if you get any
improvements? I've just committed a fix for a memory-leak which might
improve your situation. Please report back on the results.

On 03.08.2006 09:06:34 Madala, Leena (GE Healthcare, consultant) wrote:
 Hello team,
  
 I'm facing a similar issue. I'm getting an out of memory error when trying
 to generate a report.
 I have checked FAQ and tried all the options provided there...
  
 I tried increasing Java virtual memory to 1024M and also removed page
 citation in the report generated. I also tried with new beta version of
 fop.jar.
 It didn't help.
  
 The report which is to be generated will generate approximately 70 pages for
 40 records in the database. It works fine for 40 records.
 We have to generate a similar report for 700 records. The report doesn't
 contain any images. It has a table for every 2 pages.
 Please help me resolve the issue.
  
 Thanks for your time
  
 Thanks & Regards,
 Leena Madala


Jeremias Maerki


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: out of memory error

2006-07-26 Thread Jeremias Maerki
20 to 25 pages does not really count as extra large. Does your
document contain a few images? If yes and assuming you use FOP 0.92beta
or later, you might want to try the latest code from FOP Trunk (from
Subversion) where I've fixed two memory leaks which dramatically
improves memory usage of FOP when using bitmap images.

I've never used the AggressiveHeap option before. I can't even tell from
the descriptions I found what it really does. If it helps, cool, but
sometimes these measures just cover an underlying problem that should be
fixed at some point.


On 25.07.2006 23:58:51 Rick Roen wrote:
 Update.
 
  
 
 I was googling in the wrong place.  When I looked in Java I found the switch
 -XX:+AggressiveHeap which fixed the problem.
 
  
 
 Is this the best switch to use, or is there something better?
 
  
 
 Rick
 
  
 
 
 From: Rick Roen [mailto:[EMAIL PROTECTED] 
 Sent: Tuesday, July 25, 2006 4:12 PM
 To: fop-users@xmlgraphics.apache.org
 Subject: out of memory error
 
  
 
 I'm running FOP from the current build of about a month ago on XP Pro SP2
 1.5GB RAM
 
  
 
 I have a command line routine that runs sales documents (packing list,
 invoice etc.) from xml -> xslt (using Saxon 8) -> pdf.
 
  
 
 This works with no problem except when I have an extra large document (maybe
 20 or 25 pages, I'm not really sure since it does not work) when I get an
 error: 
 
  
 
 Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
 
  
 
 I am using page-number-citation at the end of the document to print "page
 x of y" at the bottom of each page.  I seem to recall that this might force
 the entire document to be created in memory.  Does anyone know if this could
 be the problem, or is there some other way to reduce the memory consumption?
 
  
 
 Thanks,
 
  
 
 Rick
 



Jeremias Maerki


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: out of memory error

2006-07-26 Thread Pascal Sancho
 -Original Message-
 From: Rick Roen [mailto:[EMAIL PROTECTED] 
 
 Update...
 
 I was googling in the wrong place.  When I looked in Java I 
 found the switch -XX:+AggessiveHeap which fixed the problem.
 
 Is this the best switch to use, or is there something better?

Hi Rick,
You probably omitted the 'm' suffix from the '-Xmx' value (e.g. -Xmx1024m
rather than -Xmx1024).
-Xmx is the usual way to increase the maximum memory for the JVM.
You can use -Xms in conjunction with it to define the initial memory.
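
For example, an invocation along these lines (heap sizes and file names are
only illustrative, <fop-classpath> stands for whatever jars your FOP install
needs, and org.apache.fop.cli.Main is the 0.9x command-line class):

  java -Xms256m -Xmx1024m -cp <fop-classpath> org.apache.fop.cli.Main -fo report.fo -pdf report.pdf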

See [1] the whole java options.

Pascal

[1]
http://java.sun.com/j2se/1.4.2/docs/tooldocs/windows/java.html#options

-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: out of memory error

2006-07-26 Thread Bert Strobbe
We use the -Xmx1024m option, which allows us to generate a PDF of 1000
pages, with a table of contents and bookmarks, "page n of numpages"
numbering, and a lot of other internal links. It takes about 20 minutes on a
Linux server to generate this PDF with FOP 0.92 (we generate an internal xml
file, which is converted with a self-written xsl to fo, so this conversion
is included in the time). Without the option, we get the same memory
problem. With FOP 0.20.5, we never succeeded in generating this kind of
PDF, with or without the -Xmx option.

Bert


-Oorspronkelijk bericht-
Van: Pascal Sancho [mailto:[EMAIL PROTECTED] 
Verzonden: woensdag 26 juli 2006 9:03
Aan: fop-users@xmlgraphics.apache.org; [EMAIL PROTECTED]
Onderwerp: RE: out of memory error

 -Original Message-
 From: Rick Roen [mailto:[EMAIL PROTECTED] 
 
 Update...
 
 I was googling in the wrong place.  When I looked in Java I 
 found the switch -XX:+AggessiveHeap which fixed the problem.
 
 Is this the best switch to use, or is there something better?

Hi Rick,
You probably omitted the 'm' from '-xmx' option.
-xmx is the usual way to increase max memory for JVM.
You can use -xms in conjunction to define initial memory.

See [1] the whole java options.

Pascal

[1]
http://java.sun.com/j2se/1.4.2/docs/tooldocs/windows/java.html#options

-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: out of memory error

2006-07-26 Thread Rick Roen
Thanks everyone for your responses.

The size does not appear large to me either, 25+ pages, but nevertheless it
does run out of memory. I have only three images (small) and several tables
with embedded tables, plus the page-number-citation for page x of y.

FYI in case it helps someone else, the -XX:+AggressiveHeap seems to do what
the combination of -Xmx and -Xms do, however automatically.  The only
documentation I could find at Sun suggested it was only for multiple
processor machines with lots of RAM, however it seems to work for me with
only one processor.

The advantage is that it allocates the Heap dynamically according to the
available RAM.  In my case, the client PC's have from 320MB to 1+GB of RAM
and they are all running locally from a batch file on the server.  Although
I could set the -Xmx and -Xms to the lowest common denominator, the -XX
seems to take care of this on whatever machine it is running on. 

Jeremias - I am running from trunk code from about a month ago.  Have you
done some memory adjustments since then?  I remember I used to have extra
large resulting PDF files when I had an image that did not have a color
table (or something ?), but the trunk code brought a 225KB file back down to
25KB.

Rick

-Original Message-
From: Jeremias Maerki [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, July 26, 2006 2:00 AM
To: fop-users@xmlgraphics.apache.org
Subject: Re: out of memory error

20 to 25 pages does not really count as extra large. Does your
document contain a few images? If yes and assuming you use FOP 0.92beta
or later, you might want to try the latest code from FOP Trunk (from
Subversion) where I've fixed two memory leaks which dramatically
improves memory usage of FOP when using bitmap images.

I've never used the AggressiveHeap option before. I can't even tell from
the descriptions I found what it really does. If it helps, cool, but
sometimes these measures just cover an underlying problem that should be
fixed at some point.



-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: out of memory error

2006-07-26 Thread Chris Bowditch

Rick Roen wrote:


Jeremias - I am running from trunk code from about a month ago.  Have you
done some memory adjustments since then?  I remember I used to have extra
large resulting PDF files when I had an image that did not have a color
table (or something ?), but the trunk code brought a 225KB file back down to
25KB.


Jeremias has made some memory optimisations to the way images are 
handled within FOP in the last week.


Chris



-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: out of memory error

2006-07-26 Thread Jeremias Maerki
Change was last week.

On 26.07.2006 14:38:39 Rick Roen wrote:
 Jeremias - I am running from trunk code from about a month ago.  Have you
 done some memory adjustments since then?

Jeremias Maerki


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: out of memory error

2006-07-25 Thread Rick Roen

Update...

I was googling in the wrong place. When I looked in Java I found the switch
-XX:+AggressiveHeap which fixed the problem.

Is this the best switch to use, or is there something better?

Rick


From: Rick Roen [mailto:[EMAIL PROTECTED]
Sent: Tuesday, July 25, 2006 4:12 PM
To: fop-users@xmlgraphics.apache.org
Subject: out of memory error

I'm running FOP from the current build of about a month ago on XP Pro SP2,
1.5GB RAM.

I have a command line routine that runs sales documents (packing list,
invoice etc.) from xml -> xslt (using Saxon 8) -> pdf.

This works with no problem except when I have an extra large document (maybe
20 or 25 pages, I'm not really sure since it does not work) when I get an
error:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

I am using page-number-citation at the end of the document to print "page
x of y" at the bottom of each page. I seem to recall that this might force
the entire document to be created in memory. Does anyone know if this could
be the problem, or is there some other way to reduce the memory consumption?

Thanks,

Rick
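
(For reference, the construct described above is usually written as below,
citing an id placed on the very last block of the flow; the id name here is
just a placeholder. Because the citation is a forward reference that can
only be resolved on the final page, it tends to keep page data in memory
until the end of the document:)

<fo:static-content flow-name="xsl-region-after">
  <fo:block text-align="center">
    Page <fo:page-number/> of <fo:page-number-citation ref-id="last-page"/>
  </fo:block>
</fo:static-content>
<fo:flow flow-name="xsl-region-body">
  <!-- ... document content ... -->
  <fo:block id="last-page"/>
</fo:flow>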