We had to greatly enlarge the amount of RAM (up to 48GB) in order to handle 
large files in our dataflows (> 10MB). Our prod boxes have 128GB RAM. Our 
flows tend to surge wildly in velocity and size; we can get 28GB of files 
across the five-node cluster at times. We've locked up the cluster at 18GB 
RAM under heavy demand, and the only way to clear it was to restart NiFi with 
more RAM. We have a zero-loss message tolerance, so wiping out the content or 
flowfile repos was not an option. 100MB files times thousands of files adds 
up fast, especially if content is added as attributes.
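(For readers hitting the same wall: the heap sizes discussed above live in 
NiFi's conf/bootstrap.conf. A minimal sketch of the change being described, 
assuming the stock java.arg.2/java.arg.3 slots that ship with NiFi; the 48g 
value mirrors the upgrade mentioned and is not a recommendation:)

```properties
# conf/bootstrap.conf -- illustrative values only
# Stock NiFi ships these slots at 512m; set min and max equal to avoid
# resize pauses under surging load.
java.arg.2=-Xms48g
java.arg.3=-Xmx48g
```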

Mike Woodcock

From: Joe Witt [mailto:[email protected]]
Sent: Wednesday, July 22, 2020 7:40 AM
To: [email protected]
Subject: Re: Urgent: HDFS processors throwing OOM - Compressed class space 
exception

The files aren't read into memory anyway.

The class space filling up is what to focus on.
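(For context: the "compressed class space" region is separate from the heap, 
so raising -Xmx alone won't help. A sketch of the relevant JVM flags, again 
set via conf/bootstrap.conf; the 2g values and the java.arg slot numbers are 
illustrative, not prescriptive -- pick unused slots in your own file:)

```properties
# conf/bootstrap.conf -- illustrative values only
# Class metadata lives in metaspace; compressed class pointers get their
# own sub-region capped by CompressedClassSpaceSize (default 1g).
java.arg.17=-XX:MaxMetaspaceSize=2g
java.arg.18=-XX:CompressedClassSpaceSize=2g
```

A steadily filling class space usually points at repeated classloading (e.g. 
per-bundle classloaders) rather than file size, which is worth confirming 
with jcmd VM.metaspace before simply raising the limit.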

On Wed, Jul 22, 2020 at 2:34 AM Mohit Jain 
<[email protected]<mailto:[email protected]>> wrote:
Those are small files, not more than 100 MB each.
NiFi is configured with 16g.

Thanks

________________________________
From: Jorge Machado <[email protected]<mailto:[email protected]>>
Sent: Wednesday, July 22, 2020 2:55:26 PM

To: [email protected]<mailto:[email protected]> 
<[email protected]<mailto:[email protected]>>
Subject: Re: Urgent: HDFS processors throwing OOM - Compressed class space 
exception

How big are the files that you are trying to store? How much memory did you 
configure NiFi with?

> On 22. Jul 2020, at 06:13, Mohit Jain 
> <[email protected]<mailto:[email protected]>> wrote:
>
> Hi team,
>
> I’ve been facing the issue while using any HDFS processor, e.g. - PutHDFS 
> throws the error -
> Failed to write to HDFS due to compressed class space: 
> java.lang.OutOfMemoryError
>
> Eventually the node gets disconnected.
>
> Any help would be appreciated.
>
> Regards,
> Mohit
> <Image.jpeg>
