Thanks, Adam. I will try 0.7.1 and update the community on the outcome. If
it works, I can create a patch for 1.x.
Thanks
Rai

On Thu, Oct 27, 2016 at 7:41 PM, Adam Lamar <[email protected]> wrote:

> Hey All,
>
> I believe OP is running into a bug fixed here:
> https://issues.apache.org/jira/browse/NIFI-2631
>
> Basically, ListS3 attempts to commit all the files it finds
> (potentially 100k+) at once, rather than in batches. NIFI-2631
> addresses the issue. Looks like the fix is out in 0.7.1 but not yet in
> a 1.x release.
>
> Cheers,
> Adam
>
>
> On Thu, Oct 27, 2016 at 7:59 PM, Joe Witt <[email protected]> wrote:
> > Looking at this line [1] makes me think the FetchS3 processor is
> > properly streaming the bytes directly to the content repository.
> >
> > Looking at the screenshot showing nothing out of the ListS3 processor
> > makes me think the bucket has so many objects in it that the processor
> > or associated library isn't handling it well and is just listing
> > everything with no cap on buffer size.  Krish, please try with the
> > largest heap you can and let us know what you see.
> >
> > [1] https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/FetchS3Object.java#L107
> >
> > On Thu, Oct 27, 2016 at 9:37 PM, Joe Witt <[email protected]> wrote:
> >> moving dev to bcc
> >>
> >> Yes, I believe the issue here is that FetchS3 doesn't do chunked
> >> transfers and so is loading the entire object into memory.  I've not
> >> verified this in the code yet, but it seems quite likely.  Krish, if
> >> you can verify that going with a larger heap gets you in the game,
> >> can you please file a JIRA?
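
The chunked-transfer behavior under discussion can be illustrated with a small, self-contained sketch (not the actual FetchS3Object code): copying through a fixed-size buffer keeps heap usage bounded by the buffer, whereas reading the whole object into a byte array needs heap proportional to the object size.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Illustrative only: a streaming copy with a fixed 8 KB buffer, so memory
// use stays constant no matter how large the source object is.
public class ChunkedCopy {
    public static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192]; // bounded memory, unlike reading all bytes at once
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = new byte[100_000]; // pretend this is an S3 object
        long copied = copy(new ByteArrayInputStream(payload), new ByteArrayOutputStream());
        System.out.println(copied); // prints 100000
    }
}
```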
> >>
> >> Thanks
> >> Joe
> >>
> >> On Thu, Oct 27, 2016 at 9:34 PM, Bryan Bende <[email protected]> wrote:
> >>> Hello,
> >>>
> >>> Are you running with all of the default settings?
> >>>
> >>> If so, you would probably want to try increasing the memory settings
> >>> in conf/bootstrap.conf.
> >>>
> >>> They default to 512 MB; you may want to try bumping them up to 1024 MB.
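
For reference, the heap settings Bryan mentions live in conf/bootstrap.conf as `java.arg` entries; a bumped configuration would look roughly like this (the argument indices may differ between NiFi versions, so match them against your own file):

```
# conf/bootstrap.conf -- JVM heap settings (indices may vary by version)
java.arg.2=-Xms1024m
java.arg.3=-Xmx1024m
```

A restart of NiFi is needed for bootstrap.conf changes to take effect.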
> >>>
> >>> -Bryan
> >>>
> >>> On Thu, Oct 27, 2016 at 5:46 PM, Gop Krr <[email protected]> wrote:
> >>>>
> >>>> Hi All,
> >>>>
> >>>> I have a very simple data flow, where I need to move S3 data from one
> >>>> bucket in one account to another bucket under another account. I have
> >>>> attached my processor configuration.
> >>>>
> >>>>
> >>>> 2016-10-27 20:09:57,626 ERROR [Flow Service Tasks Thread-2]
> >>>> org.apache.nifi.NiFi An Unknown Error Occurred in Thread Thread[Flow
> Service
> >>>> Tasks Thread-2,5,main]: java.lang.OutOfMemoryError: Java heap space
> >>>>
> >>>> I am very new to NiFi and am trying to get a few use cases going. I
> >>>> need help from the community.
> >>>>
> >>>> Thanks again
> >>>>
> >>>> Rai
> >>>>
> >>>>
> >>>>
> >>>
>
