When I’ve had to do this I skip ListFile entirely and instead create a text 
file containing a list of all the file paths; that list can then be fed 
through the SplitText and FetchFile processors to pull the files in in 
batches. Even with filtering, ListFile will still iterate over a huge number 
of files.
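As a rough sketch of that workaround, the listing could be generated outside NiFi and pre-split into fixed-size batch files for the flow to pick up. The paths (`/data/archive`), the `*.csv` pattern, and the batch size of 10,000 are placeholder assumptions, not values from the thread:

```shell
mkdir -p /tmp/batches

# Build the full listing once, outside NiFi, so ListFile never has to
# walk the huge directory. (Hypothetical source dir and pattern.)
find /data/archive -type f -name '*.csv' > /tmp/all-files.txt

# Split the listing into fixed-size batch files; the flow can then
# ingest them (e.g. GetFile -> SplitText -> FetchFile) one batch at a time.
split -l 10000 /tmp/all-files.txt /tmp/batches/batch-
```

Each resulting `batch-*` file holds at most 10,000 paths, so the flow starts fetching files as soon as the first batch arrives rather than waiting for a single multi-million-line listing.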

Thanks

From: Edward Armes <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Monday, March 9, 2020 at 4:43 AM
To: "[email protected]" <[email protected]>
Subject: Re: Listing a folder with millions of files

Hi Jeremy,

I don't think there's an easy answer here.

You may have some luck adjusting the processor's maximum run duration, but 
without checking the processor's implementation I can't say for certain 
whether that would have any effect.

Edward
On Mon, 9 Mar 2020, 06:34 Jeremy Pemberton-Pigott, 
<[email protected]> wrote:
Hi,

I need to list a subset (a few hundred thousand) of the files in a folder 
containing millions of files, to do some historical processing.  What's the 
best way to do that?  ListFile is taking far too long, and when I test it on 
a smaller folder it seems to dump the entire listing into the flow at once.  
It would be good if the listing emitted files in smaller chunks so that the 
flow could start working on them sooner.

Regards,

Jeremy
