JackHintonSmartDCSIT commented on PR #8691:
URL: https://github.com/apache/nifi/pull/8691#issuecomment-2135328279

   > To the question about manipulating content as it is streamed in, 
`SplitText` and `SplitContent` are capable of reading large files and breaking 
them up into smaller FlowFiles.
   
   Ah, fair enough. So if I were to chain FetchFile and SplitContent to split a 
file into 100MB chunks, I should be able to start reading a 200TB file from 
disk and get 100MB chunks back before the initial read of the 200TB file has 
fully completed?
   
   This is somewhat unrelated to this processor, by the way; we just have to 
process some very large files and are wondering how best to go about it. 
Thanks for the replies!
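   To illustrate the idea being asked about (not NiFi internals, just a hedged sketch of the general technique): a streaming splitter reads a bounded amount at a time and hands each chunk downstream immediately, so the first chunk is available long before the source is exhausted. The `split_stream` helper and the demo sizes below are hypothetical; a real flow would use the processors' own configuration rather than code like this.

   ```python
   import io

   def split_stream(stream, chunk_size):
       """Yield fixed-size chunks as they are read from the stream.

       Because each chunk is yielded as soon as it is read, the caller
       receives the first chunk without waiting for end-of-stream, which
       is what makes this viable for files far larger than memory.
       """
       while True:
           chunk = stream.read(chunk_size)
           if not chunk:  # empty read means end of stream
               break
           yield chunk

   # Demo with a small in-memory "file" standing in for a huge one on disk;
   # 25 bytes split into 10-byte chunks yields sizes 10, 10, 5.
   data = io.BytesIO(b"x" * 25)
   chunk_sizes = [len(c) for c in split_stream(data, chunk_size=10)]
   print(chunk_sizes)
   ```

   The same pattern applied with a 100MB chunk size over a file handle means memory usage stays bounded by the chunk size, regardless of total file size.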


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
