SplitRecord with a CSVReader, with the "Records Per Split" property set, may be what you're looking for.
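As a rough sketch of that setup (the controller-service names here are illustrative, not required), the relevant properties would look something like:

```
# SplitRecord processor properties (sketch)
Record Reader      = CSVReader            # controller service
Record Writer      = CSVRecordSetWriter   # controller service
Records Per Split  = 1000000              # emit one FlowFile per 1M records

# CSVReader controller service (sketch)
Treat First Line as Header = true         # header row defines field names
```

Each output FlowFile then contains up to 1M records, which downstream processors can handle sequentially.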
Depending on your use case, you may be able to leave the file whole and use Record processors throughout your flow - they process individual records from a file in a streaming manner, i.e. without reading the entire file into memory. For CSV, 1 record = 1 row of text, and the header can be used to define the schema.

Cheers,

Chris Sampson

On Mon, 21 Dec 2020, 02:35 naga satish, <[email protected]> wrote:

> Is there a way that I can extract records in batches from a CSV file
> sequentially in NiFi? To be more specific, let's say my file has 1
> billion rows. I want to extract the first 1M rows, then the next 1M
> rows, and so on.
