Hi,

I am trying to process a 10GB fixed-width file by performing the steps below:

1. Split the file into individual records on newlines:

<route>
  <from uri="file:src/data?noop=true" />
  <log message="Started Processing" loggingLevel="INFO" />
  <split streaming="true">
    <tokenize token="\n" />
    <to uri="seda:WriteToFile2" />
  </split>
</route>

2. Filter and transform the data, producing 3 different pipe-delimited (|) CSV files.
3. Add a header row to each of the 3 files.
4. Compress the end result into a GZip file.
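For reference, the splitter already marks the final record for you: Camel sets the exchange property CamelSplitComplete to true on the last split exchange. A sketch (untested, and the header name lastRecord is my own choice) that copies the flag into a header so routes downstream of the seda endpoint can see it:

<route>
  <from uri="file:src/data?noop=true" />
  <split streaming="true">
    <tokenize token="\n" />
    <!-- CamelSplitComplete is true only on the final split exchange -->
    <setHeader headerName="lastRecord">
      <simple>${property.CamelSplitComplete}</simple>
    </setHeader>
    <to uri="seda:WriteToFile2" />
  </split>
</route>

Note the header (or the property itself) only reaches the aggregator if the intermediate filter/transform routes pass it along unchanged.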
The problem I am facing is in steps 3 and 4. The code below shows how I aggregate and write to a file. I use <process ref="AddHeader" /> to add the header before writing the data into the file; however, the header gets added multiple times within the output file, once every 30 seconds, because the aggregation completionInterval is set to "30000". I am facing the same issue when creating the GZip file.

Is there a way to identify the last record being processed and store a flag in a global variable, so that once all the records are processed I can add the header and zip the file exactly once?

<route>
  <from uri="seda:NullProcessing?concurrentConsumers=1" />
  <aggregate strategyRef="aggregatorStrategy" completionInterval="30000">
    <correlationExpression>
      <constant>true</constant>
    </correlationExpression>
    <to uri="seda:processedRejects" />
  </aggregate>
</route>

<route>
  <from uri="seda:processedRejects" />
  <process ref="AddHeader" />
  <setHeader headerName="CamelFileName">
    <simple>ready_attributeList_inventory_onhand_Rejects</simple>
  </setHeader>
  <to uri="file:src/data/output?fileExist=Append" />
</route>

--
View this message in context: http://camel.465427.n5.nabble.com/How-to-identify-the-last-record-that-is-getting-processed-tp5774793.html
Sent from the Camel - Users mailing list archive at Nabble.com.
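One approach worth trying (a sketch, not a tested solution): replace completionInterval with a completionPredicate that fires when the record the splitter flagged as last arrives, so the aggregation completes exactly once. This assumes the CamelSplitComplete exchange property survives your intermediate filter/transform routes, and that your custom aggregatorStrategy keeps it on the aggregated exchange (if the strategy returns the old exchange, you would need to copy the flag across in the strategy). With a single completed exchange, AddHeader runs once, and the GZip data format can compress the assembled body before the file endpoint writes it, so fileExist=Append is no longer needed:

<route>
  <from uri="seda:NullProcessing?concurrentConsumers=1" />
  <aggregate strategyRef="aggregatorStrategy">
    <correlationExpression>
      <constant>true</constant>
    </correlationExpression>
    <!-- completes once, when the splitter's last-record flag arrives -->
    <completionPredicate>
      <simple>${property.CamelSplitComplete} == true</simple>
    </completionPredicate>
    <to uri="seda:processedRejects" />
  </aggregate>
</route>

<route>
  <from uri="seda:processedRejects" />
  <process ref="AddHeader" />
  <!-- GZip-compress the single aggregated body before writing -->
  <marshal>
    <gzip />
  </marshal>
  <setHeader headerName="CamelFileName">
    <simple>ready_attributeList_inventory_onhand_Rejects.gz</simple>
  </setHeader>
  <to uri="file:src/data/output" />
</route>

You could also keep a completionTimeout alongside the predicate as a safety net in case the flagged record is filtered out before it reaches the aggregator.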