Hi, how can we execute/process high-volume data with the ExecuteSQL processor?
We tried to execute a query against a DB2 database that returns around 10 lakh (1 million) records. While executing this query we get an OutOfMemory error, and that request (FlowFile) gets stuck in the queue. When we restart NiFi it is still stuck in the queue, and as soon as NiFi comes back up we hit the same error again because the FlowFile is still queued. Is there any way to configure a retry for the queue (the connection between two processors)?

We also tried changing the Flow File Repository property in nifi.properties (nifi.flowfile.repository.implementation) to 'org.apache.nifi.controller.repository.VolatileFlowFileRepository'. This removes the queued FlowFile when NiFi restarts, but it carries the risk of data loss for all other flows in the event of a power/machine failure.

So please suggest how to execute a high-volume query, or whether any retry mechanism is available for queued FlowFiles.

Regards,
Dnyaneshwar Pawar
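For reference, this is roughly the change we made in nifi.properties (the commented-out line shows what we believe is the stock default, the write-ahead implementation; only the single implementation property was edited):

```properties
# Default durable implementation (write-ahead log backed); queued
# FlowFiles survive a restart, which is why the stuck FlowFile
# reappears and triggers the same OutOfMemory error:
# nifi.flowfile.repository.implementation=org.apache.nifi.controller.repository.WriteAheadFlowFileRepository

# Volatile (in-memory) implementation we switched to; queued FlowFiles
# are discarded on restart, so the stuck FlowFile clears, but ALL flows
# lose queued data on a power/machine failure:
nifi.flowfile.repository.implementation=org.apache.nifi.controller.repository.VolatileFlowFileRepository
```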
