Hello,

We have a use case that requires processing a file of around 10k records. We are on the latest Apache Camel (3.7.0) with JDK 11. The route we use is:

from(ContextFrom())
    .shutdownRunningTask(ShutdownRunningTask.CompleteAllTasks)
    .convertBodyTo(String.class, StandardCharsets.ISO_8859_1.name())
    .onCompletion()
        .process(PostProcessor)
        .to(CamelContextTo())
    .end()
    .split(body().tokenize(System.lineSeparator()))
        .streaming()
        .parallelProcessing()
        .choice()
            .when(headerCondition())
                .to(bean("OutputDto").method("setHeaderNTrailer"))
            .otherwise()
                .unmarshal(dto)
                .process(FileProcessor)
                .to(bean("OutputDto").method("setDto"))
    .end();
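For context, "OutputDto" is registered in the registry as a single bean instance that both split branches write into. Simplified, it looks roughly like the sketch below; the field names, the Object parameter type, and the List-based accumulator are illustrative stand-ins for this post, not our exact class:

import java.util.ArrayList;
import java.util.List;

public class OutputDto {

    // With .parallelProcessing(), setDto() is called concurrently by several
    // splitter threads against this one bean instance.
    private final List<Object> records = new ArrayList<>();

    public void setHeaderNTrailer(String line) {
        // header/trailer lines are stored separately (omitted for brevity)
    }

    public void setDto(Object dto) {
        records.add(dto);
    }

    public List<Object> getRecords() {
        return records;
    }
}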
We have also configured a default Camel thread pool profile with the following properties:

ThreadPoolProfile masterPoolProfile = new ThreadPoolProfile("masterPoolProfile");
masterPoolProfile.setPoolSize(2);
masterPoolProfile.setMaxPoolSize(2);
masterPoolProfile.setMaxQueueSize(2);
masterPoolProfile.setKeepAliveTime(2L);
masterPoolProfile.setTimeUnit(TimeUnit.MINUTES);
this.getContext().getExecutorServiceManager().setDefaultThreadPoolProfile(masterPoolProfile);

Without parallel processing the output contains all 10k records, but with parallel processing some records are missing at random, and the behavior is intermittent. We have gone through the documentation and the existing questions on parallel processing but could not spot anything we are missing. Can anyone please guide us on this issue?

Regards,
Suvendu
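P.S. In case it helps, here is the setup reduced to a stripped-down, self-contained sketch. The direct: endpoint and the generated in-memory body stand in for our real file route; the thread pool settings match the ones above. It simply counts how many split records reach the processor:

import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;
import org.apache.camel.spi.ThreadPoolProfile;

public class SplitRepro {
    public static void main(String[] args) throws Exception {
        DefaultCamelContext context = new DefaultCamelContext();

        // Same default thread pool profile as in our application.
        ThreadPoolProfile profile = new ThreadPoolProfile("masterPoolProfile");
        profile.setPoolSize(2);
        profile.setMaxPoolSize(2);
        profile.setMaxQueueSize(2);
        profile.setKeepAliveTime(2L);
        profile.setTimeUnit(TimeUnit.MINUTES);
        context.getExecutorServiceManager().setDefaultThreadPoolProfile(profile);

        // Count how many split records actually get processed.
        AtomicInteger processed = new AtomicInteger();

        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                from("direct:start")
                    .split(body().tokenize(System.lineSeparator()))
                        .streaming()
                        .parallelProcessing()
                        .process(e -> processed.incrementAndGet())
                    .end();
            }
        });
        context.start();

        // Build a 10k-record body in memory instead of reading a file.
        StringBuilder body = new StringBuilder();
        for (int i = 0; i < 10_000; i++) {
            body.append("record-").append(i).append(System.lineSeparator());
        }
        context.createProducerTemplate().sendBody("direct:start", body.toString());

        System.out.println("processed " + processed.get() + " of 10000 records");
        context.stop();
    }
}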