[
https://issues.apache.org/jira/browse/CAMEL-20556?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17825689#comment-17825689
]
Claus Ibsen commented on CAMEL-20556:
-------------------------------------
Try with the 4.x releases, and with 3.22.x.
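The eager-vs-streaming distinction hypothesized in the report below can be sketched in plain Java (illustrative only; this is not Camel's actual FileConsumer code, just the two reading strategies):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamVsLoad {

    // Eager: materializes the entire file on the heap at once,
    // so heap usage grows with file size (the suspected OOM pattern).
    static byte[] loadWhole(Path file) throws IOException {
        return Files.readAllBytes(file);
    }

    // Streaming: only one buffered chunk is resident at a time,
    // so arbitrarily large files fit in a small, constant heap footprint.
    static long countLines(Path file) throws IOException {
        long lines = 0;
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            while (reader.readLine() != null) {
                lines++;
            }
        }
        return lines;
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("camel-demo", ".txt");
        Files.writeString(file, "line1\nline2\nline3\n");
        System.out.println(loadWhole(file).length + " bytes, "
                + countLines(file) + " lines");
        Files.delete(file);
    }
}
```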
> FileConsumer OutOfMemory for big files in Unix
> ----------------------------------------------
>
> Key: CAMEL-20556
> URL: https://issues.apache.org/jira/browse/CAMEL-20556
> Project: Camel
> Issue Type: Bug
> Components: camel-file
> Affects Versions: 3.20.9
> Reporter: Alexander Anpilov
> Priority: Minor
>
> For clarity, I have tested and described the issue with two simple routes:
>
> 1. *FileConsumer*
> {code:java}
> from("file:/path")
> .log("Received file: ${file:name}"); {code}
>
>
> 2. *File pollEnrich*
> {code:java}
> from("direct:start")
> .setProperty("SOURCE_URI", simple("{{path_to_folder}}"))
> .pollEnrich().exchangeProperty("SOURCE_URI").timeout(60000)
> .log("Received file: ${file:name}"); {code}
> My process receives big files (more than 4x the Java Xmx) and runs as a
> Spring Boot app on Kubernetes.
> After upgrading from camel-3.20.2 to camel-3.20.9, an OutOfMemoryError
> occurs on consume. I rolled back to camel-3.20.2 and checked the same files
> again - everything was OK.
> It seems that the FileConsumer loads the file content into memory instead of
> using streams.
> The problem starts in camel-3.20.3 and is reproducible up to 3.20.9.
> *BUT:* I couldn't reproduce the problem on Windows, only in a Unix environment.
--
This message was sent by Atlassian Jira
(v8.20.10#820010)