How can I prevent the HDFS directory from being overwritten while consuming different files? Currently, every new file overwrites the directory in HDFS and I lose all the content of that directory.
<from uri="file:///opt/rcggs/users?fileName=users_aa&noop=true&delete=false"/>
<to uri="hdfs2://xxx.xxx.xxx.xxx:8020/tmp/camel/"/>

If I run the above route again with different files, it overwrites the destination directory. I can make it work with append=true, but that option is meant for appending content, not for handling separate files.

Thank you for your help!

--
View this message in context: http://camel.465427.n5.nabble.com/file-to-hdfs-2-tp5766998.html
Sent from the Camel - Users mailing list archive at Nabble.com.
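[Edit] One possible workaround, sketched below and untested: instead of pointing the producer at the directory, build a per-file target path by appending the consumed file's name to the HDFS URI. This assumes a Camel version that supports the dynamic `toD` endpoint (2.16+) and the file language's `${file:onlyname}` expression; the IP placeholder is carried over from the route above.

```xml
<!-- Hypothetical sketch: give each consumed file its own HDFS file
     instead of writing everything to the same directory target.
     toD evaluates the simple/file-language expression per exchange. -->
<route>
  <from uri="file:///opt/rcggs/users?noop=true&amp;delete=false"/>
  <toD uri="hdfs2://xxx.xxx.xxx.xxx:8020/tmp/camel/${file:onlyname}"/>
</route>
```

With distinct target paths, a second run over different files should create new HDFS files next to the existing ones rather than replacing the directory contents.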
