Hi,

I am using a route like this for a couple of CSV files:

  from("file:/tmp/input/?delete=true")
  .splitter(body(InputStream.class).tokenize("\r\n"))
  .beanRef("myBean", "process")
  .to("file:/tmp/output/?append=true")

This works fine for small CSV files, but for big files I noticed
that Camel uses a lot of memory; it seems that Camel reads the
whole file into memory. What is the configuration to make the
splitter work on a stream?

I noticed the same behaviour with the XPath splitter:

  from("file:/tmp/input/?delete=true")
  .splitter(ns.xpath("//member"))
  ...

BTW, I found a posting from March where James suggests the
following implementation for a custom splitter:

-- quote --

  from("file:///c:/temp?noop=true").
    splitter().method("myBean", "split").
    to("activemq:someQueue")

Then register "myBean" with a split method...

class SomeBean {
  public Iterator split(File file) {
     /// figure out how to split this file into rows...
  }
}
-- quote --

But this won't work for me (Camel 1.4).
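For reference, here is roughly how I read James's suggestion: a bean whose split method returns an Iterator that reads the file lazily, one row at a time, so the whole file never sits in memory. This is just a sketch using plain java.io (the class and method names are my own, not from Camel):

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.util.Iterator;
import java.util.NoSuchElementException;

// Hypothetical split bean: hands Camel one line at a time
// instead of loading the whole file into memory.
public class LineSplitBean {

    public Iterator<String> split(File file) throws IOException {
        final BufferedReader reader = new BufferedReader(new FileReader(file));
        return new Iterator<String>() {
            // Read one line ahead so hasNext() is cheap and accurate.
            private String next = reader.readLine();

            public boolean hasNext() {
                return next != null;
            }

            public String next() {
                if (next == null) {
                    throw new NoSuchElementException();
                }
                String line = next;
                try {
                    next = reader.readLine();
                    if (next == null) {
                        reader.close(); // close once the last line is handed out
                    }
                } catch (IOException e) {
                    throw new RuntimeException(e);
                }
                return line;
            }

            public void remove() {
                throw new UnsupportedOperationException();
            }
        };
    }
}
```

BufferedReader.readLine() handles both "\n" and "\r\n" line endings, so the same bean should work for the CSV case above.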

Bart
