Hi, I have the following Camel + Spring DSL:
    <camel:route id="test-route" startupOrder="2">
        <camel:from ref="fileConsumer"/>
        <camel:split streaming="true">
            <camel:tokenize token="**\n"/>
            <camel:bean ref="unmarshaller" method="unmarshall"/>
            <camel:bean ref="transformer" method="transform"/>
            <camel:bean ref="persister" method="persist"/>
            <camel:to uri="direct:out"/>
        </camel:split>
    </camel:route>

The data in the file is in the format:

    **START
    IB 1407112554
    AV NYP 03/05/2010
    BI Paperback
    AU BRAUND, HILARY
    IU ILLUSTRATED
    PD 2010/05/03
    NP 48
    RP 20.00
    RI 20.00
    RE 20.00
    PU SCHOLASTIC ACADEMIC
    YP 2010
    SR WRITING GUIDES
    TI FUNNY STORIES 05-07 BOOK & CD ROM
    PI Reissue.
    EA 9781407112558
    RF R
    SG 2
    GC I00
    I3 9781407112558
    **
    IB 1407112554
    AV NYP 03/05/2010
    BI Paperback
    AU BRAUND, HILARY
    IU ILLUSTRATED
    PD 2010/05/03
    NP 48
    RP 20.00
    RI 20.00
    RE 20.00
    PU SCHOLASTIC ACADEMIC
    YP 2010
    SR WRITING GUIDES
    TI FUNNY STORIES 05-07 BOOK & CD ROM
    PI Reissue.
    EA 9781407112558
    RF R
    SG 2
    GC I00
    I3 9781407112558
    **END,2

This is working perfectly on my dev machine with a subset of the live data. The dev machine is Ubuntu 9.10. When I put this into production, the consumer runs, but the split/tokenization stage doesn't seem to execute and the rest of the route doesn't get called. The production machine is RHEL5. Both machines are configured to use UTF-8. Opening the data file in less, I can search for and find the record delimiters using \*\*$, but I believe "**\n" is the correct token to use.

Does anyone have experience of a similar situation?

Thanks,
Kev
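
Edit: one guess I'd like to sanity-check. If the production copy of the file happens to have \r\n line endings rather than plain \n (I haven't verified the bytes yet, so this is only a guess), would switching the tokenizer to a regex along the lines of the sketch below be a reasonable way to cover both cases? I'm assuming here that the tokenize element accepts a regex attribute in the Camel version we're on, and the \r?\n pattern is my own addition rather than anything from the existing config:

    <camel:route id="test-route" startupOrder="2">
        <camel:from ref="fileConsumer"/>
        <camel:split streaming="true">
            <!-- treat the token as a regex: the literal ** delimiter followed by an optional \r and a \n -->
            <camel:tokenize token="\*\*\r?\n" regex="true"/>
            <camel:bean ref="unmarshaller" method="unmarshall"/>
            <camel:bean ref="transformer" method="transform"/>
            <camel:bean ref="persister" method="persist"/>
            <camel:to uri="direct:out"/>
        </camel:split>
    </camel:route>

If that's the wrong approach, pointers to a better way of making the split tolerant of the delimiter's line ending would be just as welcome.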