Hello everybody,

I need your help. I am using Talend ESB and I want to write a Java bean that splits a file.

For example, I have this flat file:

    11886 1855 0000004309000
    11886 1855 0000057370000
    11886 1856 0000057374001    
    11886 1856 0000057375000  

In my example I want 2 files (messages), one per order number: "1855" and "1856".

First file:

    11886 1855 0000004309000
    11886 1855 0000057370000

Second file:

    11886 1856 0000057374001     
    11886 1856 0000057375000

But the number of orders per file is not known in advance (it depends on the file).

If I have three orders in my original file ==> I want three files, each containing the lines of one order.

If I have four orders in my original file ==> I want four files.

If I have five orders in my original file ==> I want five files.

And so on.
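
To make it concrete, here is a minimal sketch of just the grouping step in plain Java (outside of Camel). It assumes the order number is the second whitespace-separated field of each line, as in the short sample above (in my real file I read it at a fixed position instead):

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class GroupByOrderSketch {

        // Groups the lines of a flat file by order number,
        // taken here as the second whitespace-separated field of each line.
        public static Map<String, StringBuilder> group(String body) {
            Map<String, StringBuilder> groups = new LinkedHashMap<String, StringBuilder>();
            for (String line : body.split("\\r?\\n")) {
                if (line.trim().isEmpty()) {
                    continue;
                }
                String orderId = line.trim().split("\\s+")[1];
                StringBuilder group = groups.get(orderId);
                if (group == null) {
                    group = new StringBuilder();
                    groups.put(orderId, group);
                }
                group.append(line).append(System.lineSeparator());
            }
            return groups;
        }
    }

With the sample above this gives two entries ("1855" and "1856"), and it would give three, four, five, ... entries for three, four, five, ... orders.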

This is my start, but it returns nothing:

    package beans;

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.ByteArrayInputStream;
    import java.io.File;
    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.util.HashMap;
    import java.util.Iterator;
    import java.util.Map;
    import java.util.Set;
    import java.util.TreeSet;

    import org.apache.camel.*;

    public class bean_test implements Processor {

        private static final String ENDPOINT = "aggregateEndpoint";
        private static final int NUMERO_SITE_START_POSITION = 46;
        private static final int NUMERO_SITE_END_POSITION = 55;

        @Override
        public void process(Exchange exchange) throws Exception {
            ProducerTemplate producerTemplate = exchange.getContext().createProducerTemplate();
            String endpoint = exchange.getIn().getHeader(ENDPOINT, String.class);
            InputStream is = new ByteArrayInputStream(exchange.getIn().getBody(String.class).getBytes());
            aggregateBody(producerTemplate, is, endpoint, new HashMap<String, Object>(exchange.getIn().getHeaders()));
        }

        private void aggregateBody(ProducerTemplate producerTemplate, InputStream content, String endpoint, Map<String, Object> headers) {
            BufferedReader br = new BufferedReader(new InputStreamReader(content));
            String line;
            Set<String> order = new TreeSet<String>();

            try {
                String lineId = null;
                while ((line = br.readLine()) != null) {
                    lineId = line.substring(NUMERO_SITE_START_POSITION, NUMERO_SITE_END_POSITION);
                    order.add(lineId);
                }

                for (int i = 0; i < order.size(); i++) {
                    String key = "file" + i;
                    File f = new File(key);
                    Iterator it = order.iterator();
                    FileWriter fw = new FileWriter(f.getAbsoluteFile());
                    BufferedWriter bw = new BufferedWriter(fw);

                    while ((line = br.readLine()) != null) {
                        while (it.hasNext()) {
                            lineId = line.substring(NUMERO_SITE_START_POSITION, NUMERO_SITE_END_POSITION);
                            if (lineId.equals(it.next())) {
                                bw.write(line);
                            }
                        }
                    }
                }

            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                try {
                    if (br != null) br.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
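
I suspect part of the problem is that the BufferedReader is already exhausted by the first while loop, so the second readLine() loop never reads anything, and the BufferedWriters are never flushed or closed, so nothing ends up in the files. Below is a rough sketch of the direction I am thinking of instead: read the body once, group the lines by order number (same idea as the grouping sketch above), and send each group to the endpoint as its own message. The substring positions (46 to 55) are what I use for my real file layout (they do not match the short sample lines above), the header/endpoint names are the same as in my original bean, and I called the class bean_test_v2 just to keep both versions around:

    package beans;

    import java.util.LinkedHashMap;
    import java.util.Map;

    import org.apache.camel.Exchange;
    import org.apache.camel.Processor;
    import org.apache.camel.ProducerTemplate;

    public class bean_test_v2 implements Processor {

        private static final String ENDPOINT = "aggregateEndpoint";
        private static final int NUMERO_SITE_START_POSITION = 46;
        private static final int NUMERO_SITE_END_POSITION = 55;

        @Override
        public void process(Exchange exchange) throws Exception {
            ProducerTemplate producerTemplate = exchange.getContext().createProducerTemplate();
            String endpoint = exchange.getIn().getHeader(ENDPOINT, String.class);
            String body = exchange.getIn().getBody(String.class);
            Map<String, Object> headers = new LinkedHashMap<String, Object>(exchange.getIn().getHeaders());

            // Single pass: group the lines by order number, keeping the original line order.
            Map<String, StringBuilder> groups = new LinkedHashMap<String, StringBuilder>();
            for (String line : body.split("\\r?\\n")) {
                if (line.trim().isEmpty()) {
                    continue;
                }
                String orderId = line.substring(NUMERO_SITE_START_POSITION, NUMERO_SITE_END_POSITION);
                StringBuilder group = groups.get(orderId);
                if (group == null) {
                    group = new StringBuilder();
                    groups.put(orderId, group);
                }
                group.append(line).append(System.lineSeparator());
            }

            // One outgoing message per order: three orders give three messages, four give four, and so on.
            for (Map.Entry<String, StringBuilder> entry : groups.entrySet()) {
                producerTemplate.sendBodyAndHeaders(endpoint, entry.getValue().toString(), headers);
            }
        }
    }

I am not sure whether sending each group with producerTemplate.sendBodyAndHeaders() is the right Camel way to turn one message into several, or whether a splitter in the route would be cleaner, so any advice is welcome.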

Can you help me, please?

Thank you in advance.


