I think you can do this on one node, but you will need to run two instances of flume, each with a different agent name.
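For illustration only, both agents could be defined in one properties file; the agent names (agent1/agent2), port, command, and keystore/truststore paths below are placeholder assumptions, not tested values from this thread:

# agent1: exec source -> memory channel -> Avro sink (SSL client side)
agent1.sources = execSrc
agent1.channels = ch1
agent1.sinks = avroSink
agent1.sources.execSrc.type = exec
agent1.sources.execSrc.command = tail -F /var/log/app.log
agent1.sources.execSrc.channels = ch1
agent1.channels.ch1.type = memory
agent1.sinks.avroSink.type = avro
agent1.sinks.avroSink.channel = ch1
agent1.sinks.avroSink.hostname = localhost
agent1.sinks.avroSink.port = 4545
agent1.sinks.avroSink.ssl = true
agent1.sinks.avroSink.truststore = /path/to/truststore.jks
agent1.sinks.avroSink.truststore-password = changeit

# agent2: Avro source (SSL server side) -> memory channel -> HDFS sink
agent2.sources = avroSrc
agent2.channels = ch2
agent2.sinks = hdfsSink
agent2.sources.avroSrc.type = avro
agent2.sources.avroSrc.bind = 0.0.0.0
agent2.sources.avroSrc.port = 4545
agent2.sources.avroSrc.channels = ch2
agent2.sources.avroSrc.ssl = true
agent2.sources.avroSrc.keystore = /path/to/keystore.jks
agent2.sources.avroSrc.keystore-password = changeit
agent2.channels.ch2.type = memory
agent2.sinks.hdfsSink.type = hdfs
agent2.sinks.hdfsSink.channel = ch2
agent2.sinks.hdfsSink.hdfs.path = hdfs://namenode:8020/flume/events

Each instance is then started against the same file with its own agent name, for example:

bin/flume-ng agent -c conf -f flume.conf -n agent1
bin/flume-ng agent -c conf -f flume.conf -n agent2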
Paul

From: Suhas Satish [mailto:suhas.sat...@gmail.com]
Sent: Thursday, September 12, 2013 10:58 AM
To: user@flume.apache.org
Subject: Re: flume 1.4.0 avro source/sink with hdfs sink configuration - no hdfs files created

Thanks. Yes, I was trying to set it up on a single node. If it cannot be done, I can go to 2 different nodes, but that would add additional complexities which I'd like to avoid if possible. My original intent was to test whether the Avro Source/Avro Sink interface can work with SSL enabled (hence the extra hop) and, if it can, whether it can use the ssl_keystore and ssl_truststore already available from a secure hadoop cluster.

On Thu, Sep 12, 2013 at 10:48 AM, Paul Chavez <pcha...@verticalsearchworks.com> wrote:

...set this all up on a single node? If so, why are you adding in an extra Avro hop? In practice this setup should be on two nodes, one acting as the 'agent' with the exec source...

Cheers,
Suhas.