I have configured Flume with an HDFS Sink and I am facing an issue: no matter
which port the NameNode runs on, I still get an HDFS IO error:



17 Jul 2013 22:09:12,508 INFO  [hdfs-VisitSink-call-runner-4]
(org.apache.flume.sink.hdfs.BucketWriter.doOpen:208)  - Creating
/home/cloudera/btbridge/data/visit/export.1374113279142.txt.tmp

17 Jul 2013 22:09:12,532 WARN
[SinkRunner-PollingRunner-DefaultSinkProcessor]
(org.apache.flume.sink.hdfs.HDFSEventSink.process:456)  - HDFS IO error

java.io.IOException: Failed on local exception: java.io.IOException: Broken
pipe; Host Details : local host is: "localhost.localdomain/127.0.0.1";
destination host is: "localhost.localdomain":8020;
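The error shows the client reaching "localhost.localdomain":8020, so one thing worth confirming is that the host:port in the sink's hdfs.path matches fs.default.name in core-site.xml. A minimal sketch of that check, using a hypothetical core-site.xml written to /tmp purely for illustration (on a CDH VM the real file is usually under /etc/hadoop/conf):

```shell
# Hypothetical core-site.xml for illustration; point the sed command at
# your real file, e.g. /etc/hadoop/conf/core-site.xml.
cat > /tmp/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost.localdomain:8020</value>
  </property>
</configuration>
EOF

# Extract the filesystem URI; the host:port printed here must match the
# hdfs.path configured for the Flume sink.
sed -n 's|.*<value>\(hdfs://[^<]*\)</value>.*|\1|p' /tmp/core-site.xml
```

With the Hadoop CLI on the path, `hdfs getconf -confKey fs.defaultFS` reports the same value without parsing the XML by hand.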


I have tried the following in flume-conf.properties:



agent1.sinks.PurePathSink.hdfs.path = /home/cloudera/btbridge/data/pp



or



agent1.sinks.PurePathSink.hdfs.path =
hdfs://localhost:8020/home/cloudera/btbridge/data/pp



or

agent1.sinks.PurePathSink.hdfs.path =
hdfs://localhost:9000/home/cloudera/btbridge/data/pp



and so on.
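For context, here is a minimal sketch of the full sink definition I would expect around that hdfs.path line (the agent name agent1 and sink name PurePathSink come from my config above; the channel name ch1 is just a placeholder for whatever channel is actually wired up):

```properties
# Sketch of an HDFS sink definition; ch1 is an assumed channel name.
agent1.sinks = PurePathSink
agent1.sinks.PurePathSink.type = hdfs
agent1.sinks.PurePathSink.channel = ch1
# The host:port here must match fs.default.name in core-site.xml.
agent1.sinks.PurePathSink.hdfs.path = hdfs://localhost.localdomain:8020/home/cloudera/btbridge/data/pp
agent1.sinks.PurePathSink.hdfs.fileType = DataStream
agent1.sinks.PurePathSink.hdfs.writeFormat = Text
```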

Any pointers on what is missing? Is this a Flume issue or a Hadoop issue?


Thanks,

Rajesh
