What version of Hadoop are you using? The prebuilt binaries are built against Hadoop 1.x. You can either build Flume against Hadoop 2 or use one of the packages provided by Apache Bigtop or any vendor whose packages are built against Hadoop 2.
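If you decide to rebuild, it is roughly the following from a source checkout (sketch from memory; the name of the Hadoop 2 Maven profile may differ across Flume versions, so check the pom.xml in your tree first):

    mvn clean install -DskipTests -Dhadoop.profile=2

That should pull in the Hadoop 2 client jars instead of the 1.x ones.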
Thanks,
Hari

On Tue, Dec 23, 2014 at 4:23 AM, adarsh deshratnam <[email protected]> wrote:

> I am using Flume version 1.4.0.
> Please let me know if anyone has faced the same issue and tried resolving it.
>
> Thanks,
> Adarsh D
>
> On Tue, Dec 23, 2014 at 4:45 PM, adarsh deshratnam <[email protected]> wrote:
>
>> Hi,
>> While running Flume I am getting InvalidProtocolBufferException.
>>
>> Below is my configuration file:
>>
>> ------------------------------------------------------------
>> a1.sources = r1
>> a1.channels = c1
>> a1.sinks = k1
>>
>> a1.sources.r1.type = exec
>> # Here take any sample file; use cat or tail command
>> a1.sources.r1.command = cat /home/wmsuser/BPM.log
>> a1.sources.r1.channels = c1
>>
>> a1.sinks.k1.type = hdfs
>> a1.sinks.k1.channel = c1
>> a1.sinks.k1.hdfs.path = hdfs://localhost:localhost:22/flume
>> a1.sinks.k1.hdfs.filePrefix = events1
>> a1.sinks.k1.hdfs.round = true
>> a1.sinks.k1.hdfs.roundValue = 5
>> a1.sinks.k1.hdfs.roundUnit = minute
>> a1.sinks.k1.hdfs.writeFormat = Text
>>
>> a1.channels.c1.type = file
>> a1.channels.c1.checkpointDir = /home/wmsuser/channel/tmpData/checkpoint
>> a1.channels.c1.dataDirs = /home/wmsuser/channel/tmpData/data
>> ------------------------------------------------------------
>>
>> Exception:
>>
>> com.google.protobuf.InvalidProtocolBufferException: Protocol message tag had invalid wire type.
>>     at com.google.protobuf.InvalidProtocolBufferException.invalidWireType(InvalidProtocolBufferException.java:99)
>>     at com.google.protobuf.UnknownFieldSet$Builder.mergeFieldFrom(UnknownFieldSet.java:498)
>>     at com.google.protobuf.GeneratedMessage.parseUnknownField(GeneratedMessage.java:193)
>>     at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(RpcHeaderProtos.java:1404)
>>     at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(RpcHeaderProtos.java:1362)
>>     at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(RpcHeaderProtos.java:1492)
>>     at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(RpcHeaderProtos.java:1487)
>>     at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
>>     at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:241)
>>     at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:253)
>>     at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:259)
>>     at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:49)
>>     at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcHeaderProtos.java:2364)
>>     at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1056)
>>     at org.apache.hadoop.ipc.Client$Connection.run(Client.java:950)
>>
>> Thanks,
>> Adarsh D
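One more check, in case it helps: you can see which Hadoop jars Flume is actually picking up on the box (paths here are assumptions for a standard tarball install; adjust to your layout) with something like

    ls $FLUME_HOME/lib | grep -i hadoop
    hadoop version

A 1.x Hadoop client on the classpath talking to a 2.x cluster is consistent with the protobuf parse failure in your trace.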
