Please check https://github.com/xerial/snappy-java for how to build and
install snappy-java.
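
In the meantime, one way to sidestep the broker-side UnsatisfiedLinkError is to have the producer use a compression codec that does not need the snappy native library ("none" or the pure-Java "gzip"). A minimal sketch, assuming a standard Java producer setup; the class name and bootstrap address are placeholders, while `compression.type` is the real producer config key:

```java
import java.util.Properties;

public class ProducerCompressionConfig {

    // Builds base producer properties with an explicit compression codec.
    // "snappy" requires the native snappyjava library on the broker's
    // classpath/library path; "none" and "gzip" do not.
    public static Properties baseConfig(String bootstrapServers, String compression) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("compression.type", compression);
        return props;
    }

    public static void main(String[] args) {
        Properties props = baseConfig("localhost:9092", "none");
        System.out.println(props.getProperty("compression.type"));
    }
}
```

Once snappy-java's native library loads correctly on the broker, you can switch `compression.type` back to "snappy".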

On Thu, Dec 28, 2017 at 5:29 AM, Debraj Manna <subharaj.ma...@gmail.com>
wrote:

> Hi
>
> I am seeing a warning like the one below, and my Kafka Java producer client
> is not able to write to the Kafka broker. (Kafka version 0.10.0, both client
> and server.)
>
> WARN  Error while fetching metadata with correlation id 3 :
> {abcdef=LEADER_NOT_AVAILABLE}
>
>
>    - OS: Ubuntu 14.04.1
>    - Java: 8
>
>
> In the Kafka server.log I am seeing an exception like the one below. I am
> using a single-node Kafka broker, with ZooKeeper running on the same host.
>
> [2017-12-28 12:35:30,515] ERROR [Replica Manager on Broker 0]: Error processing append operation on partition Topic3-DC0P6PI-0 (kafka.server.ReplicaManager)
> java.lang.UnsatisfiedLinkError: no snappyjava in java.library.path
>         at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
>         at java.lang.Runtime.loadLibrary0(Runtime.java:870)
>         at java.lang.System.loadLibrary(System.java:1122)
>         at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:178)
>         at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:152)
>         at org.xerial.snappy.Snappy.<clinit>(Snappy.java:47)
>         at org.xerial.snappy.SnappyInputStream.hasNextChunk(SnappyInputStream.java:435)
>         at org.xerial.snappy.SnappyInputStream.read(SnappyInputStream.java:167)
>         at java.io.DataInputStream.readFully(DataInputStream.java:195)
>         at java.io.DataInputStream.readLong(DataInputStream.java:416)
>         at kafka.message.ByteBufferMessageSet$$anon$1.readMessageFromStream(ByteBufferMessageSet.scala:118)
>         at kafka.message.ByteBufferMessageSet$$anon$1.liftedTree2$1(ByteBufferMessageSet.scala:107)
>         at kafka.message.ByteBufferMessageSet$$anon$1.<init>(ByteBufferMessageSet.scala:105)
>         at kafka.message.ByteBufferMessageSet$.deepIterator(ByteBufferMessageSet.scala:85)
>         at kafka.message.ByteBufferMessageSet$$anon$2.makeNextOuter(ByteBufferMessageSet.scala:356)
>         at kafka.message.ByteBufferMessageSet$$anon$2.makeNext(ByteBufferMessageSet.scala:369)
>         at kafka.message.ByteBufferMessageSet$$anon$2.makeNext(ByteBufferMessageSet.scala:324)
>         at kafka.utils.IteratorTemplate.maybeComputeNext(IteratorTemplate.scala:64)
>         at kafka.utils.IteratorTemplate.hasNext(IteratorTemplate.scala:56)
>         at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>         at kafka.utils.IteratorTemplate.foreach(IteratorTemplate.scala:30)
>         at kafka.message.ByteBufferMessageSet.validateMessagesAndAssignOffsets(ByteBufferMessageSet.scala:427)
>         at kafka.log.Log.liftedTree1$1(Log.scala:339)
>         at kafka.log.Log.append(Log.scala:338)
>         at kafka.cluster.Partition$$anonfun$11.apply(Partition.scala:443)
>         at kafka.cluster.Partition$$anonfun$11.apply(Partition.scala:429)
>         at kafka.utils.CoreUtils$.inLock(CoreUtils.scala:231)
>         at kafka.utils.CoreUtils$.inReadLock(CoreUtils.scala:237)
>         at kafka.cluster.Partition.appendMessagesToLeader(Partition.scala:429)
>         at kafka.server.ReplicaManager$$anonfun$appendToLocalLog$2.apply(ReplicaManager.scala:406)
>         at kafka.server.ReplicaManager$$anonfun$appendToLocalLog$2.apply(ReplicaManager.scala:392)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>         at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>         at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>         at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
>         at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
>         at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
>         at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>         at scala.collection.AbstractTraversable.map(Traversable.scala:105)
>