Thad, this thread [1] seems related. Take a look and see if it helps. The basic gist, as I understand it, is that we won't have access to that native library unless the JVM is pointed at it somewhere, or unless the Java code that calls it knows how to set/find it for you.
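For reference, here's a quick sanity check you can run to see what the JVM is actually seeing. This is just a sketch — the library name "hadoop" (i.e. libhadoop.so, which bundles the native lz4/snappy codecs) and the class name are examples, not anything from the thread:

```java
// Prints where the JVM searches for native libraries, then attempts to
// load the Hadoop native library. An UnsatisfiedLinkError here is the
// same failure mode behind "native lz4 library not available".
public class NativeLibCheck {
    public static void main(String[] args) {
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        try {
            System.loadLibrary("hadoop"); // resolves libhadoop.so on Linux
            System.out.println("native hadoop library loaded");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("native library not found: " + e.getMessage());
        }
    }
}
```

If the printed path doesn't include Hadoop's native-lib directory, the usual fix (per the thread linked below) is to point the JVM at it, e.g. by adding a `java.arg.N=-Djava.library.path=/path/to/hadoop/lib/native` line to NiFi's conf/bootstrap.conf — the exact path and arg index depend on your install.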
[1] http://apache-nifi-developer-list.39713.n7.nabble.com/java-lang-UnsatisfiedLinkError-in-PutHDFS-with-snappy-compression-td7182.html

On Wed, Mar 30, 2016 at 8:42 AM, Thad Guidry <[email protected]> wrote:
> My bad....there is... in the app log...
>
> 2016-03-30 09:39:27,709 INFO [Write-Ahead Local State Provider Maintenance]
> org.wali.MinimalLockingWriteAheadLog
> org.wali.MinimalLockingWriteAheadLog@7615666e checkpointed with 8 Records
> and 0 Swap Files in 69 milliseconds (Stop-the-world time = 6 milliseconds,
> Clear Edit Logs time = 4 millis), max Transaction ID 23
> 2016-03-30 09:39:31,979 INFO [pool-16-thread-1]
> o.a.n.c.r.WriteAheadFlowFileRepository Initiating checkpoint of FlowFile
> Repository
> 2016-03-30 09:39:32,380 INFO [pool-16-thread-1]
> org.wali.MinimalLockingWriteAheadLog
> org.wali.MinimalLockingWriteAheadLog@174f0d06 checkpointed with 3 Records
> and 0 Swap Files in 400 milliseconds (Stop-the-world time = 273
> milliseconds, Clear Edit Logs time = 74 millis), max Transaction ID 9785
> 2016-03-30 09:39:32,380 INFO [pool-16-thread-1]
> o.a.n.c.r.WriteAheadFlowFileRepository Successfully checkpointed FlowFile
> Repository with 3 records in 400 milliseconds
> 2016-03-30 09:39:32,523 ERROR [Timer-Driven Process Thread-9]
> o.apache.nifi.processors.hadoop.PutHDFS
> PutHDFS[id=765efcb2-5ab0-4a72-a86f-71865dec264d] Failed to write to HDFS due
> to java.lang.RuntimeException: native lz4 library not available:
> java.lang.RuntimeException: native lz4 library not available
> 2016-03-30 09:39:32,525 ERROR [Timer-Driven Process Thread-9]
> o.apache.nifi.processors.hadoop.PutHDFS
> java.lang.RuntimeException: native lz4 library not available
>     at org.apache.hadoop.io.compress.Lz4Codec.getCompressorType(Lz4Codec.java:125) ~[hadoop-common-2.6.2.jar:na]
>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148) ~[hadoop-common-2.6.2.jar:na]
>     at org.apache.hadoop.io.compress.CompressionCodec$Util.createOutputStreamWithCodecPool(CompressionCodec.java:131) ~[hadoop-common-2.6.2.jar:na]
>     at org.apache.hadoop.io.compress.Lz4Codec.createOutputStream(Lz4Codec.java:87) ~[hadoop-common-2.6.2.jar:na]
>     at org.apache.nifi.processors.hadoop.PutHDFS$1.process(PutHDFS.java:279) ~[nifi-hdfs-processors-0.6.0.jar:0.6.0]
>     at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:1807) ~[na:na]
>     at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:1778) ~[na:na]
>     at org.apache.nifi.processors.hadoop.PutHDFS.onTrigger(PutHDFS.java:270) ~[nifi-hdfs-processors-0.6.0.jar:0.6.0]
>     at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) [nifi-api-0.6.0.jar:0.6.0]
>     at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1057) [nifi-framework-core-0.6.0.jar:0.6.0]
>     at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-0.6.0.jar:0.6.0]
>     at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-0.6.0.jar:0.6.0]
>     at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:123) [nifi-framework-core-0.6.0.jar:0.6.0]
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_74]
>     at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_74]
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_74]
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_74]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_74]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_74]
>     at java.lang.Thread.run(Thread.java:745) [na:1.8.0_74]
> 2016-03-30 09:39:34,273 ERROR [Timer-Driven Process Thread-5]
> o.apache.nifi.processors.hadoop.PutHDFS
> PutHDFS[id=765efcb2-5ab0-4a72-a86f-71865dec264d] Failed to write to HDFS due
> to java.lang.RuntimeException: native lz4 library not available:
> java.lang.RuntimeException: native lz4 library not available
> 2016-03-30 09:39:34,274 ERROR [Timer-Driven Process Thread-5]
> o.apache.nifi.processors.hadoop.PutHDFS
> java.lang.RuntimeException: native lz4 library not available
> [identical stack trace as above, repeated]
> 2016-03-30 09:39:35,295 ERROR [Timer-Driven Process Thread-9]
> o.apache.nifi.processors.hadoop.PutHDFS
> PutHDFS[id=765efcb2-5ab0-4a72-a86f-71865dec264d] Failed to write to HDFS due
> to java.lang.RuntimeException: native lz4 library not available:
> java.lang.RuntimeException: native lz4 library not available
> 2016-03-30 09:39:35,296 ERROR [Timer-Driven Process Thread-9]
> o.apache.nifi.processors.hadoop.PutHDFS
> java.lang.RuntimeException: native lz4 library not available
> [identical stack trace as above, repeated]
>
> Thad
> +ThadGuidry
