Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

2016-02-09 Thread Matthew Burgess
When you say you pointed LD_LIBRARY_PATH to the location of libsnappy.so, do you mean just setting the "mapreduce.admin.user.env" property in mapred-site.xml, or the actual environment variable before starting NiFi? The mapred-site settings won't be used, as PutHDFS does not use…
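Matthew's distinction matters because a child process (here, the NiFi JVM) only inherits variables that were *exported* in the environment that launched it; a value set only in a Hadoop config file or an un-exported shell assignment is invisible to it. A minimal sketch of that shell behavior, using a hypothetical /opt/hadoop path:

```shell
unset LD_LIBRARY_PATH                      # start clean for the demo
LD_LIBRARY_PATH=/opt/hadoop/lib/native     # assigned, but NOT exported
sh -c 'echo "child sees: ${LD_LIBRARY_PATH:-<unset>}"'   # child sees: <unset>
export LD_LIBRARY_PATH                     # exported: children now inherit it
sh -c 'echo "child sees: ${LD_LIBRARY_PATH:-<unset>}"'   # child sees: /opt/hadoop/lib/native
```

The same applies to a service manager: the export has to happen in whatever shell or unit file actually starts NiFi, then NiFi must be restarted.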

Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

2016-02-06 Thread Jeremy Dyer
Shweta, looks like you're missing the Snappy native library. I have seen this several times before. Assuming you're on a Linux machine, you have two options: you can copy the libsnappy.so native library to your JAVA_HOME/jre/lib native directory, or you can set LD_LIBRARY_PATH to point to where your…
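Jeremy's two options can be sketched as shell commands. All paths here are hypothetical and must be adjusted to wherever libsnappy.so actually lives on the host; the copy in option 1 is left commented out because the JRE's native-library subdirectory varies by platform:

```shell
# Option 1 (hypothetical paths): copy the native library into the JRE's
# library directory so the JVM can find it without extra configuration.
# cp /usr/lib/hadoop/lib/native/libsnappy.so "$JAVA_HOME/jre/lib/amd64/"

# Option 2: point LD_LIBRARY_PATH at the directory containing libsnappy.so,
# exported in the same environment that launches NiFi.
export LD_LIBRARY_PATH="/usr/lib/hadoop/lib/native${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
# ./bin/nifi.sh restart   # restart so the NiFi JVM picks up the variable
```

Either way the JVM hosting PutHDFS has to be restarted, since the native library search path is resolved at load time.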

java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

2016-02-06 Thread shweta
Hi All, I'm getting a java.lang.UnsatisfiedLinkError while adding data into the PutHDFS processor with the compression codec set to snappy. The error message says "Failed to write to HDFS due to org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z". In spite of this error, .snappy files are being…

Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

2016-02-06 Thread Joe Witt
Can you show what is in your core-site.xml and the processor properties? Also, can you show the full log output? Thanks, Joe On Sat, Feb 6, 2016 at 9:11 AM, shweta wrote: > Hi All, > > I'm getting a java.lang.UnsatisfiedLinkError while adding data into PutHDFS > processor…

Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

2016-02-06 Thread Matt Burgess
To add to Jeremy's last point, even after the library is present, the files must be greater than the HDFS block size (default is 64 MB, I think?) or Hadoop-snappy will also not compress them. Sent from my iPhone > On Feb 6, 2016, at 5:41 PM, Jeremy Dyer wrote: > > Shweta, >…