Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

2016-02-09 Thread Matthew Burgess
When you say you pointed LD_LIBRARY_PATH to the location of libsnappy.so, do 
you mean just setting the “mapreduce.admin.user.env” property in 
mapred-site.xml, or the actual environment variable before starting NiFi? The 
mapred-site settings won’t be used, as PutHDFS does not use MapReduce. If you do 
something like:

export LD_LIBRARY_PATH=/usr/hdp/2.2.0.0-1084/hadoop/lib/native
bin/nifi.sh start

That should let PutHDFS know about the appropriate libraries.
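If you would rather not manage the environment variable in your startup scripts, the same hint can go in NiFi’s conf/bootstrap.conf as an extra JVM argument. This is only a sketch: pick a java.arg number that does not collide with the entries already in your file, and the native-library path below is the HDP example from this thread, not a universal location.

```properties
# conf/bootstrap.conf (sketch) -- use an unused java.arg.N slot
java.arg.15=-Djava.library.path=/usr/hdp/2.2.0.0-1084/hadoop/lib/native
```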




On 2/9/16, 4:38 AM, "shweta" wrote:

>Hi Jeremy,
>
>Even after copying libsnappy.so to java_home/jre/lib it did not help much. I
>also pointed LD_LIBRARY_PATH to the location of libsnappy.so. I even went to
>the extent of modifying bootstrap.conf with the JVM param 
> -Djava.library.path=//.
>
>But received the same error again. I have configured the following properties 
>in the Hadoop config files:
>
>core-site.xml
>
><property>
>  <name>io.compression.codecs</name>
>  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
></property>
>mapred-site.xml
>
><property>
>  <name>mapreduce.map.output.compress</name>
>  <value>true</value>
></property>
>
><property>
>  <name>mapred.map.output.compress.codec</name>
>  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
></property>
>
><property>
>  <name>mapreduce.admin.user.env</name>
>  <value>LD_LIBRARY_PATH=/usr/hdp/2.2.0.0-1084/hadoop/lib/native</value>
></property>
>
>Anything else I'm missing to get this issue fixed?
>
>Thanks,
>Shweta
>
>
>
>--
>View this message in context: 
>http://apache-nifi-developer-list.39713.n7.nabble.com/java-lang-UnsatisfiedLinkError-in-PutHDFS-with-snappy-compression-tp7182p7236.html
>Sent from the Apache NiFi Developer List mailing list archive at Nabble.com.



Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

2016-02-06 Thread Jeremy Dyer
Shweta,

Looks like you're missing the snappy native library. I have seen this several
times before. Assuming you're on a Linux machine, you have two options. You can
copy the libsnappy.so native library to your JAVA_HOME/jre/lib native
directory, or you can set LD_LIBRARY_PATH to point to where your
libsnappy.so native library is located on the machine.
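Whichever option you take, a quick way to see which directories the NiFi JVM will actually search for native libraries (and therefore where libsnappy.so must be visible) is a small check like this sketch:

```java
import java.io.File;

public class NativePathCheck {
    public static void main(String[] args) {
        // Print each directory on the JVM's native library search path.
        // libsnappy.so must sit in one of these; on Linux the JVM folds
        // LD_LIBRARY_PATH into java.library.path at startup.
        String path = System.getProperty("java.library.path", "");
        for (String dir : path.split(File.pathSeparator)) {
            System.out.println(dir);
        }
    }
}
```

Run it with the same environment you use to launch NiFi and confirm the directory holding libsnappy.so appears in the output.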

I believe if you closely examine the files that are being written to HDFS
with a .snappy extension, you will see that they are not actually
snappy compressed.

Jeremy Dyer

On Sat, Feb 6, 2016 at 1:04 PM, Joe Witt wrote:

> Can you show what is in your core-site.xml and the processor properties?
> Also can you show the full log output?
>
> Thanks
> Joe
>
> On Sat, Feb 6, 2016 at 9:11 AM, shweta wrote:
> > Hi All,
> >
> > I'm getting a java.lang.UnsatisfiedLinkError while adding data into the
> > PutHDFS processor with the compression codec set to snappy. The error
> > message says "Failed to write to HDFS due to
> > org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z".
> >
> > In spite of this error, .snappy files are being written to my HDFS.
> >
> > Has anyone faced a similar issue before or can provide any pointers?
> >
> > Thanks,
> > Shweta
> >
> >
> >
>


java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

2016-02-06 Thread shweta
Hi All,

I'm getting a java.lang.UnsatisfiedLinkError while adding data into the PutHDFS
processor with the compression codec set to snappy. The error message says
"Failed to write to HDFS due to
org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z".

In spite of this error, .snappy files are being written to my HDFS.

Has anyone faced a similar issue before or can provide any pointers?

Thanks,
Shweta





Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

2016-02-06 Thread Joe Witt
Can you show what is in your core-site.xml and the processor properties?
Also can you show the full log output?

Thanks
Joe

On Sat, Feb 6, 2016 at 9:11 AM, shweta wrote:
> Hi All,
>
> I'm getting a java.lang.UnsatisfiedLinkError while adding data into the PutHDFS
> processor with the compression codec set to snappy. The error message says
> "Failed to write to HDFS due to
> org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z".
>
> In spite of this error, .snappy files are being written to my HDFS.
>
> Has anyone faced a similar issue before or can provide any pointers?
>
> Thanks,
> Shweta
>
>
>


Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

2016-02-06 Thread Matt Burgess
To add to Jeremy's last point, even after the library is present, the files 
must be greater than the HDFS block size (default is 64 MB I think?) or 
Hadoop-snappy will also not compress them.
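For completeness, it is worth confirming a native snappy library is even present on the box; `hadoop checknative -a` is the Hadoop-bundled way to check. A manual search is a sketch like the one below (the paths are just common examples, not guaranteed locations):

```shell
# Look for a native snappy library in a few common spots.
# HDP installs usually keep it under /usr/hdp/<version>/hadoop/lib/native.
find /usr/lib /usr/lib64 /usr/hdp -name 'libsnappy.so*' 2>/dev/null || true
echo "search complete"
```

If nothing turns up, install the snappy native package for your distro before pointing LD_LIBRARY_PATH anywhere.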

Sent from my iPhone

> On Feb 6, 2016, at 5:41 PM, Jeremy Dyer wrote:
> 
> Shweta,
> 
> Looks like you're missing the snappy native library. I have seen this several
> times before. Assuming you're on a Linux machine, you have two options. You can
> copy the libsnappy.so native library to your JAVA_HOME/jre/lib native
> directory, or you can set LD_LIBRARY_PATH to point to where your
> libsnappy.so native library is located on the machine.
> 
> I believe if you closely examine the files that are being written to HDFS
> with a .snappy extension, you will see that they are not actually
> snappy compressed.
> 
> Jeremy Dyer
> 
>> On Sat, Feb 6, 2016 at 1:04 PM, Joe Witt wrote:
>> 
>> Can you show what is in your core-site.xml and the processor properties?
>> Also can you show the full log output?
>> 
>> Thanks
>> Joe
>> 
>>> On Sat, Feb 6, 2016 at 9:11 AM, shweta wrote:
>>> Hi All,
>>> 
>>> I'm getting a java.lang.UnsatisfiedLinkError while adding data into the
>>> PutHDFS processor with the compression codec set to snappy. The error
>>> message says "Failed to write to HDFS due to
>>> org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z".
>>> 
>>> In spite of this error, .snappy files are being written to my HDFS.
>>> 
>>> Has anyone faced a similar issue before or can provide any pointers?
>>> 
>>> Thanks,
>>> Shweta
>>> 
>>> 
>>> 
>>