I think it may be needed on Windows, certainly if you start trying to work with 
local files. 
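As Sean's reply below notes, the warning itself is harmless if you don't rely on the native libs. If you just want it out of your local logs, one option (a sketch, assuming the default log4j 1.x setup that Spark 1.x ships with; the file lives at conf/log4j.properties) is to raise the log level for that one class:

```properties
# Silence the native-code loader warning from
# org.apache.hadoop.util.NativeCodeLoader (emitted at WARN level).
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```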


> On 4 Aug 2015, at 00:34, Sean Owen <so...@cloudera.com> wrote:
> 
> It won't affect you if you're not actually running Hadoop. But it's
> mainly things like Snappy/LZO compression which are implemented as
> native libraries under the hood.

There's a lot more in those native libs, primarily to bypass gaps in the Java 
APIs (FS permissions) and to add new features (encryption, and soon 
erasure coding).

The Hadoop file:// FS uses it on Windows, at least for now.
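For local-file work on Windows the usual workaround is to give Hadoop a home directory containing bin\winutils.exe. A minimal sketch, assuming you have downloaded winutils.exe separately and the path C:\hadoop is illustrative, is to set the hadoop.home.dir system property before any Hadoop or Spark classes are loaded:

```java
// Sketch: point Hadoop at a directory whose bin\ subfolder holds
// winutils.exe, before Hadoop's Shell class is first touched.
// The path C:\hadoop is a placeholder, not a required location.
public class HadoopHomeSetup {
    public static void main(String[] args) {
        System.setProperty("hadoop.home.dir", "C:\\hadoop");
        // Print it back to confirm the property took effect.
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Setting the HADOOP_HOME environment variable before launching the JVM achieves the same thing without touching code.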

> Spark doesn't necessarily use these
> anyway; it's from the Hadoop libs.
> 
> On Tue, Aug 4, 2015 at 8:30 AM, Deepesh Maheshwari
> <deepesh.maheshwar...@gmail.com> wrote:
>> Can you elaborate on what this native library covers? One thing you
>> mentioned is accelerated compression.
>> 
>> It would be very helpful if you could share a useful link to read more
>> about it.
>> 
>> On Tue, Aug 4, 2015 at 12:56 PM, Sean Owen <so...@cloudera.com> wrote:
>>> 
>>> You can ignore it entirely. It just means you haven't installed and
>>> configured native libraries for things like accelerated compression,
>>> but it has no negative impact otherwise.
>>> 
>>> On Tue, Aug 4, 2015 at 8:11 AM, Deepesh Maheshwari
>>> <deepesh.maheshwar...@gmail.com> wrote:
>>>> Hi,
>>>> 
>>>> When I run Spark locally on Windows it gives the Hadoop library
>>>> warning below.
>>>> I am using the Spark version below.
>>>> 
>>>> <dependency>
>>>>     <groupId>org.apache.spark</groupId>
>>>>     <artifactId>spark-core_2.10</artifactId>
>>>>     <version>1.4.1</version>
>>>> </dependency>
>>>> 
>>>> 
>>>> 2015-08-04 12:22:23,463  WARN
>>>> (org.apache.hadoop.util.NativeCodeLoader:62) -
>>>> Unable to load native-hadoop library for your platform... using
>>>> builtin-java
>>>> classes where applicable
>>>> 
>>>> I tried to find it on the internet but was not able to find the exact
>>>> root cause.
>>>> Please let me know what it is, why it is giving this warning, and how
>>>> I can resolve it.
>>>> 
>>>> Thanks,
>>>> Deepesh
>> 
>> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 

