for now i simply rebuilt spark 0.8 with the snappy dependency changed, and
this fixed it.

but in general, i would say, it should be possible for a user to include
jars in a job that "override" the jars bundled with spark. however this
does not seem to work.
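one way to confirm which copy of snappy-java actually wins on the classpath is to ask the JVM where it loaded the class from. this is just a generic sketch, not anything spark-specific: run something like it from inside the job, and the jar URL it prints tells you whether your jar or spark's bundled one was picked up (org.xerial.snappy.Snappy is the class snappy-java ships).

```java
import java.security.CodeSource;

public class WhichJar {
    // Report where a class was loaded from: a jar/directory URL, the
    // bootstrap classpath (null code source), or a marker if it is absent.
    static String sourceOf(String className) {
        try {
            Class<?> cls = Class.forName(className);
            CodeSource src = cls.getProtectionDomain().getCodeSource();
            return src == null ? "bootstrap classpath"
                               : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // whichever jar this resolves to is the copy that actually wins
        System.out.println(sourceOf("org.xerial.snappy.Snappy"));
    }
}
```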


On Fri, Oct 18, 2013 at 7:39 PM, Koert Kuipers <[email protected]> wrote:

> the issue with snappy is supposedly fixed in snappy-java version 1.1.0-M4.
> so i tried to include that with my job, but no luck. i still get an error
> related to "snappy-1.0.5-libsnappyjava.so", which to me seems to indicate
> that the snappy included with spark is on the classpath ahead of the one i
> included with my job. is this the case? how do i put my jars first on the
> spark classpath? this should generally be possible as people can include
> newer versions of jars...
>
>
> On Fri, Oct 18, 2013 at 7:30 PM, Koert Kuipers <[email protected]> wrote:
>
>> woops wrong mailing list
>>
>> ---------- Forwarded message ----------
>> From: Koert Kuipers <[email protected]>
>> Date: Fri, Oct 18, 2013 at 7:29 PM
>> Subject: snappy
>> To: [email protected]
>>
>>
>> the snappy bundled with spark 0.8 is causing trouble on CentOS 5:
>>
>>  java.lang.UnsatisfiedLinkError: /tmp/snappy-1.0.5-libsnappyjava.so:
>> /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.9' not found (required by
>> /tmp/snappy-1.0.5-libsnappyjava.so)
>>
>>
>>
>
