Hi Arun,
Thanks for your reply! Yes, I'm trying to load a JNI-based library. I
tried what you suggested, but I'm still receiving an
UnsatisfiedLinkError. I noticed that if I use
System.loadLibrary("lib.so"), I get the following error:
java.lang.UnsatisfiedLinkError: no lib.so in java.library.path
But if I use System.load(new File("lib.so").getAbsolutePath()), I get:
java.lang.UnsatisfiedLinkError: Can't load library: /local/path/to/lib.so
So the symlink appears to be working, since getAbsolutePath() resolves
to the local task directory, but the library still isn't loading.
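For what it's worth, I'm verifying that with a quick check along these
lines inside the task (plain java.io.File calls; "lib.so" is the
fragment name from the cache URI):

File link = new File("lib.so");
System.out.println("symlink path:   " + link.getAbsolutePath());
System.out.println("symlink exists: " + link.exists());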
Am I still adding it to the cache incorrectly? Should I point
DistributedCache.setLocalFiles() at the localized directory or at the
directory in HDFS?
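In case it helps, here's the distilled version of what I'm running now
(the HDFS path below is a placeholder for my real one):

// Job setup:
DistributedCache.addCacheFile(
    new URI("hdfs://namenode:port/path/lib.so.1#lib.so"), conf);
DistributedCache.createSymlink(conf);

// In the map/reduce task:
System.load(new File("lib.so").getAbsolutePath());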
Thanks!
Mike
Arun C Murthy wrote:
>
>
> I assume you are trying to load a JNI-based library (since you refer
> to System.load/System.loadLibrary) ...
>
> I've opened https://issues.apache.org/jira/browse/HADOOP-3547 to fix
> the documentation with better examples.
>
> Arun
>
> On Jun 12, 2008, at 9:01 AM, Arun C Murthy wrote:
>
>>
>> On Jun 12, 2008, at 6:47 AM, montag wrote:
>>
>>>
>>> Hi,
>>>
>>> I'm a new Hadoop user, so if this question is blatantly obvious, I
>>> apologize. I'm trying to load a native shared library using the
>>> DistributedCache as outlined in
>>> https://issues.apache.org/jira/browse/HADOOP-1660 .
>>
>> The DistributedCache will use the 'fragment' of the URI as the name
>> of the symlink:
>> hdfs://namenode:port/lib.so.1#lib.so
>>
>> Thus in the above case you will find:
>> lib.so -> lib.so.1
>>
>> Then in your main:
>> DistributedCache.addCacheFile(
>>     new URI("hdfs://namenode:port/lib.so.1#lib.so"), conf);
>> DistributedCache.createSymlink(conf);
>>
>> In the map/reduce task:
>> System.loadLibrary("lib.so");
>>
>> Hope that helps...
>>
>> Arun
>>
>>> However, when I call System.load(), I continually get an
>>> "UnsatisfiedLinkError: Can't load library" error. I've checked the
>>> java.library.path and LD_LIBRARY_PATH variables, and everything
>>> seems to be in order. I've also tried System.loadLibrary(), but
>>> that call doesn't even appear to find the library.
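>>>
>>> (Concretely, I'm printing both values from inside the task with the
>>> standard Java lookups:)
>>>
>>> System.out.println(System.getProperty("java.library.path"));
>>> System.out.println(System.getenv("LD_LIBRARY_PATH"));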
>>>
>>> I have a feeling that I'm not properly creating the symlink to the
>>> library within the DistributedCache. Could someone who has
>>> successfully loaded a native library this way provide a code
>>> snippet of how it's done? Currently my code for loading the library
>>> looks like this:
>>>
>>> DistributedCache.addCacheFile(libPath.toUri(), conf);
>>> // I'm not really sure if this is necessary:
>>> DistributedCache.setLocalFiles(conf, "lib.so");
>>> DistributedCache.createSymlink(conf);
>>>
>>> and then within the M/R classes:
>>>
>>> Path[] path = DistributedCache.getLocalCacheFiles(conf);
>>>
>>> // I've also tried System.load(path[0].toString()), but that
>>> // didn't work either.
>>> System.load(new File("lib.so").getAbsolutePath());
>>>
>>> Is this incorrect? Any help would be greatly appreciated.
>>>
>>> Thanks,
>>> Mike
>>> --
>>> View this message in context:
>>> http://www.nabble.com/Issue-loading-a-native-library-through-the-DistributedCache-tp17800388p17800388.html
>>> Sent from the Hadoop core-user mailing list archive at Nabble.com.
>>>
>>
>
>
>
--
View this message in context:
http://www.nabble.com/Issue-loading-a-native-library-through-the-DistributedCache-tp17800388p17804785.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.