On Jun 12, 2008, at 6:47 AM, montag wrote:
Hi,
I'm a new Hadoop user, so if this question is blatantly obvious, I
apologize. I'm trying to load a native shared library using the
DistributedCache as outlined in HADOOP-1660
(https://issues.apache.org/jira/browse/HADOOP-1660):
The DistributedCache will use the 'fragment' of the URI as the name
of the symlink:

hdfs://namenode:port/lib.so.1#lib.so

Thus in the above case you will find:

lib.so -> lib.so.1

Then in your main:

DistributedCache.addCacheFile(new URI("hdfs://namenode:port/lib.so.1#lib.so"), conf);
DistributedCache.createSymlink(conf);
In the map/reduce task:
System.loadLibrary("lib.so");
Hope that helps...
Arun
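Put together, the steps Arun describes amount to something like the
following sketch, using the old (0.17-era) mapred API. The class
names, HDFS path, and job setup here are placeholders, not anything
from the JIRA; System.load() with an absolute path is used on the
task side because System.loadLibrary("foo") would instead search
java.library.path for "libfoo.so".

import java.io.File;
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class NativeLibJob {

  public static class NativeMapper extends MapReduceBase
      implements Mapper<LongWritable, Text, Text, Text> {

    public void configure(JobConf job) {
      // createSymlink() puts a "lib.so" symlink in the task's
      // working directory, so an absolute path to it is loadable.
      System.load(new File("lib.so").getAbsolutePath());
    }

    public void map(LongWritable key, Text value,
        OutputCollector<Text, Text> output, Reporter reporter)
        throws IOException {
      // ... call into the native code here ...
    }
  }

  public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf(NativeLibJob.class);
    conf.setMapperClass(NativeMapper.class);
    // The URI fragment after '#' names the symlink the task will
    // see; the HDFS path itself is a placeholder.
    DistributedCache.addCacheFile(
        new URI("hdfs://namenode:9000/libs/lib.so.1#lib.so"), conf);
    DistributedCache.createSymlink(conf);
    // ... set input/output paths and submit the job ...
  }
}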
However, when I call System.load(), I continually get an
"UnsatisfiedLinkError: can't load library" error. I've checked the
java.library.path and LD_LIBRARY_PATH variables, and all seems to be
in order. I've also tried using System.loadLibrary(), but that call
doesn't even appear to find the library.
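(As a debugging aid, not part of the JIRA recipe: a helper like the
following, called from configure() before System.load(), shows what
the task actually sees at load time.)

import java.io.File;

public class CacheDebug {
  // Dump the JVM's native search path and the task's working
  // directory contents to the task logs, to verify that the
  // DistributedCache symlink actually exists where expected.
  public static void dump() {
    System.err.println("java.library.path = "
        + System.getProperty("java.library.path"));
    File[] entries = new File(".").listFiles();
    if (entries != null) {
      for (File f : entries) {
        System.err.println("cwd entry: " + f.getName());
      }
    }
  }
}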
I have a feeling that I'm not properly creating the symlinks to the
library within the DistributedCache. Could someone who has
successfully loaded a native library using this functionality provide
a code snippet of how this is done? Currently my code for loading the
library looks like this:
DistributedCache.addCacheFile(libPath.toUri(), conf);
DistributedCache.setLocalFiles(conf, "lib.so"); // I'm not really sure if this is necessary
DistributedCache.createSymlink(conf);
and then within the M/R classes:
Path[] path = DistributedCache.getLocalCacheFiles(conf);
System.load(new File("lib.so").getAbsolutePath()); // I've also tried
// System.load(path[0].toString()), but that didn't work either.
Is this incorrect? Any help would be greatly appreciated.
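For comparison with the recipe quoted above, a sketch of the minimal
change that would match it, shown as fragments: libPath and conf are
the same as in the snippet above, and the setLocalFiles() call is
dropped, since that property appears to be set by the framework
itself.

import java.io.File;
import java.net.URI;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;

// Driver side: put the "#lib.so" fragment on the URI itself, so
// createSymlink() has a name to link to in the task directory.
DistributedCache.addCacheFile(
    new URI(libPath.toUri().toString() + "#lib.so"), conf);
DistributedCache.createSymlink(conf);

// Task side (e.g. in configure()): the localized cache path is
// already an absolute local filename.
Path[] cached = DistributedCache.getLocalCacheFiles(conf);
System.load(cached[0].toString());
// ...or, with the symlink in place:
// System.load(new File("lib.so").getAbsolutePath());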
Thanks,
Mike