2009/4/30 He Yongqiang <heyongqi...@software.ict.ac.cn>:
> put your .so file in every tracker's Hadoop-install/lib/native/Linux-xxx-xx/
>
> Or
>
> In your code,try to do
>
>  String oldPath = System.getProperty("java.library.path");
>  String newPath = (oldPath == null)
>      ? local_path_of_lib_file
>      : oldPath + File.pathSeparator + local_path_of_lib_file;
>  System.setProperty("java.library.path", newPath);
>  System.loadLibrary("XXX");
>
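One caveat on the snippet above: many JVMs read java.library.path only once
at startup, so a System.setProperty() call made later may have no effect on
loadLibrary(). A minimal workaround sketch, assuming the hypothetical local
path /tmp/libXXX.so, is to load the file by absolute path instead:

  // System.load() takes an absolute path and bypasses java.library.path,
  // so it works even though that property was read at JVM startup.
  // "/tmp/libXXX.so" is a hypothetical location; substitute your own.
  System.load(new java.io.File("/tmp/libXXX.so").getAbsolutePath());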


I have copied the .so and .a files to Hadoop-install/lib/native/Linux-xxx-xx/
and called System.loadLibrary("XXX"); in my code, but nothing happens.

Then I tried the second solution mentioned above, and the same problem
occurred (the .so files were already in the native directory).



> However, you also need to fetch the library to local through
> DistributedCache( like jason said) or putting and getting it from hdfs by
> yourself.
>
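For reference, a minimal sketch of that DistributedCache route, using the
org.apache.hadoop.filecache API of this era; the HDFS URI and library name
below are placeholders:

  import java.net.URI;
  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.filecache.DistributedCache;

  Configuration conf = new Configuration();
  // Ship the library from HDFS into each task's working directory;
  // the "#libXXX.so" fragment names the symlink created there.
  DistributedCache.addCacheFile(
      new URI("hdfs://namenode:9000/libs/libXXX.so#libXXX.so"), conf);
  DistributedCache.createSymlink(conf);

  // Then, inside the Mapper/Reducer, load it from the working directory:
  System.load(new java.io.File("libXXX.so").getAbsolutePath());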

Do I need to copy the libraries to the local machine, given that I run
Hadoop on a single node?

How can I do it, either by fetching it or by putting and getting it from HDFS?


> On 09-4-30 5:14 PM, "Ian jonhson" <jonhson....@gmail.com> wrote:
>
>> You mean that the current Hadoop does not support JNI calls, right?
>> Is there any solution for making such calls into C interfaces?
>>
>> 2009/4/30 He Yongqiang <heyongqi...@software.ict.ac.cn>:
>>> Does Hadoop now support JNI calls in Mappers or Reducers? If yes, how? If
>>> not, I think we should create a JIRA issue for supporting that.
>>>
>>>
