Hi,

Well, honestly, I cannot give you advice on that.

The native libraries are part of the Hadoop distribution.
I think you should ask this question on the Hadoop users list, since it is
a question regarding Hadoop itself.

Things "should" work without these native dependencies as well; however, I
can only ASSUME this. From my point of view it would be better if you simply
included the Linux natives in your product. I don't know the testing
procedure used for the Hadoop project, or how much effort is put into
testing the Java-only version of Hadoop on Linux. On the other hand, since
they provide a version with native bindings for Linux, I assume the
Java+native build is tested more thoroughly on Linux than the Java-only
version.
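For illustration, here is a minimal sketch (not Hadoop's actual source) of the try-native, fall-back-to-Java pattern that Hadoop's native bindings follow. The library name "hadoop" corresponds to libhadoop.so; the class and method names here are hypothetical:

```java
// Minimal sketch (NOT Hadoop's real code) of the pattern: attempt to
// load the native shared library, and fall back to the pure-Java code
// path if it is not available.
public class NativeFallback {

    private static final boolean NATIVE_LOADED;

    static {
        boolean loaded = false;
        try {
            // Looks for libhadoop.so (on Linux) along java.library.path.
            System.loadLibrary("hadoop");
            loaded = true;
        } catch (UnsatisfiedLinkError e) {
            // Native library absent: stay on the Java-only implementation.
        }
        NATIVE_LOADED = loaded;
    }

    public static boolean isNativeLoaded() {
        return NATIVE_LOADED;
    }

    public static void main(String[] args) {
        System.out.println(isNativeLoaded()
                ? "using native bindings"
                : "falling back to java-only");
    }
}
```

On a machine without libhadoop.so on the library path, this prints "falling back to java-only". That is the reason removing the natives "should" be safe: code written this way degrades to the Java implementation instead of failing outright.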

My two cents: try the hadoop-users mailing list.


Best Regards,

Martin


On Jan 18, 2008 10:15 PM, Krishnamohan Meduri <[EMAIL PROTECTED]>
wrote:

> Hi Martin,
>
> I installed Apache Nutch 0.9. Under the lib/native/Linux-i386-32/ directory, I
> see some Linux-specific shared objects.
>
> Is it completely harmless to remove the dependency on them and remove them
> eventually? The reason I am asking is that I am trying to use Apache
> Nutch for a product that is supported on Windows, Linux, Solaris, and HP-UX.
>
> By removing them, I can confidently say that it is all Java, with no
> native library dependencies. I am willing to sacrifice some optimizations
> done for Linux.
>
> I would greatly appreciate it if you could shed some light on this.
>
> thanks so much,
> -Krishna.
>
