On Wed, Sep 12, 2007 at 07:03:44PM +0900, Taeho Kang wrote:
>Hello all,
>
>Has anyone been in a situation where the hardware in a Hadoop cluster is
>heterogeneous when running non-Java binaries on Hadoop?
>e.g. a C++ map-reduce binary was compiled on a 32-bit CPU, but the Hadoop
>cluster consists of both 32- and 64-bit machines.
>My guess is that the result in this situation will be a big mess.
>

Depends. Does your cluster run the same OS on every node?
If so, I don't see a problem running a 32-bit Linux executable on a 64-bit
machine, but not vice versa.

>Does Hadoop have a system or rules to prevent or help such situations?
>(e.g. a config system that prevents a 32-bit binary from running on a 64-bit
>node)
>

No. Not as of today.

>Also, when compiling C++ map-reduce source code with the given build.xml in
>version 0.14.1, does it compile in 32-bit mode?
>

Not sure; I don't see it set anywhere. I'll let Owen comment on this one...

Arun

>Any comments and ideas will be appreciated. Thank you in advance.
>
>Regards,
>Taeho
>
>-- 
>Taeho Kang [tkang.blogspot.com]
