Hello all,

Has anyone been in a situation where the hardware in a Hadoop cluster
is heterogeneous while running non-Java binaries on Hadoop?
E.g., a C++ MapReduce binary was compiled on a 32-bit CPU, but the
Hadoop cluster consists of both 32-bit and 64-bit machines.
My guess is that the result in this situation would be a big mess.
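
To make the concern concrete, a tiny check like this (my own sketch,
not anything from Hadoop) run inside a task shows how the binary was
compiled:

    #include <iostream>

    int main() {
        // sizeof(void*) is 4 in a 32-bit build and 8 in a 64-bit build.
        // Note it reports how the binary was compiled, not what the
        // CPU is -- a 32-bit binary still prints 4 on a 64-bit node.
        std::cout << "pointer size: " << sizeof(void*) << " bytes"
                  << std::endl;
        return 0;
    }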

Does Hadoop have a mechanism or set of rules to prevent or help with
such situations? (e.g., a configuration option that prevents a 32-bit
binary from running on a 64-bit node)

Also, when compiling C++ MapReduce source code with the build.xml
provided in version 0.14.1, does it compile in 32-bit mode?
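
In case a compile-time guard helps while figuring this out: with GCC
on x86 one can assert the target architecture in the source itself.
The macro below is GCC-specific, so treat this as an assumption on my
part:

    // Fails compilation unless building in 32-bit x86 mode (GCC).
    #if !defined(__i386__)
    #error "expected a 32-bit x86 build"
    #endif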

Any comments or ideas would be appreciated. Thank you in advance.

Regards,
Taeho

-- 
Taeho Kang [tkang.blogspot.com]
