On Sep 12, 2007, at 4:07 AM, Arun C Murthy wrote:
> Does Hadoop have a system or rules to prevent or help such
> situations? (e.g. a config system that prevents a 32-bit binary
> from running on a 64-bit node)
No. Not as of today.
Such a framework would make sense for static clusters. We will likely
move more in the direction of using Torque or Condor to dynamically
allocate a cluster with the necessary hardware/OS for a given
application.
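
In the meantime, a submitter could guard against a mismatch by
hand. Here is a minimal sketch of such a pre-launch check; the
script is illustrative, not part of Hadoop, and assumes the task
binary is a native ELF executable with file(1) available on the
node (whether a 32-bit binary actually fails on a 64-bit node
depends on which compatibility libraries are installed there):

  #!/bin/sh
  # Illustrative guard, not part of Hadoop: compare the ELF class
  # reported by file(1) against the node's architecture and refuse
  # to exec a binary that does not match.
  binary="$1"
  shift
  case "`uname -m`" in
    x86_64|ia64) want="64-bit" ;;
    *)           want="32-bit" ;;
  esac
  if file "$binary" | grep -q "$want"; then
    exec "$binary" "$@"   # architecture matches, run the task
  else
    echo "$binary is not a $want executable, refusing to run" >&2
    exit 1
  fi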
> Also, when compiling C++ map-reduce source code with the given
> build.xml in version 0.14.1, does it compile in 32-bit mode?
I believe the autoconf file doesn't specify, so the build defaults
to whatever the platform's compiler produces. It would make sense
to extend the build.xml to include a variable that gets passed
through to the configure scripts when building the C++ code.
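
Whatever form that hook takes, it would boil down to handing
configure an explicit word-size flag. A rough sketch of the
underlying invocation, assuming an autoconf-generated configure
script that honors the standard CFLAGS/CXXFLAGS/LDFLAGS variables
(the -m32/-m64 flags are gcc-specific, and COMPILE_MODE is a
made-up name for illustration):

  # Hypothetical: force a 32-bit build regardless of the platform
  # default by passing gcc's -m32 through to configure. Use -m64
  # to force a 64-bit build instead.
  COMPILE_MODE=-m32
  ./configure CFLAGS="$COMPILE_MODE" CXXFLAGS="$COMPILE_MODE" \
              LDFLAGS="$COMPILE_MODE"

An Ant property could then default that value per platform and
hand it to the exec task that runs configure.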
-- Owen