Hi Nairan,

HDFS doesn't normally run on top of Mesos, and we generally expect people to 
have only one instance of HDFS, which multiple instances of MapReduce (or other 
frameworks) would share. If you want two instances of HDFS, you need to set 
them up manually and configure them to use different ports and storage 
directories. Here are the Hadoop settings you need to change:

fs.default.name  (NameNode URI, including its port; set in core-site.xml)
dfs.http.address  (web UI of NameNode)
dfs.datanode.address  (data transfer port of DataNode)
dfs.datanode.ipc.address  (IPC port of DataNode)
dfs.datanode.http.address  (web UI of DataNode)
dfs.secondary.http.address  (web UI of SecondaryNameNode)
dfs.name.dir  (local directory for NameNode metadata)
dfs.data.dir  (local directories for DataNode block storage)
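
For example, the second instance's core-site.xml / hdfs-site.xml might look 
something like this (the host name, port numbers and paths are purely 
illustrative; any free ports and separate local directories will do):

  <!-- core-site.xml -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://your-namenode-host:9010</value>
  </property>

  <!-- hdfs-site.xml -->
  <property>
    <name>dfs.http.address</name>
    <value>0.0.0.0:50170</value>
  </property>
  <property>
    <name>dfs.datanode.address</name>
    <value>0.0.0.0:50110</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/mnt/hdfs2/name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/mnt/hdfs2/data</value>
  </property>

and similarly non-default ports for the remaining dfs.*.address settings in 
the list above.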

We actually do this in our EC2 scripts 
(https://github.com/mesos/mesos/wiki/EC2-Scripts), which will launch a Mesos 
cluster with both a "persistent" and an "ephemeral" HDFS for you. You might 
take a look at how that gets configured.
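
Once you have a second configuration directory along those lines (call it 
/path/to/conf2, just as a placeholder), starting the extra daemons by hand 
is roughly:

  bin/hadoop namenode -format   (run once, with HADOOP_CONF_DIR=/path/to/conf2)
  bin/hadoop-daemon.sh --config /path/to/conf2 start namenode
  bin/hadoop-daemon.sh --config /path/to/conf2 start datanode

That way the second DataNode binds to its own ports and directories and no 
longer conflicts with the first one.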

Matei

On May 6, 2012, at 7:04 PM, Nairan Zhang wrote:

> Hi,
> 
> It seems HDFS disallows running a second DataNode on one machine. Is this a
> common problem? Can anybody please help me out a little bit? Thanks,
> 
> Nairan
