Doug Cutting wrote:
Sylvain Wallez wrote:
The only reason I can see for the change you suggest (restricting
daemons to listen on only a single interface) is security: you'd like
these daemons not to be visible on the public address. Is that
indeed your concern?
Exactly!
This sounds
Doug Cutting wrote:
Sylvain Wallez wrote:
Philippe's question is related to machines with multiple interfaces
(e.g. one public-facing interface and another one for a private
network). We'd like to bind Hadoop's sockets to the private interface,
so that only machines on the private network can reach them.
Sylvain Wallez wrote:
I don't know Hadoop's internals well, but it seems to me that an
additional configuration could do the trick, e.g.
String itfAddr = conf.get("ipc.server.listen.address");
InetSocketAddress address = (itfAddr == null)
    ? new InetSocketAddress(port)
    : new InetSocketAddress(itfAddr, port);
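Sylvain's sketch could be fleshed out as below. To keep the example self-contained, java.util.Properties stands in for Hadoop's Configuration class (an assumption for this demo), and the key name ipc.server.listen.address is his proposal, not an existing Hadoop option:

```java
import java.net.InetSocketAddress;
import java.util.Properties;

public class ListenAddressSketch {
    // Properties stands in for Hadoop's Configuration here, and the
    // key "ipc.server.listen.address" is Sylvain's proposed option,
    // not one Hadoop actually defines.
    static InetSocketAddress listenAddress(Properties conf, int port) {
        String itfAddr = conf.getProperty("ipc.server.listen.address");
        return (itfAddr == null)
                ? new InetSocketAddress(port)           // wildcard: all interfaces
                : new InetSocketAddress(itfAddr, port); // one interface only
    }

    public static void main(String[] args) {
        Properties conf = new Properties();
        // No option set: bind to the wildcard address.
        System.out.println(listenAddress(conf, 9000));
        // Option set: bind only to the configured interface.
        conf.setProperty("ipc.server.listen.address", "127.0.0.1");
        System.out.println(listenAddress(conf, 9000));
    }
}
```

With no value configured the behavior stays exactly as today (bind to all interfaces), so the option would be backward-compatible.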
Check out this URL:
http://lucene.apache.org/hadoop/docs/api/overview-summary.html
--Jugs
-Original Message-
From: Philippe Gassmann [mailto:[EMAIL PROTECTED]
Sent: Friday, September 08, 2006 2:37 PM
To: hadoop-user@lucene.apache.org
Subject: Executing hadoop binded on localhost
Hi, is
Jagadeesh wrote:
Check out this URL:
http://lucene.apache.org/hadoop/docs/api/overview-summary.html
That does not solve my issue: when you specify localhost in
hadoop-site.xml, Hadoop does not bind to localhost but to 0.0.0.0.
In the source tree I can see: new ServerSocket(port), which binds to
all interfaces.
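The behavior Philippe describes can be demonstrated directly: the one-argument ServerSocket constructor binds to the wildcard address, while the three-argument constructor accepts an explicit local address. A minimal sketch:

```java
import java.net.InetAddress;
import java.net.ServerSocket;

public class BindDemo {
    public static void main(String[] args) throws Exception {
        // new ServerSocket(port) binds to the wildcard address (0.0.0.0),
        // i.e. the daemon is reachable on every interface of the machine.
        // Port 0 asks the OS for any free port, just for this demo.
        ServerSocket any = new ServerSocket(0);
        System.out.println("wildcard bind: " + any.getInetAddress());
        any.close();

        // The three-argument constructor takes an explicit local address,
        // so the socket listens on that one interface only.
        ServerSocket loopbackOnly = new ServerSocket(0, 50,
                InetAddress.getByName("127.0.0.1"));
        System.out.println("loopback bind: " + loopbackOnly.getInetAddress());
        loopbackOnly.close();
    }
}
```

So fixing this inside Hadoop would mean switching the server code to the three-argument constructor (or binding an InetSocketAddress) wherever the listen address is configurable.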
Doug Cutting wrote:
Perhaps you need to add an entry for 'localhost' in the hosts file on
your machine? My linux /etc/hosts has an entry like:
127.0.0.1 localhost
Alternately you could specify '127.0.0.1' as the host instead of
'localhost'. That should work, since 127.0.0.1 always refers to the
loopback interface.
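Doug's distinction is easy to check from Java: the name 'localhost' goes through the resolver (and hence the hosts file), whereas the literal address needs no lookup. A small sketch:

```java
import java.net.InetAddress;

public class LoopbackCheck {
    public static void main(String[] args) throws Exception {
        // "localhost" is resolved by the name service, which on most
        // systems consults /etc/hosts first -- so a missing entry
        // there can break it.
        InetAddress byName = InetAddress.getByName("localhost");
        System.out.println("localhost resolves to: " + byName.getHostAddress());

        // The literal 127.0.0.1 needs no lookup at all and always
        // denotes the loopback interface.
        InetAddress literal = InetAddress.getByName("127.0.0.1");
        System.out.println("127.0.0.1 is loopback: " + literal.isLoopbackAddress());
    }
}
```

If the first line prints something other than a loopback address, the hosts file is the place to look.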