Daniel Blaisdell wrote:
I ran into some similar issues with firewalls and ended up completely
turning them off. That took care of some of the problems but allowed me to
figure out that if DNS / HOST files aren't configured correctly, weird
things will happen during the communication between daemons. I have a small
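To illustrate the DNS / hosts point above: every node should resolve every other node's hostname to the same reachable address. A minimal sketch of an /etc/hosts file for a two-node cluster (the names and addresses are hypothetical, substitute your own):

```
# /etc/hosts -- same entries on every node (hypothetical addresses/names)
192.168.0.1  node-master
192.168.0.2  node-slave1
# note: avoid mapping a node's own hostname to 127.0.0.1 in this file,
# or its daemons may bind to loopback and be unreachable from other nodes
```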
Thank you. I first tried the put from the master machine, which leads to
the error. The put from the slave machine works. Guess you're right about
the configuration parameters. It appears a bit strange to me, because the
firewall settings and the hadoop-site.xml on both machines are identical.
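The hadoop-site.xml being discussed typically pins the filesystem and job-tracker endpoints. A minimal sketch, assuming a hypothetical master named node-master (the hostname must resolve to the same non-loopback address from every node, or clients and slaves will fail to connect even with identical config files):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- hypothetical hostname and ports; must resolve from every node -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://node-master:54310</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>node-master:54311</value>
  </property>
</configuration>
```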
Got a similar error when doing a mapreduce job on the master machine.
The mapping job is ok and in the end the right results are in my
output folder, but the reduce hangs at 17% for a very long time. I found
this in one of the task logs a few times:
...
2008-06-18 17:31:02,297 INFO
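A reduce stalling at a low percentage like this is consistent with the hostname-resolution problem described earlier in the thread: the reducers cannot fetch map outputs if node names don't resolve consistently. A small sketch to check resolution from each node (node names are hypothetical):

```python
import socket

def resolve_all(names):
    """Map each hostname to its IPv4 address, or None if it cannot be resolved."""
    results = {}
    for name in names:
        try:
            results[name] = socket.gethostbyname(name)
        except socket.gaierror:
            results[name] = None
    return results

# hypothetical node names; replace with your cluster's hostnames
for name, addr in resolve_all(["node-master", "node-slave1"]).items():
    print(f"{name}: {addr or 'cannot be resolved -- check DNS / /etc/hosts'}")
```

Running this on every node should print the same address for each name; a None on any node points at a DNS / hosts mismatch.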
Hi,
I'm new to Hadoop and I'm just testing it at the moment.
I set up a cluster with 2 nodes and it seems like they are running
normally;
the log files of the namenode and the datanodes don't show errors.
The firewall should be set up right.
But when I try to upload a file to the DFS I get the following
Looks like the client machine from which you call -put cannot connect to the
data-nodes.
It could be the firewall or wrong configuration parameters that you use for
the client.
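One way to tell the two causes apart is to probe the data-node ports directly from the client machine. A minimal sketch; the hostname is hypothetical, and the ports (50010 for the data-node transfer service, 50070 for the name-node web UI in Hadoop of that era) should be adjusted to whatever your hadoop-site.xml actually configures:

```python
import socket

def check_ports(host, ports, timeout=2.0):
    """Return {port: bool} -- True if a TCP connection to host:port succeeds."""
    reachable = {}
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                reachable[port] = True
        except OSError:  # refused, timed out, or unresolvable host
            reachable[port] = False
    return reachable

# hypothetical master hostname and era-default ports
print(check_ports("node-master", [50010, 50070]))
```

If the ports are unreachable from the client but reachable from the nodes themselves, a firewall is the likely culprit; if they are unreachable everywhere, the configuration is pointing at the wrong host or port.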
Alexander Arimond wrote:
hi,
i'm new in hadoop and im just testing it at the moment.
i set up a cluster with 2 nodes
--
View this message in context:
http://www.nabble.com/dfs-put-fails-tp17799906p17799906.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.