Hello!

I'm a newbie with Hadoop and I'm just learning to set it up for a class
project. I followed all the steps in the Yahoo! Hadoop tutorial, and I'm
running Hadoop on a virtual machine in VMware Player, with Windows 7 on the
host machine. I'm stuck trying to access HDFS from Eclipse! I've tried
everything recommended by people in other threads.

Whenever I try to connect to the VM from Eclipse, I get the following
error under the DFS Locations tab in the Eclipse Project Explorer:

Error: Call to /192.168.64.128:9000 failed on local exception:
java.io.EOFException
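
In case it helps to reproduce this outside Eclipse, this is the kind of
minimal client I imagine running against the VM. The class name and the
old-style fs.default.name key are just my guesses based on the 0.20-era
API the tutorial uses, so treat it as a sketch rather than exactly what
I have:

// HdfsPing.java - check whether the NameNode at 192.168.64.128:9000
// answers Hadoop RPC at all (same host/port the Eclipse plugin uses).
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPing {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the same URI the Eclipse plugin connects to.
        conf.set("fs.default.name", "hdfs://192.168.64.128:9000");
        FileSystem fs = FileSystem.get(conf);
        // Listing the root directory is enough to exercise the RPC path;
        // an EOFException here would mean the problem isn't the plugin.
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}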


I'm launching Eclipse from Cygwin. In the plugin's location settings I
changed the first entry in the hadoop.job.ugi field to "hadoop-user" and
set the mapred.system.dir field to "/hadoop/mapred/system". I also tried
"/tmp/hadoop-user/mapred/system" for that field, but it still gives the
same error when I try to reconnect. Other fields in the Advanced tab,
such as dfs.data.dir, have the value "tmp/hadoop-user/dfs/data".
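
In case it matters, I understand these Eclipse fields are supposed to
mirror the site configuration on the VM, so I think the relevant entries
there look roughly like the sketch below. The values are copied from what
I typed into the plugin, and putting them all in a single hadoop-site.xml
(rather than split core-site/mapred-site files) is just my assumption:

<configuration>
  <!-- HDFS master the Eclipse plugin connects to (host/port from the error). -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://192.168.64.128:9000</value>
  </property>
  <!-- MapReduce system directory I set in the plugin. -->
  <property>
    <name>mapred.system.dir</name>
    <value>/hadoop/mapred/system</value>
  </property>
  <!-- DataNode storage directory as shown in the Advanced tab. -->
  <property>
    <name>dfs.data.dir</name>
    <value>tmp/hadoop-user/dfs/data</value>
  </property>
</configuration>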

Can someone please help me out? I'm really stuck.

Thanks in advance!