Seems like your Pig is not finding the Hadoop configuration files on
its classpath when it starts up.
Assuming that you have installed Hadoop 0.20.0 somewhere on your local
file system, say <hadoop-20 installation directory>, please add the
following to your classpath:
export HADOOPDIR=<hadoop installation directory>/conf
export PIG_CLASSPATH=$PIG_HOME/pig.jar:$HADOOPDIR
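For example, if Hadoop were unpacked under /usr/local/hadoop-0.20.1 and Pig
under /usr/local/pig-0.5.0 (those paths are only placeholders, substitute
your own), the whole setup would look something like:
export PIG_HOME=/usr/local/pig-0.5.0
export HADOOPDIR=/usr/local/hadoop-0.20.1/conf
export PIG_CLASSPATH=$PIG_HOME/pig.jar:$HADOOPDIR
pig    # should now pick up the cluster settings from $HADOOPDIR instead of running locally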
Thanks, that seems to have worked. It never occurred to me that the
configuration directory should go in the CLASSPATH.
This should solve your problem if all your configuration has been done
as per the instructions in the Pig wiki.
However, I am a bit confused about the configuration files. I can't find
anything about them in the Pig wiki (I'm having trouble navigating it,
really). What documentation there is always mentions $PIG_HOME/conf,
which doesn't seem to exist in Pig 0.5.0. Neither does
$HADOOP_HOME/conf/hadoop-site.xml, which the Hadoop wiki specifically
mentions, seem to exist in Hadoop 0.20.1.
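It looks like in 0.20.x the old hadoop-site.xml was split into
core-site.xml, hdfs-site.xml and mapred-site.xml under $HADOOP_HOME/conf,
so maybe the wikis are just out of date. If so, something like this (my
guess, not anything I found documented) should show whether the cluster
addresses are actually set:
grep -A 1 fs.default.name $HADOOP_HOME/conf/core-site.xml      # namenode URI
grep -A 1 mapred.job.tracker $HADOOP_HOME/conf/mapred-site.xml # jobtracker address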
Anyway, I'm getting this error from pig:
2009-11-24 07:11:46,585 [main] ERROR
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
- Failed to produce result in:
"hdfs://tuson118:9100/tmp/temp2106131810/tmp-1474283808"
This makes sense, because there is no /tmp in my HDFS mount. Everything
starts from /data/pig/dfs, even inside Hadoop. I assume this is some
sort of configuration issue (shouldn't HDFS internally start from
'/'?). The documentation all seems to refer to earlier versions of Hadoop.
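In case the problem is simply that /tmp doesn't exist in HDFS yet, I
suppose I could create it by hand before rerunning the script (this is a
guess on my part, not something from the docs):
hadoop fs -ls /            # see what the HDFS root actually contains
hadoop fs -mkdir /tmp      # create the directory Pig is writing its temp files under
hadoop fs -chmod 777 /tmp  # make it writable by whoever runs the job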
Jim