Hi, thanks for your reply. I tried what you suggested but
unfortunately it did not work. Also, my HADOOP_CONF_DIR is not set.
I'm just using the default conf directory.
I've been playing around with this, and it seems the reason it works
ok in Eclipse is that it's referencing a hadoop-core.jar I built a while
ago. If I take that out and give it the hadoop project as a reference
instead (i.e. the same thing I am putting on the classpath from the
command line), I get the same exception. I don't think I have changed
any files in the project, and I can run jobs fine using bin/hadoop jar.
I'm rather confused. Any ideas?
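In case it helps narrow things down, my next step is to print what the
GUI's JVM can actually see at runtime. This is just a rough diagnostic
sketch of my own; I'm assuming the default "file:" mapping lives in
hadoop-default.xml under a property called fs.file.impl, so correct me
if that's wrong:

import org.apache.hadoop.conf.Configuration;

public class ConfCheck {
    public static void main(String[] args) {
        // What classpath did the GUI actually end up with?
        System.out.println("java.class.path = "
                + System.getProperty("java.class.path"));
        // Is hadoop-default.xml visible on that classpath at all?
        System.out.println("hadoop-default.xml -> "
                + ConfCheck.class.getClassLoader().getResource("hadoop-default.xml"));
        // If this prints null, nothing is registered for the "file:" scheme,
        // which would explain the exception further down in the thread.
        Configuration conf = new Configuration();
        System.out.println("fs.file.impl = " + conf.get("fs.file.impl"));
    }
}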
Cheers,
Ollie
Quoting Enis Soztutar <[EMAIL PROTECTED]>:
Hi,
You should include the files under the conf directory. Since they are
not included in the classpath, hadoop cannot find the implementation
class for the "file:" FileSystem. Check whether your HADOOP_CONF_DIR is
set correctly.
Another reason for this exception can be that your HADOOP_CONF_DIR is
pointing to some old hadoop version which does not contain newer conf
options like the FileSystem implementations.
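As a quick sanity check (I am assuming the property name here, so
please verify it against the hadoop-default.xml that ships with your
version), the mapping is just a configuration entry, and you can force
it by hand before calling FileSystem.get() to confirm that this is the
missing piece:

// imports: org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.FileSystem
// Test-only workaround: register the local ("file:") FileSystem
// implementation explicitly instead of relying on hadoop-default.xml
// being found on the classpath.
Configuration conf = new Configuration();
conf.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
FileSystem fs = FileSystem.get(conf); // should no longer throw for "file:"

If that makes the exception go away, the conf files are simply not on
your runtime classpath.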
[EMAIL PROTECTED] wrote:
Hi,
I've got a GUI-based program that I'm working on, and I'm trying to
add some functionality so that it runs a map/reduce job on hadoop. For
the moment I am assuming that anyone running the program will be
running it on a machine with a hadoop system already running, although
later I would like them to be able to point it at a different machine
on the network.
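(When I get to that, my working assumption is that pointing at another
machine is just a matter of overriding the namenode and jobtracker
addresses on the configuration, roughly like the sketch below; the host
name and ports are made up for illustration, so tell me if this is the
wrong way to go about it.)

// Hypothetical later step: point the job at a remote hadoop instance.
// "remotehost" and the ports are placeholders, not real settings.
conf.set("fs.default.name", "remotehost:9000");    // namenode address
conf.set("mapred.job.tracker", "remotehost:9001"); // jobtracker address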
So, everything works fine when I run the GUI from Eclipse (I also
have a hadoop project in there, BTW). However, when I run the GUI
from the command line I get the following exception:
java.io.IOException: No FileSystem for scheme: file
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:157)
at org.apache.hadoop.fs.FileSystem.getNamed(FileSystem.java:119)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:91)
at pipe.dataLayer.calculations.RTAMapRed.launch(RTAMapRed.java:293)
and so on......
The bit of code that causes it is (the offending line is marked ****):
conf = new Configuration();
jobConf = new JobConf(conf, RTAMapRed.class);
jobConf.setJobName("rta");
// turn off speculative execution, because DFS doesn't handle
// multiple writers to the same file.
jobConf.setSpeculativeExecution(false);
jobConf.setInputFormat(EulerSequenceFileInputFormat.class);
jobConf.setOutputKeyClass(ComplexWritable.class);
jobConf.setOutputValueClass(DoubleWritable.class);
jobConf.setOutputFormat(SequenceFileOutputFormat.class);
jobConf.setMapperClass(RTAMapper.class);
jobConf.setReducerClass(RTAReducer.class);
tmpDir = new Path("rtamapred");
inDir = new Path(tmpDir, "in");
outDir = new Path(tmpDir, "out");
FileSystem fileSys = FileSystem.get(jobConf); //****This line here!!
fileSys.delete(tmpDir);
if (!fileSys.mkdirs(inDir)) {
    throw new IOException("Mkdirs failed to create " + inDir.toString());
}
I'm running the GUI with the following script to include my hadoop
installation (it's a developer one) on the classpath. I was wondering
if I've left anything obvious off it?
CLASSPATH="${HADOOP_CONF_DIR}"
CLASSPATH=${CLASSPATH}:bin
CLASSPATH=${CLASSPATH}:$JAVA_HOME/lib/tools.jar
CLASSPATH=${CLASSPATH}:/home/ollie/workspace/hadoop/build/classes
CLASSPATH=${CLASSPATH}:/home/ollie/workspace/hadoop/build
CLASSPATH=${CLASSPATH}:$HADOOP_INSTALL/build/test/classes
for f in /home/ollie/workspace/hadoop/lib/*.jar; do
  CLASSPATH=${CLASSPATH}:$f;
done
for f in $HADOOP_INSTALL/lib/jetty-ext/*.jar; do
  CLASSPATH=${CLASSPATH}:$f;
done
HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.log.dir=$HADOOP_LOG_DIR"
HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.log.file=$HADOOP_LOGFILE"
HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.home.dir=$HADOOP_HOME"
HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.id.str=$HADOOP_IDENT_STRING"
HADOOP_OPTS="$HADOOP_OPTS
-Dhadoop.root.logger=${HADOOP_ROOT_LOGGER:-INFO,console}"
if [ "x$JAVA_LIBRARY_PATH" != "x" ]; then
HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$JAVA_LIBRARY_PATH"
fi
exec java $HADOOP_OPTS -classpath "$CLASSPATH" RunGui "$@"
(I've tried it with and without $HADOOP_OPTS set.)
Thanks very much in advance for any help offered and apologies for
the information overload!
Cheers,
Ollie