OK, I am using a MacBook.
After removing META-INF/LICENSE I was able to run it, but got the following
errors.

So does Mahout not work with Hadoop 2.0? Or is there a quick workaround
for this?

Puneet:mahout pjaisw1$ ./bin/mahout
org.apache.mahout.clustering.syntheticcontrol.kmeans.Job
Running on hadoop, using /app/hadoop/bin/hadoop and HADOOP_CONF_DIR=
MAHOUT-JOB:
/opt/openSrc/mahout/examples/target/mahout-examples-0.8-SNAPSHOT-job.jar
12/12/20 00:18:31 WARN driver.MahoutDriver: No
org.apache.mahout.clustering.syntheticcontrol.kmeans.Job.props found on
classpath, will use command-line arguments only
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.hadoop.util.ProgramDriver.driver([Ljava/lang/String;)V
    at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)


Thanks,
-Puneet

On Thu, Dec 20, 2012 at 12:11 AM, Puneet Jaiswal <[email protected]> wrote:

> Hi,
>
> I am a newbie with Mahout.
> I have set up Hadoop 2.0 and I am trying to run an example described here
> https://cwiki.apache.org/MAHOUT/clustering-of-synthetic-control-data.html
>
> I have copied testdata to HDFS.
>
> Puneet:mahout pjaisw1$ hadoop fs -ls testdata
> Found 1 items
> -rw-r--r--   1 pjaisw1 supergroup     288374 2012-12-19 23:48
> testdata/synthetic_control.data
>
> I get the following error:
>
> Puneet:mahout pjaisw1$ ./bin/mahout
> org.apache.mahout.clustering.syntheticcontrol.kmeans.Job
> Running on hadoop, using /app/hadoop/bin/hadoop and HADOOP_CONF_DIR=
> MAHOUT-JOB:
> /opt/openSrc/mahout/examples/target/mahout-examples-0.8-SNAPSHOT-job.jar
> Exception in thread "main" java.io.IOException: Mkdirs failed to create
> /tmp/hadoop-pjaisw1/hadoop-unjar221196643100499467/META-INF/license
>     at org.apache.hadoop.util.RunJar.ensureDirectory(RunJar.java:111)
>     at org.apache.hadoop.util.RunJar.unJar(RunJar.java:87)
>     at org.apache.hadoop.util.RunJar.unJar(RunJar.java:64)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:184)
>
> Puneet:mahout pjaisw1$ hadoop fs -ls /
> Found 2 items
> drwxrwxrwx   - pjaisw1 supergroup          0 2012-12-20 00:07 /tmp
> drwxr-xr-x   - pjaisw1 supergroup          0 2012-12-19 23:40 /user
>
> I had already run these commands to avoid write failures to /tmp:
>
> Puneet:mahout pjaisw1$ hadoop fs -chmod o+w /tmp
> Puneet:mahout pjaisw1$ hadoop fs -chmod -R o+w /tmp
>
> Any idea why I am getting this error?
>
> Thanks,
> -Puneet
>
