Hello sir,
Please have a look at the link below. When I tried to run the program I
faced the same issue. There are a lot of dependencies, as you can verify
in pom.xml:

shuyo.wordpress.com/2011/02/01/mahout-development-environment-with-maven-and-eclipse-1/
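
For reference, the Mahout dependencies involved would look roughly like this in pom.xml (version 0.7 is an assumption; adjust to whatever release you installed):

```xml
<dependencies>
  <!-- Mahout clustering drivers (KMeansDriver etc.) -->
  <dependency>
    <groupId>org.apache.mahout</groupId>
    <artifactId>mahout-core</artifactId>
    <version>0.7</version>
  </dependency>
  <!-- Vector, DenseVector, VectorWritable live here -->
  <dependency>
    <groupId>org.apache.mahout</groupId>
    <artifactId>mahout-math</artifactId>
    <version>0.7</version>
  </dependency>
</dependencies>
```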

Syed Abdul Kather
Sent from Samsung S3
On Aug 5, 2012 12:56 PM, "Abhinav M Kulkarni" <[email protected]>
wrote:

> Hi Lance,
>
> I run this program from Eclipse. I am actually able to write to and read
> from the underlying HDFS, and I have also run several Hadoop programs from
> Eclipse. I added all the jars using the 'Add External JARs' option in Eclipse.
>
> However, I just checked and there is no Vector class/interface under
> org.apache.mahout.math, hence the ClassNotFoundException.
>
> Thanks.
>
> On 08/04/2012 11:47 PM, Lance Norskog wrote:
>
>> How do you run this program? Are you running a Hadoop cluster app out
>> of Eclipse? I do not know if that mode copies your jars out to the
>> executors.
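>>
>> For what it's worth, one common way to make sure the jars reach the task
>> JVMs is to submit with -libjars (a sketch; the jar names and driver class
>> are hypothetical, and -libjars is only parsed if the driver goes through
>> ToolRunner/GenericOptionsParser):
>>
>> ```shell
>> # Make the Mahout jars visible to the client JVM...
>> export HADOOP_CLASSPATH=mahout-math-0.7.jar:mahout-core-0.7.jar
>> # ...and ship them to the task executors with -libjars.
>> hadoop jar kmeans-job.jar com.example.KMeansJob \
>>     -libjars mahout-math-0.7.jar,mahout-core-0.7.jar \
>>     input/points output
>> ```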
>>
>> On Sat, Aug 4, 2012 at 10:30 PM, Abhinav M Kulkarni
>> <[email protected]> wrote:
>>
>>> I have added all the jars (including hadoop-core and mahout-math) to the
>>> Eclipse project. In fact I added all the jars in mahout installation
>>> directory.
>>>
>>>
>>> On 08/04/2012 09:54 PM, Abhinav M Kulkarni wrote:
>>>
>>>> Hi,
>>>>
>>>> I have a small snippet of code which gives me an error. Points written
>>>> in the input directory are of type VectorWritable (value), with the key
>>>> being LongWritable. Clusters written to the part-00000 file are of type
>>>> Kluster (value), with the key being Text. Both files are in SequenceFile
>>>> format.
>>>>
>>>> Code snippet:
>>>>
KMeansDriver.run(conf, new Path("input/points"), new Path("input/part-00000"),
>>>>                  new Path("output"), new EuclideanDistanceMeasure(),
>>>>                  0.001, 10, true, 0, false);
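>>>>
>>>> For context, the input points file was written along these lines (a
>>>> minimal sketch; the path and the sample point data are hypothetical,
>>>> and it assumes the Hadoop and mahout-math jars are on the classpath):
>>>>
>>>> ```java
>>>> import org.apache.hadoop.conf.Configuration;
>>>> import org.apache.hadoop.fs.FileSystem;
>>>> import org.apache.hadoop.fs.Path;
>>>> import org.apache.hadoop.io.LongWritable;
>>>> import org.apache.hadoop.io.SequenceFile;
>>>> import org.apache.mahout.math.DenseVector;
>>>> import org.apache.mahout.math.VectorWritable;
>>>>
>>>> public class WritePoints {
>>>>     public static void main(String[] args) throws Exception {
>>>>         Configuration conf = new Configuration();
>>>>         FileSystem fs = FileSystem.get(conf);
>>>>         Path path = new Path("input/points/part-00000");
>>>>         // SequenceFile of LongWritable keys and VectorWritable values,
>>>>         // matching what KMeansDriver expects as input.
>>>>         SequenceFile.Writer writer = new SequenceFile.Writer(fs, conf,
>>>>                 path, LongWritable.class, VectorWritable.class);
>>>>         try {
>>>>             double[][] points = {{1, 1}, {2, 1}, {8, 8}, {9, 8}};
>>>>             long key = 0;
>>>>             for (double[] p : points) {
>>>>                 writer.append(new LongWritable(key++),
>>>>                         new VectorWritable(new DenseVector(p)));
>>>>             }
>>>>         } finally {
>>>>             writer.close();
>>>>         }
>>>>     }
>>>> }
>>>> ```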
>>>>
>>>> Error:
>>>>
Error: java.lang.ClassNotFoundException: org.apache.mahout.math.Vector
>>>>      at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>>>>      at java.security.AccessController.doPrivileged(Native Method)
>>>>      at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>>      at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>>      at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>>>>      at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>>      at java.lang.Class.forName0(Native Method)
>>>>      at java.lang.Class.forName(Class.java:264)
>>>>      at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:820)
>>>>      at org.apache.hadoop.io.WritableName.getClass(WritableName.java:71)
>>>>      at org.apache.hadoop.io.SequenceFile$Reader.getValueClass(SequenceFile.java:1671)
>>>>      at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1613)
>>>>      at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1486)
>>>>      at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1475)
>>>>      at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1470)
>>>>      at org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.initialize(SequenceFileRecordReader.java:50)
>>>>      at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:522)
>>>>      at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
>>>>      at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>>      at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>>      at java.security.AccessController.doPrivileged(Native Method)
>>>>      at javax.security.auth.Subject.doAs(Subject.java:416)
>>>>      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>>>>      at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>>>
>>>> Can someone spot why I am getting this error?
>>>>
>>>
>>>
>>
>>
>
