Thank you, Bing!

2011/8/7 bing wang <[email protected]>

> I had a similar problem when I used mahout-0.5 to run PFP-Growth. It's a
> classpath issue that has been fixed in mahout-trunk. See
> MAHOUT-680 <https://issues.apache.org/jira/browse/MAHOUT-680> &
> MAHOUT-727 <https://issues.apache.org/jira/browse/MAHOUT-727>
>
>
> 2011/8/6 air <[email protected]>
>
>> Thank you for all your replies. I compared the 0.5 classpath with the 0.4
>> classpath and found that 0.4 puts only /usr/local/mahout-distribution-0.4/conf
>> on the Hadoop classpath, while 0.5 puts all of its jars on the Hadoop
>> classpath.
>>
>> I modified the mahout script in MAHOUT_HOME/bin like this:
>>
>> FROM
>>     export HADOOP_CLASSPATH=$MAHOUT_CONF_DIR:${HADOOP_CLASSPATH}:$CLASSPATH
>> TO
>>     export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:$MAHOUT_CONF_DIR
>>
>>
>> 0.5 works now. Really a strange problem...
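For readers hitting the same problem, the edit described above boils down to the following sketch; the conf and Hadoop paths here are illustrative values, not the poster's actual ones:

```shell
# Illustrative values only; substitute your own install locations.
MAHOUT_CONF_DIR=/usr/local/mahout-distribution-0.5/conf
HADOOP_CLASSPATH=/etc/hadoop/conf   # whatever Hadoop has already set

# Original 0.5 line (prepends the conf dir and the whole Mahout CLASSPATH):
#   export HADOOP_CLASSPATH=$MAHOUT_CONF_DIR:${HADOOP_CLASSPATH}:$CLASSPATH
# Working variant from the message above (appends only the conf dir):
export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:$MAHOUT_CONF_DIR

echo "$HADOOP_CLASSPATH"
```

The key difference is that the Mahout jars no longer shadow what Hadoop already has on its classpath; only the conf directory is appended.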
>>
>> Thank you again for your help.
>>
>>
>>
>> 2011/8/5 Ted Dunning <[email protected]>
>>
>>> Some other systems such as hbase actually provide a classpath sub-command
>>> that dumps out the classpath with hints about how it is that way.
>>>
>>> That can help enormously in debugging.
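`hbase classpath` is the real sub-command Ted refers to. As a rough sketch of the pattern (the launcher name and jar paths below are invented for illustration), a launcher script can expose its computed classpath like this:

```shell
# Toy launcher sketch: the "classpath" sub-command dumps the computed
# classpath instead of running anything. Jar names here are made up.
CLASSPATH="conf:lib/app-core.jar:lib/app-math.jar"
case "${1:-run}" in
  classpath) echo "$CLASSPATH" ;;
  *)         echo "running with classpath: $CLASSPATH" ;;
esac
```

With such a sub-command you can diff the classpath between two versions of a script directly, instead of reading the script source.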
>>>
>>>
>>> On Fri, Aug 5, 2011 at 11:30 AM, Sean Owen <[email protected]> wrote:
>>>
>>>> Like I said, it's a classpath issue. The script likely changed between
>>>> 0.4 and 0.5, so you need to look at what its classpath is, where it is
>>>> looking for jars, and understand why it is not finding what you need in
>>>> your environment.
>>>> It works out of the box, so I think it is an issue specific to your setup.
>>>> Unless you can provide more info, that's about all one can say.
>>>>
>>>>
>>>> On Fri, Aug 5, 2011 at 4:28 PM, air <[email protected]> wrote:
>>>>
>>>>> But what should I do now? It is a very strange problem: 0.4 works
>>>>> well, but 0.5 does not...
>>>>>  2011/8/5 Sean Owen <[email protected]>
>>>>>
>>>>>> OK, well the underlying problem remains that it cannot find the
>>>>>> mahout-math jar file in the classpath. It's not a problem with 0.5 per
>>>>>> se, but something to do with finding the jars.
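One quick way to check this kind of thing (a sketch; the classpath value below is invented, not taken from the poster's machine) is to test whether any entry on the effective classpath mentions mahout-math at all:

```shell
# Invented example classpath that is missing the mahout-math jar.
CP="conf:/opt/mahout/mahout-core-0.5.jar"

# POSIX substring test: the prefix-strip only changes the string
# if "mahout-math" occurs somewhere in $CP.
if [ "${CP#*mahout-math}" != "$CP" ]; then
  echo "mahout-math is on the classpath"
else
  echo "mahout-math is MISSING from the classpath"
fi
```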
>>>>>>
>>>>>> On Fri, Aug 5, 2011 at 12:52 PM, air <[email protected]> wrote:
>>>>>>
>>>>>> > Hi Sean, for 0.4 and 0.5 I used the same settings (in fact, no
>>>>>> > settings): I just downloaded and unpacked them, and then used them
>>>>>> > to run some tests...
>>>>>> >
>>>>>> > Has anyone else met this kind of problem? Thank you.
>>>>>> >
>>>>>> >
>>>>>> > 2011/8/5 Sean Owen <[email protected]>
>>>>>> >
>>>>>> > > I think you have changed your classpath to use 0.5, and in your new
>>>>>> > > classpath, you are not including all the jars. You probably need
>>>>>> > > core, math, and collections at minimum.
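Assembling those three by hand would look roughly like the sketch below. The jar file names are assumptions based on the 0.5 binary distribution layout and may differ in your unpacked copy; check the actual files before using this.

```shell
# Jar names are assumptions; verify them against your unpacked directory.
MAHOUT_HOME=/usr/local/mahout-distribution-0.5
CP="$MAHOUT_HOME/mahout-core-0.5.jar"
CP="$CP:$MAHOUT_HOME/mahout-math-0.5.jar"
CP="$CP:$MAHOUT_HOME/mahout-collections-1.0.jar"
echo "$CP"
```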
>>>>>> > >
>>>>>> > > On Fri, Aug 5, 2011 at 12:40 PM, air <[email protected]> wrote:
>>>>>> > >
>>>>>> > > > When I use mahout 0.4 to execute on a hadoop cluster:
>>>>>> > > >
>>>>>> > > > *./mahout org.apache.mahout.clustering.syntheticcontrol.canopy.Job --input /mahout/input --output /mahout/output13 -t1 80.0 -t2 50.0*
>>>>>> > > >
>>>>>> > > > it works very well.
>>>>>> > > >
>>>>>> > > > But when I use mahout 0.5 to execute *the same command (only the output directory differs):*
>>>>>> > > >
>>>>>> > > > *./mahout org.apache.mahout.clustering.syntheticcontrol.canopy.Job --input /mahout/input --output /mahout/output14 -t1 80.0 -t2 50.0*
>>>>>> > > >
>>>>>> > > > it reports a ClassNotFoundException. What causes this, and how can I solve it? I am really confused:
>>>>>> > > >
>>>>>> > > >
>>>>>> > > > [root@x06 bin]# ./mahout org.apache.mahout.clustering.syntheticcontrol.canopy.Job --input /mahout/input --output /mahout/output15 -t1 80.0 -t2 50.0
>>>>>> > > > Running on hadoop, using HADOOP_HOME=/usr/lib/hadoop
>>>>>> > > > No HADOOP_CONF_DIR set, using /usr/lib/hadoop/src/conf
>>>>>> > > > 11/08/05 19:36:43 WARN driver.MahoutDriver: No org.apache.mahout.clustering.syntheticcontrol.canopy.Job.props found on classpath, will use command-line arguments only
>>>>>> > > > 11/08/05 19:36:43 INFO canopy.Job: Running with only user-supplied arguments
>>>>>> > > > 11/08/05 19:36:43 INFO common.AbstractJob: Command line arguments: {--distanceMeasure=org.apache.mahout.common.distance.SquaredEuclideanDistanceMeasure, --endPhase=2147483647, --input=/mahout/input, --output=/mahout/output15, --startPhase=0, --t1=80.0, --t2=50.0, --tempDir=temp}
>>>>>> > > > 11/08/05 19:37:09 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
>>>>>> > > > 11/08/05 19:37:43 INFO input.FileInputFormat: Total input paths to process : 1
>>>>>> > > > 11/08/05 19:37:43 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
>>>>>> > > > 11/08/05 19:37:43 INFO lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 2991c7d00d4b50494958eb1c8e7ef086d24853ea]
>>>>>> > > > 11/08/05 19:37:54 INFO mapred.JobClient: Running job: job_201108041755_7040
>>>>>> > > > 11/08/05 19:37:55 INFO mapred.JobClient:  map 0% reduce 0%
>>>>>> > > > 11/08/05 19:38:19 INFO mapred.JobClient:  map 100% reduce 0%
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient: Job complete: job_201108041755_7040
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient: Counters: 13
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:   Job Counters
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=4911
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     Launched map tasks=1
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     Data-local map tasks=1
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=4017
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:   FileSystemCounters
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     HDFS_BYTES_READ=288490
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=49166
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=335470
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:   Map-Reduce Framework
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     Map input records=600
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     Spilled Records=0
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     Map output records=600
>>>>>> > > > 11/08/05 19:38:21 INFO mapred.JobClient:     SPLIT_RAW_BYTES=116
>>>>>> > > > 11/08/05 19:38:21 INFO canopy.CanopyDriver: Build Clusters Input: /mahout/output15/data Out: /mahout/output15 Measure: org.apache.mahout.common.distance.SquaredEuclideanDistanceMeasure@4dc1c92b t1: 80.0 t2: 50.0
>>>>>> > > > 11/08/05 19:38:21 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
>>>>>> > > > 11/08/05 19:38:26 INFO input.FileInputFormat: Total input paths to process : 1
>>>>>> > > > 11/08/05 19:38:31 INFO mapred.JobClient: Running job: job_201108041755_7056
>>>>>> > > > 11/08/05 19:38:32 INFO mapred.JobClient:  map 0% reduce 0%
>>>>>> > > > 11/08/05 19:38:46 INFO mapred.JobClient: Task Id : attempt_201108041755_7056_m_000000_0, Status : FAILED
>>>>>> > > > Error: java.lang.ClassNotFoundException: org.apache.mahout.math.Vector
>>>>>> > > >        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>>> > > >        at java.security.AccessController.doPrivileged(Native Method)
>>>>>> > > >        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>> > > >        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>> > > >        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>> > > >        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> > > >        at java.lang.Class.forName0(Native Method)
>>>>>> > > >        at java.lang.Class.forName(Class.java:247)
>>>>>> > > >        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:943)
>>>>>> > > >        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:994)
>>>>>> > > >        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1020)
>>>>>> > > >        at org.apache.hadoop.mapred.JobConf.getMapOutputValueClass(JobConf.java:747)
>>>>>> > > >        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:819)
>>>>>> > > >        at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:557)
>>>>>> > > >        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:639)
>>>>>> > > >        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>>>>>> > > >        at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>>>>>> > > >        at java.security.AccessController.doPrivileged(Native Method)
>>>>>> > > >        at javax.security.auth.Subject.doAs(Subject.java:396)
>>>>>> > > >        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>>>>>> > > >        at org.apache.hadoop.mapred.Child.main(Child.java:264)
>>>>>> > > >
>>>>>> > > > 11/08/05 19:38:52 INFO mapred.JobClient: Task Id : attempt_201108041755_7056_m_000000_1, Status : FAILED
>>>>>> > > > Error: java.lang.ClassNotFoundException: org.apache.mahout.math.Vector
>>>>>> > > >        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>>> > > >        at java.security.AccessController.doPrivileged(Native Method)
>>>>>> > > >        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>> > > >        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>> > > >        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>> > > >        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> > > >        at java.lang.Class.forName0(Native Method)
>>>>>> > > >        at java.lang.Class.forName(Class.java:247)
>>>>>> > > >        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:943)
>>>>>> > > >        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:994)
>>>>>> > > >        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1020)
>>>>>> > > >        at org.apache.hadoop.mapred.JobConf.getMapOutputValueClass(JobConf.java:747)
>>>>>> > > >        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:819)
>>>>>> > > >        at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:557)
>>>>>> > > >        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:639)
>>>>>> > > >        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>>>>>> > > >        at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>>>>>> > > >        at java.security.AccessController.doPrivileged(Native Method)
>>>>>> > > >        at javax.security.auth.Subject.doAs(Subject.java:396)
>>>>>> > > >        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>>>>>> > > >        at org.apache.hadoop.mapred.Child.main(Child.java:264)
>>>>>> > > >
>>>>>> > > > 11/08/05 19:39:00 INFO mapred.JobClient: Task Id : attempt_201108041755_7056_m_000000_2, Status : FAILED
>>>>>> > > > 11/08/05 19:39:12 INFO mapred.JobClient: Job complete: job_201108041755_7056
>>>>>> > > > 11/08/05 19:39:12 INFO mapred.JobClient: Counters: 8
>>>>>> > > > 11/08/05 19:39:12 INFO mapred.JobClient:   Job Counters
>>>>>> > > > 11/08/05 19:39:12 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=7530
>>>>>> > > > 11/08/05 19:39:12 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
>>>>>> > > > 11/08/05 19:39:12 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
>>>>>> > > > 11/08/05 19:39:12 INFO mapred.JobClient:     Rack-local map tasks=3
>>>>>> > > > 11/08/05 19:39:12 INFO mapred.JobClient:     Launched map tasks=4
>>>>>> > > > 11/08/05 19:39:12 INFO mapred.JobClient:     Data-local map tasks=1
>>>>>> > > > 11/08/05 19:39:12 INFO mapred.JobClient:     Failed map tasks=1
>>>>>> > > > 11/08/05 19:39:12 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=2464
>>>>>> > > > Exception in thread "main" java.lang.InterruptedException: Canopy Job failed processing /mahout/output15/data
>>>>>> > > >        at org.apache.mahout.clustering.canopy.CanopyDriver.buildClustersMR(CanopyDriver.java:355)
>>>>>> > > >        at org.apache.mahout.clustering.canopy.CanopyDriver.buildClusters(CanopyDriver.java:246)
>>>>>> > > >        at org.apache.mahout.clustering.canopy.CanopyDriver.run(CanopyDriver.java:143)
>>>>>> > > >        at org.apache.mahout.clustering.canopy.CanopyDriver.run(CanopyDriver.java:161)
>>>>>> > > >        at org.apache.mahout.clustering.syntheticcontrol.canopy.Job.run(Job.java:92)
>>>>>> > > >        at org.apache.mahout.clustering.syntheticcontrol.canopy.Job.run(Job.java:127)
>>>>>> > > >        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>>> > > >        at org.apache.mahout.clustering.syntheticcontrol.canopy.Job.main(Job.java:49)
>>>>>> > > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>> > > >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>> > > >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>> > > >        at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>> > > >        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>>>>>> > > >        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>>>>>> > > >        at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:187)
>>>>>> > > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>> > > >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>> > > >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>> > > >        at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>> > > >        at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
>>>>>> > > > [root@x06 bin]#
>>>>>> > > > --
>>>>>> > > > Knowledge Management.
>>>>>> > > >
>>>>>> > >
>>>>>> >
>>>>>> >
>>>>>> >
>>>>>> > --
>>>>>> > Knowledge Management.
>>>>>> >
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Knowledge Management.
>>>>>
>>>>>
>>>>
>>>
>>
>>
>> --
>> Knowledge Management.
>>
>>
>
>
> --
> Gmail/talk: [email protected]
>
>
>


-- 
Knowledge Management.
