OK, you convinced me :) But could you please help me with an example or some
documentation? I have only found code fragments (and only in Scala, not Java
for Spark).

How should I create the input matrix, invoke dssvd, and configure the number
of processors and the memory?
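
For reference, the Scala fragments I managed to piece together look roughly
like this (only a sketch; the paths, the k/p/q values, and the exact
mahoutSparkContext parameters are my own guesses from the 0.10.x Samsara docs):

    import org.apache.mahout.math.drm._
    import org.apache.mahout.math.decompositions._
    import org.apache.mahout.sparkbindings._

    object DssvdSketch extends App {
      // Spark-backed Mahout context; master URL and app name are placeholders.
      // The core count comes from the master URL here; executor memory would be
      // set through the usual Spark configuration (e.g. spark.executor.memory).
      implicit val ctx = mahoutSparkContext(
        masterUrl = "local[4]",
        appName = "dssvd-sketch")

      // Read a DRM from a sequence file of (IntWritable, VectorWritable) pairs,
      // i.e. the same layout my Hadoop writer below produces.
      val drmA = drmDfsRead("/path/to/input-matrix")

      // Distributed stochastic SVD: rank k, oversampling p, q power iterations
      // (the values here are illustrative only).
      val (drmU, drmV, s) = dssvd(drmA, k = 100, p = 15, q = 1)

      // Persist the factors; s is the in-core vector of singular values.
      drmU.dfsWrite("/path/to/U")
      drmV.dfsWrite("/path/to/V")
      println("singular values: " + s)
    }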

Also, are there any specific version dependencies? Should I wait for the
next release?


Thanks a lot and have a great day!
Mihai

> On Jun 10, 2015, at 23:57, Dmitriy Lyubimov <[email protected]> wrote:
> 
> Hadoop has its own Guava, so this is almost certainly a dependency clash at
> runtime. Other than that, no idea. MR is being phased out anyway. Why don't
> you try the Spark version in the upcoming 0.10.2?
> On Jun 10, 2015 12:58 PM, "Mihai Dascalu" <[email protected]> wrote:
> 
>> Hi!
>> 
>> After upgrading to Mahout 0.10.1, I get a runtime error (a VerifyError) in
>> the following Hadoop code, where I create the input matrix for SSVD:
>> 
>> // prepare output matrix
>> 81:  final Configuration conf = new Configuration();
>> 
>> 83:  SequenceFile.Writer writer = SequenceFile.createWriter(conf,
>>          Writer.file(new Path(path + "/" + outputFileName)),
>>          Writer.keyClass(IntWritable.class),
>>          Writer.valueClass(VectorWritable.class));
>> 
>> while the console output shows:
>> ...
>> [Loaded org.apache.hadoop.util.StringInterner from
>> file:/Users/mihaidascalu/Dropbox%20(Personal)/Workspace/Eclipse/ReaderBenchDev/lib/Mahout/mahout-mr-0.10.1-job.jar]
>> ...
>> java.lang.VerifyError: (class: com/google/common/collect/Interners,
>> method: newWeakInterner signature: ()Lcom/google/common/collect/Interner;)
>> Incompatible argument to function
>>        at
>> org.apache.hadoop.util.StringInterner.<clinit>(StringInterner.java:48)
>>        at
>> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2293)
>>        at
>> org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2185)
>>        at
>> org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2102)
>>        at org.apache.hadoop.conf.Configuration.get(Configuration.java:851)
>>        at
>> org.apache.hadoop.io.SequenceFile.getDefaultCompressionType(SequenceFile.java:234)
>>        at
>> org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:264)
>>        at
>> services.semanticModels.LSA.CreateInputMatrix.parseCorpus(CreateInputMatrix.java:83)
>>        at
>> services.semanticModels.LSA.CreateInputMatrix.main(CreateInputMatrix.java:197)
>> 
>> Any suggestions? I tried adding guava-14.0.1.jar as a dependency, but it
>> did not fix the problem.
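>> 
>> (A quick way to see which Guava actually wins at runtime would be something
>> like the following Scala snippet, run on the same classpath; this is just my
>> own debugging idea, not Mahout API:)
>> 
>>     // print which jar the clashing class is loaded from in this JVM
>>     val src = classOf[com.google.common.collect.Interners]
>>       .getProtectionDomain.getCodeSource
>>     println("Interners loaded from: " +
>>       (if (src == null) "bootstrap/unknown" else src.getLocation))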
>> 
>> 
>> Thanks and have a great day!
>> Mihai
