Hadoop bundles its own Guava, so this is almost certainly a dependency clash at runtime. Beyond that, no idea. MR is being phased out anyway; why don't you try the Spark version in the upcoming 0.10.2? A quick way to confirm which Guava copy is winning is sketched below the quoted message.

On Jun 10, 2015 12:58 PM, "Mihai Dascalu" <[email protected]> wrote:
> Hi!
>
> After upgrading to Mahout 0.10.1, I have a runtime exception in the
> following Hadoop code, in which I create the input matrix for performing
> SSVD:
>
> // prepare output matrix
> 81: final Configuration conf = new Configuration();
>
> 83: SequenceFile.Writer writer = SequenceFile.createWriter(conf,
>             Writer.file(new Path(path + "/" + outputFileName)),
>             Writer.keyClass(IntWritable.class),
>             Writer.valueClass(VectorWritable.class));
>
> while in the console we have:
>
> ...
> [Loaded org.apache.hadoop.util.StringInterner from
> file:/Users/mihaidascalu/Dropbox%20(Personal)/Workspace/Eclipse/ReaderBenchDev/lib/Mahout/mahout-mr-0.10.1-job.jar]
> ...
> java.lang.VerifyError: (class: com/google/common/collect/Interners,
> method: newWeakInterner signature: ()Lcom/google/common/collect/Interner;)
> Incompatible argument to function
>     at org.apache.hadoop.util.StringInterner.<clinit>(StringInterner.java:48)
>     at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2293)
>     at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2185)
>     at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2102)
>     at org.apache.hadoop.conf.Configuration.get(Configuration.java:851)
>     at org.apache.hadoop.io.SequenceFile.getDefaultCompressionType(SequenceFile.java:234)
>     at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:264)
>     at services.semanticModels.LSA.CreateInputMatrix.parseCorpus(CreateInputMatrix.java:83)
>     at services.semanticModels.LSA.CreateInputMatrix.main(CreateInputMatrix.java:197)
>
> Any suggestions? I tried adding guava-14.0.1.jar as a dependency, but it
> did not fix it.
>
> Thanks and have a great day!
> Mihai
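
To confirm which Guava copy is winning, you can print where the two clashing classes are loaded from. A minimal sketch (the class name GuavaClashCheck and these particular checks are mine, not anything from Mahout or Hadoop):

    import java.lang.reflect.Method;
    import com.google.common.collect.Interners;

    public class GuavaClashCheck {
        public static void main(String[] args) throws Exception {
            // Which jar supplies Guava's Interners? If this points into
            // mahout-mr-0.10.1-job.jar instead of your guava-14.0.1.jar,
            // the job jar's bundled copy is shadowing the one you added.
            System.out.println(Interners.class
                    .getProtectionDomain().getCodeSource().getLocation());

            // Print the actual signature of the newWeakInterner that got
            // loaded. If it differs from ()Lcom/google/common/collect/Interner;
            // (what Hadoop's StringInterner was compiled against), verifying
            // StringInterner fails with exactly the VerifyError above.
            Method m = Interners.class.getMethod("newWeakInterner");
            System.out.println(m);
        }
    }

If both print a path into the job jar, adding guava-14.0.1.jar after it won't change anything: with the standard classloader the first copy on the classpath wins, so the Guava jar you want would have to come before mahout-mr-0.10.1-job.jar. Running with -verbose:class (which is what produced the [Loaded ...] lines in your log) shows the same information for every class as it loads.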
