This sounds a bit dubious to me. The Mahout framework should provide a reasonable default here.
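For the archives, Pavel's fix below amounts to adding --tempDir to the same invocation; a sketch of what that might look like (the temp path here is only an example, not what he actually used):

=======================
# --tempDir path below is only an example; point it at any writable HDFS directory
mahout ssvd \
-i /rec/sparse/tfidf-vectors/ \
-o /rec/ssvd \
--tempDir /rec/ssvd-temp \
-k 100 \
--reduceTasks 100 \
-ow
=======================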
My guess: I discovered that the -ow switch was not quite doing what it was intended to do, so if you ran the job multiple times, that might have been a problem. It has been fixed in the latest round of fixes (0.8 trunk). However, if that were the problem, your job would fail before it even launched backend tasks, and I think it would be a different error message. I would still look into the task logs to see what's happening there; if you dig that error out, I would be able to provide more info.

On Thu, Nov 15, 2012 at 4:50 AM, Abramov Pavel <[email protected]> wrote:

> I forgot to set the --tempDir parameter.
>
> Issue solved, thanks
>
> Pavel
>
> On 15.11.12 at 15:43, "Abramov Pavel" <[email protected]> wrote:
>
> >Hello!
> >
> >Trying to compute a reduced-rank matrix for a recommendation system over
> >users (20*10^6) and their ratings (~150 000 items).
> >
> >How can I avoid this exception in the Q job?
> >=======================
> >Exception in thread "main" java.io.IOException: Q job unsuccessful.
> >at org.apache.mahout.math.hadoop.stochasticsvd.QJob.run(QJob.java:230)
> >at org.apache.mahout.math.hadoop.stochasticsvd.SSVDSolver.run(SSVDSolver.java:377)
> >at org.apache.mahout.math.hadoop.stochasticsvd.SSVDCli.run(SSVDCli.java:141)
> >=======================
> >
> >CLI parameters are:
> >=======================
> >mahout ssvd \
> >-i /rec/sparse/tfidf-vectors/ \
> >-o /rec/ssvd \
> >-k 100 \
> >--reduceTasks 100 \
> >-ow
> >=======================
> >Using Mahout 0.7 on Hadoop with ~50 nodes (400 mappers / 300 reducers).
> >
> >My input for SSVD is seq2sparse output; there are 200 "Key class:
> >class org.apache.hadoop.io.Text Value Class: class
> >org.apache.mahout.math.VectorWritable" sequence files, 8 GB total.
> >
> >Many thanks in advance, any suggestion is highly appreciated. I don't
> >know what to do; CF produces inaccurate results for my tasks, SVD is the
> >only hope ))
> >
> >Regards,
> >Pavel
