I am not sure; I haven't used it that way.

I know it works fully distributed AND when embedded with the local job
tracker (e.g. its tests are basically MR jobs run against a "local" job
tracker), which is probably not the same thing as Mahout local mode.
The "local" job tracker is not good for much, though: it doesn't even
use multicore parallelism, since it doesn't support multiple reducers,
so pragmatically this code is really meant for a real cluster. There's
also Ted's implementation of non-distributed SSVD in Mahout, which does
not require Hadoop dependencies, but it is a different API with no PCA
option (not sure about power iterations).
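For what it's worth, the distributed path is normally driven through the Mahout CLI against a real or pseudo-distributed cluster rather than local mode. A minimal sketch follows; the option names (`-k`, `-p`, `-q`, `--pca`) are from memory and may differ between Mahout versions, and the HDFS paths are made-up examples:

```shell
# Sketch: running distributed SSVD over HDFS instead of local mode.
# Assumed options (check `mahout ssvd --help` for your version):
#   -i     input DistributedRowMatrix on HDFS
#   -o     output directory (will contain Q-job, U, V, sigma)
#   -k     decomposition rank
#   -p     oversampling parameter
#   -q     number of power iterations
#   --pca  mean-center the input (the PCA option)
mahout ssvd -i /user/pat/b/matrix -o /user/pat/b/ssvd \
    -k 100 -p 15 -q 1 --pca true
```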

I am not sure why this particular error appears in your setup.

On Fri, Aug 31, 2012 at 3:02 PM, Pat Ferrel <[email protected]> wrote:
> Running on the local file system inside IDEA with MAHOUT_LOCAL set and 
> performing an SSVD I get the error below. Notice that R-m-00000 exists in the 
> local file system and running it outside the debugger in pseudo-cluster mode 
> with HDFS works. Does SSVD work in local mode?
>
> java.io.FileNotFoundException: File 
> /tmp/hadoop-pat/mapred/local/archive/5543644668644532045_1587570556_2120541978/file/Users/pat/Projects/big-data/b/ssvd/Q-job/R-m-00000
>  does not exist.
>
> Maclaurin:big-data pat$ ls -al b/ssvd/Q-job/
> total 72
> drwxr-xr-x  10 pat  staff   340 Aug 31 13:35 .
> drwxr-xr-x   4 pat  staff   136 Aug 31 13:35 ..
> -rw-r--r--   1 pat  staff    80 Aug 31 13:35 .QHat-m-00000.crc
> -rw-r--r--   1 pat  staff    28 Aug 31 13:35 .R-m-00000.crc
> -rw-r--r--   1 pat  staff     8 Aug 31 13:35 ._SUCCESS.crc
> -rw-r--r--   1 pat  staff    12 Aug 31 13:35 .part-m-00000.deflate.crc
> -rwxrwxrwx   1 pat  staff  9154 Aug 31 13:35 QHat-m-00000
> -rwxrwxrwx   1 pat  staff  2061 Aug 31 13:35 R-m-00000
> -rwxrwxrwx   1 pat  staff     0 Aug 31 13:35 _SUCCESS
> -rwxrwxrwx   1 pat  staff     8 Aug 31 13:35 part-m-00000.deflate
>