I just realized that I have to export this variable, pointing it at the config directory under HADOOP_HOME.
Like this, for example:

    export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop

After this modification everything works fine.

On Wed, Jul 8, 2015 at 7:10 AM, Kidong Lee <[email protected]> wrote:

> I have submitted a Mahout Spark job in yarn-client mode like this:
>
> bin/mahout spark-itemsimilarity --input /input/part-000 --output
> /output --maxSimilaritiesPerItem 20 --master yarn-client
> --sparkExecutorMem 8g -D:spark.driver.memory=5g
> -D:spark.driver.maxResultSize=3g -D:spark.executor.instances=4
> -D:spark.executor.cores=4 -D:spark.yarn.queue=spark-prod
>
> It worked fine for me.
>
> - Kidong.
>
> 2015-07-08 3:34 GMT+09:00 Rodolfo Viana <[email protected]>:
>
> > Hi,
> >
> > I'm trying to run Mahout 0.10 using Spark 1.1.1, and so far I haven't
> > had any success passing a file on HDFS. My actual problem is when I
> > try to run the example:
> >
> > bin/mahout spark-itemsimilarity --input hdfs://localhost:9000/input
> > --output hdfs://localhost:9000/output
> >
> > And I'm getting this error:
> > https://drive.google.com/file/d/0BwqKhM_BnSmgcUVzRm1odzhBQk0/view?usp=sharing
> >
> > I was googling and I found this solution:
> >
> > Configuration configuration = new Configuration();
> > FileSystem hdfsFileSystem =
> >     FileSystem.get(new URI("hdfs://localhost:9000"), configuration);
> >
> > http://techidiocy.com/java-lang-illegalargumentexception-wrong-fs-expected-file/
> >
> > but I don't want to modify the original code.
> >
> > Is there any way that I can resolve this problem without having to
> > modify the code?
> >
> > On Tue, Jul 7, 2015 at 3:28 PM, Dmitriy Lyubimov <[email protected]>
> > wrote:
> >
> > > attachments are not showing up on apache lists.
> > >
> > > On Tue, Jul 7, 2015 at 10:30 AM, Rodolfo Viana <
> > > [email protected]> wrote:
> > >
> > > > Hi,
> > > >
> > > > I'm trying to run Mahout 0.10 using Spark 1.1.1, and so far I
> > > > haven't had any success passing a file on HDFS. My actual problem
> > > > is when I try to run the example:
> > > >
> > > > bin/mahout spark-itemsimilarity --input hdfs://localhost:9000/input
> > > > --output hdfs://localhost:9000/output
> > > >
> > > > And I'm getting this error: (attached)
> > > >
> > > > I was googling and I found this solution:
> > > >
> > > > Configuration configuration = new Configuration();
> > > > FileSystem hdfsFileSystem =
> > > >     FileSystem.get(new URI("hdfs://localhost:9000"), configuration);
> > > >
> > > > http://techidiocy.com/java-lang-illegalargumentexception-wrong-fs-expected-file/
> > > >
> > > > but I don't want to modify the original code.
> > > >
> > > > Is there any way that I can resolve this problem without having to
> > > > modify the code?
> > > >
> > > > --
> > > > Rodolfo de Lima Viana
> > > > Undergraduate in Computer Science at UFCG
> >
> > --
> > Rodolfo de Lima Viana
> > Undergraduate in Computer Science at UFCG

--
Rodolfo de Lima Viana
Undergraduate in Computer Science at UFCG
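
For completeness, the configuration-side view of why the export fixes things: with HADOOP_CONF_DIR pointing at a directory containing a core-site.xml along these lines, new Configuration() picks up fs.defaultFS automatically, so plain paths like /input resolve against HDFS instead of the local filesystem. The host and port below are assumed from the examples in this thread:

    <!-- $HADOOP_CONF_DIR/core-site.xml -->
    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
      </property>
    </configuration>

(On old Hadoop 1.x installs the property is named fs.default.name instead.)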

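And if you do want the code-level workaround from the techidiocy link rather than the environment fix, here is a minimal, self-contained sketch with the imports filled in. The class name is hypothetical, and the URI and path come from the examples above; hard-coding the NameNode address like this is exactly what the HADOOP_CONF_DIR export lets you avoid:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Hypothetical class name, for illustration only.
    public class HdfsUriCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Bind the FileSystem to the NameNode URI explicitly. Without a
            // Hadoop config on the classpath, fs.defaultFS defaults to
            // file:///, which is what triggers "Wrong FS: expected file:///".
            FileSystem hdfs = FileSystem.get(new URI("hdfs://localhost:9000"), conf);
            // /input is the path used in this thread; adjust as needed.
            System.out.println("/input exists? " + hdfs.exists(new Path("/input")));
        }
    }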