Hi Prashant, Read this thread: http://comments.gmane.org/gmane.comp.jakarta.lucene.hadoop.user/25837
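For context, the stack trace below fails in FileUtil.checkReturnValue while the local JobClient tries to chmod the \tmp staging directory to 0700, which does not work on a Windows local filesystem. The workaround most often referenced for Hadoop 1.0.x (and, as far as I recall, the one discussed in that thread, though treat this as my summary rather than a quote) is to relax that check: patch org.apache.hadoop.fs.FileUtil, rebuild hadoop-core, or put the patched class ahead of the stock jar on the classpath. A minimal sketch, assuming the Hadoop 1.0.x source layout and its existing LOG field:

    // org/apache/hadoop/fs/FileUtil.java (Hadoop 1.0.x) -- patched method only.
    // Instead of throwing when the chmod-style call fails (which it always does
    // for local dirs on NTFS here), log a warning and continue so local job
    // submission from Eclipse/Cygwin can proceed.
    private static void checkReturnValue(boolean rv, File p,
                                         FsPermission permission
                                         ) throws IOException {
      if (!rv) {
        // Original behavior: throw new IOException("Failed to set permissions of path: ...");
        LOG.warn("Failed to set permissions of path: " + p +
                 " to " + String.format("%04o", permission.toShort()) +
                 " (ignored on this platform)");
      }
    }

This only loosens a safety check for local, single-user development runs; it is not something to carry onto a real cluster.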
Thanks,
Tejas

On Wed, Nov 14, 2012 at 10:14 AM, Lewis John Mcgibbney <[email protected]> wrote:
> Hi Prashant,
>
> Please take a look on either the Nutch or the Hadoop user@ lists. I've
> seen and reported on this previously so it should not be too hard to
> find.
>
> hth
>
> Lewis
>
> On Wed, Nov 14, 2012 at 6:07 PM, Prashant Ladha <[email protected]> wrote:
> > Hi,
> > I am trying to set up Nutch via Eclipse.
> > I followed the instructions from the link below:
> > http://wiki.apache.org/nutch/RunNutchInEclipse
> >
> > But when running the Crawl class, it throws the exception below.
> > I tried the solution of adding Cygwin to my PATH, mentioned at the link below,
> > but that did not help:
> > http://florianhartl.com/nutch-installation.html
> >
> > I am using a Windows 7 laptop.
> >
> > Can you please help me resolve this issue?
> >
> > solrUrl is not set, indexing will be skipped...
> > crawl started in: crawl
> > rootUrlDir = urls
> > threads = 10
> > depth = 3
> > solrUrl=null
> > topN = 50
> > Injector: starting at 2012-11-14 23:34:18
> > Injector: crawlDb: crawl/crawldb
> > Injector: urlDir: urls
> > Injector: Converting injected urls to crawl db entries.
> > Exception in thread "main" java.io.IOException: Failed to set permissions
> > of path: \tmp\hadoop-XXXXXX\mapred\staging\XXXXXXXXX916119234\.staging to 0700
> >   at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
> >   at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
> >   at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
> >   at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
> >   at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
> >   at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
> >   at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
> >   at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
> >   at java.security.AccessController.doPrivileged(Native Method)
> >   at javax.security.auth.Subject.doAs(Subject.java:415)
> >   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >   at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
> >   at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
> >   at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
> >   at org.apache.nutch.crawl.Injector.inject(Injector.java:278)
> >   at org.apache.nutch.crawl.Crawl.run(Crawl.java:127)
> >   at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >   at org.apache.nutch.crawl.Crawl.main(Crawl.java:55)
>
> --
> Lewis

