Hi, there are some Windows API calls in there that I will never understand.
Are you working with some kind of symlinks, or whatever they are called on
Windows? It looks like Nutch/Hadoop is having trouble getting access to
your disk. Check permissions, disk space and whatever else you can think of.
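
If you want to rule out the native library itself, here is a quick sketch
(the class name NativeLibCheck is just something I made up for illustration).
On Windows that UnsatisfiedLinkError usually means the JVM cannot resolve
NativeIO's native methods, so you can try loading the "hadoop" library
directly and print java.library.path:

    // Minimal sketch: NativeIO$Windows.access0 is a native method, so the JVM
    // needs the Hadoop native library (hadoop.dll on Windows) somewhere on
    // java.library.path. If this small program fails the same way, the library
    // is the likely problem rather than permissions or disk space.
    public class NativeLibCheck {
        public static void main(String[] args) {
            System.out.println("java.library.path = "
                    + System.getProperty("java.library.path"));
            try {
                System.loadLibrary("hadoop"); // resolves to hadoop.dll on Windows
                System.out.println("Hadoop native library loaded OK");
            } catch (UnsatisfiedLinkError e) {
                System.out.println("Could not load Hadoop native library: " + e);
            }
        }
    }

Compile and run it from the same Cygwin shell you use for Nutch
(javac NativeLibCheck.java && java NativeLibCheck), so it sees the same
environment that bin/nutch sees.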

M.
 
-----Original message-----
> From:Jamal, Sarfaraz <[email protected]>
> Sent: Monday 11th July 2016 22:46
> To: Nutch help <[email protected]>
> Subject: Running into an Issue
> 
> So I feel I have made some progress on Nutch.
> 
> However, I am now getting another error that I am having difficulty 
> working through:
> 
> bin/nutch inject TestCrawl/crawldb url
> 
> produces the output below.
> 
> Do you have to run Cygwin under Admin for it to work?
> 
> Injector: Converting injected urls to crawl db entries.
> Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
>         at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
>         at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:570)
>         at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
>         at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:173)
>         at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:160)
>         at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:94)
>         at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:285)
>         at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
>         at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
>         at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
>         at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
>         at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:131)
>         at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:163)
>         at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:731)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:432)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Unknown Source)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
>         at org.apache.nutch.crawl.Injector.inject(Injector.java:376)
>         at org.apache.nutch.crawl.Injector.run(Injector.java:467)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.nutch.crawl.Injector.main(Injector.java:441)
> 
