Well, reverting back to Java 8... and it all works now.
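For anyone who lands on this thread later: the failures below are probably down to a JDK 9 locale-data change (the default switched to CLDR, which altered the SHORT date-time format) rather than anything Nutch-specific. A minimal sketch — my guess at the mechanism, not verified against Nutch's build.xml — showing that the offending date string parses fine on any JDK once you pin an explicit pattern and locale:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Locale;

public class DateParseCheck {
    public static void main(String[] args) throws ParseException {
        // JDK 9 uses CLDR locale data by default, so the locale's SHORT
        // date-time format can differ from older JDKs (e.g. it may insert
        // a comma: "1/25/71, 2:00 PM"), making "01/25/1971 2:00 pm" fail
        // to parse under the default format. An explicit pattern plus
        // Locale.US side-steps the locale data entirely:
        SimpleDateFormat fmt =
            new SimpleDateFormat("MM/dd/yyyy h:mm a", Locale.US);
        System.out.println(fmt.parse("01/25/1971 2:00 pm"));
    }
}
```

SimpleDateFormat matches the am/pm marker case-insensitively, so the lowercase "pm" is not the problem; the locale-dependent default format is.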

On Wed, Feb 21, 2018 at 2:06 AM, BlackIce <[email protected]> wrote:

> So, now when using the crawl script I get this, and it happens with both
> my Java 9 compiled build and the binary distribution:
>
> Injecting seed URLs
> /home/nutch/nutch2/bin/nutch inject searchcrawl//crawldb urls/
> Injector: starting at 2018-02-21 02:02:16
> Injector: crawlDb: searchcrawl/crawldb
> Injector: urlDir: urls
> Injector: Converting injected urls to crawl db entries.
> WARNING: An illegal reflective access operation has occurred
> WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/nutch/nutch2/lib/hadoop-auth-2.7.4.jar) to method sun.security.krb5.Config.getInstance()
> WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
> WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
> WARNING: All illegal access operations will be denied in a future release
> Injector: java.lang.NullPointerException
>         at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getBlockIndex(FileInputFormat.java:444)
>         at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:413)
>         at org.apache.hadoop.mapreduce.lib.input.DelegatingInputFormat.getSplits(DelegatingInputFormat.java:115)
>         at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:301)
>         at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:318)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
>         at java.base/java.security.AccessController.doPrivileged(Native Method)
>         at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
>         at org.apache.nutch.crawl.Injector.inject(Injector.java:417)
>         at org.apache.nutch.crawl.Injector.run(Injector.java:563)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.nutch.crawl.Injector.main(Injector.java:528)
>
> Error running:
>   /home/nutch/nutch2/bin/nutch inject searchcrawl//crawldb urls/
> Failed with exit value 255.
>
>
> I'm thinking about going back to Java 8 for now... but it's probably
> something silly due to it being late....
>
> On Wed, Feb 21, 2018 at 1:37 AM, BlackIce <[email protected]> wrote:
>
>> I commented out the date, and now, after a whole lot of warnings, it says
>> Build Successful.
>>
>> I'm gonna take it for a short spin before I set up Solr...
>>
>>
>>
>> On Wed, Feb 21, 2018 at 12:01 AM, Markus Jelsma <
>> [email protected]> wrote:
>>
>>> Hello,
>>>
>>> Well, this is interesting! Have you tried Java 8 instead? I don't think
>>> Java 9 should cause these kinds of problems, but I haven't tried it yet and
>>> would like to know either way.
>>>
>>> Regarding commenting out the date, try it anyway!
>>>
>>> Regards,
>>> Markus
>>>
>>> -----Original message-----
>>> > From:BlackIce <[email protected]>
>>> > Sent: Tuesday 20th February 2018 22:40
>>> > To: [email protected]
>>> > Subject: Nutch fails to compile...
>>> >
>>> > Hi,
>>> >
>>> > I've got a strange problem. When trying to run Ant on Nutch 1.14, at
>>> > first it gave me an error regarding Sonar, so I commented Sonar out in
>>> > build.xml, and now I'm left with:
>>> >
>>> > BUILD FAILED
>>> > /home/nutch/nutch/build.xml:79: Unparseable date: "01/25/1971 2:00 pm"
>>> >
>>> > I'm on JDK 9 and Ubuntu Server 16.04.
>>> >
>>> > Do I just comment this date out?
>>> >
>>> >
>>> > Thnx
>>> >
>>> > RRK
>>>
>>
>>
>
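A closing note for the archive: rather than commenting out the date on build.xml line 79, the parse failure can likely be avoided by giving Ant's `touch` task an explicit `pattern` attribute (supported since Ant 1.6.3), so the `datetime` string no longer depends on the JVM's default locale format. A hypothetical sketch — the actual task and fileset in Nutch's build.xml may differ:

```xml
<!-- Hypothetical fragment; adapt to whatever task sits on build.xml line 79. -->
<!-- pattern= pins the SimpleDateFormat used to parse datetime, independent
     of the JDK's locale data (which changed in JDK 9). -->
<touch datetime="01/25/1971 2:00 pm" pattern="MM/dd/yyyy h:mm a">
  <fileset dir="src"/>
</touch>
```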
