I had problems with the GCJ JVM too (I am on CentOS). They all went away
after I switched to the Sun JVM.
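
In case it helps, a quick way to confirm which JVM Nutch is actually
picking up is to compile and run something like the small sketch below
with the same java binary that bin/nutch uses. This is just my own
illustration (not from the Nutch docs); it only reads standard system
properties, so the class name is arbitrary.

    // JvmCheck.java - print which JVM is actually running, to see
    // whether you are on GCJ or on the Sun JVM.
    public class JvmCheck {
        public static void main(String[] args) {
            // Standard system properties: GCJ reports something like
            // "GNU libgcj" here, while Sun's JVM reports "Java HotSpot(TM) ...".
            System.out.println("java.vm.name   = " + System.getProperty("java.vm.name"));
            System.out.println("java.vm.vendor = " + System.getProperty("java.vm.vendor"));
            System.out.println("java.version   = " + System.getProperty("java.version"));
        }
    }

If that prints a libgcj/GNU VM, pointing JAVA_HOME at a Sun JDK before
running the crawl was what fixed it for me.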


On Fri, 2008-06-20 at 12:38 -0700, Winton Davies wrote:
> Is this article the explanation here? I'm on Fedora Core 8 with a 
> yum-installed java-1.5.0, and hit exactly the same error the author 
> mentions. If so, does anyone have suggestions about which yum package 
> I should use instead?
> 
> "Currently Nutch works only with certified JVM implementations, such 
> as  Sun or IBM or BEA. GCJ is not supported."
> 
> http://mail-archives.apache.org/mod_mbox/lucene-nutch-user/200709.mbox/[EMAIL PROTECTED]
> 
> crawl started in: wikidb
> rootUrlDir = wikiurls
> threads = 5
> depth = 1
> topN = 1
> Injector: starting
> Injector: crawlDb: wikidb/crawldb
> Injector: urlDir: wikiurls
> Injector: Converting injected urls to crawl db entries.
> Exception in thread "main" java.lang.IllegalStateException
>     at java.nio.charset.CharsetEncoder.encode(libgcj.so.8rh)
>     at org.apache.hadoop.io.Text.encode(Text.java:375)
>     at org.apache.hadoop.io.Text.encode(Text.java:356)
>     at org.apache.hadoop.io.Text.writeString(Text.java:396)
>     at org.apache.hadoop.mapred.JobClient$RawSplit.write(JobClient.java:428)
>     at org.apache.hadoop.mapred.JobClient.writeSplitsFile(JobClient.java:457)
>     at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:358)
>     at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:543)
>     at org.apache.nutch.crawl.Injector.inject(Injector.java:162)
>     at org.apache.nutch.crawl.Crawl.main(Crawl.java:115)
> 
> 
> ./cry
> 
> Winton
