Hi,

Sorry to bother you guys again, but it seems that no matter what I do, I
can't run the new version of Nutch with Hadoop 0.20.

I am getting the following exceptions in my logs when I execute
bin/start-all.sh:

I don't know what to do! I've tried all kinds of things, but with no luck... :(

*hadoop-eran-jobtracker-master.log*
2009-12-09 12:04:53,965 FATAL mapred.JobTracker -
java.lang.SecurityException: sealing violation: can't seal package
org.mortbay.util: already loaded
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:235)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:56)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
    at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
    at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1610)
    at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:180)
    at org.apache.hadoop.mapred.JobTracker.startTracker(JobTracker.java:172)
    at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:3699)

*hadoop-eran-namenode-master.log*
2009-12-09 12:04:27,583 ERROR namenode.NameNode -
java.lang.SecurityException: sealing violation: can't seal package
org.mortbay.util: already loaded
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:235)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:56)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
    at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:220)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:202)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:279)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:956)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:965)
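
My best guess at this point is that two different Jetty jars end up on the
classpath (one from Hadoop's lib/ and another pulled in by Nutch), and the
sealed copy of org.mortbay.util gets loaded after classes from the other
jar, which would explain the "already loaded" sealing violation. Here is
the kind of check I have in mind to list every jar that ships that package
(the $HADOOP_HOME / $NUTCH_HOME paths are just guesses for my layout,
adjust as needed):

for f in $HADOOP_HOME/lib/*.jar $NUTCH_HOME/lib/*.jar; do
  # keep the jar if it contains classes from the sealed package
  unzip -l "$f" 2>/dev/null | grep -q 'org/mortbay/util/' && echo "$f"
done

If more than one jar turns up, removing or aligning the duplicate Jetty
version would be the obvious next thing to try.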

Thanks for trying to help,
Eran

On Sun, Dec 6, 2009 at 3:51 PM, Eran Zinman <[email protected]> wrote:

> Hi,
>
> Just upgraded to the latest version of Nutch with Hadoop 0.20.
>
> I'm getting the following exception in the namenode log and DFS doesn't
> start:
>
> 2009-12-06 15:48:32,523 ERROR namenode.NameNode -
> java.lang.SecurityException: sealing violation: can't seal package
> org.mortbay.util: already loaded
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:235)
>     at java.net.URLClassLoader.access$000(URLClassLoader.java:56)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
>     at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:220)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:202)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:279)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:956)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:965)
>
> Any help would be appreciated ... I'm quite stuck on this.
>
> Thanks,
> Eran
>
