The problem is that the Netty classes need to be accessible to the
tasks running in Hadoop. I think the Netty classes should be jarred
into penny.jar so that they are distributed properly, unless
someone else has a better idea.
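For concreteness, here's a minimal sketch of that approach: a jar is just a
zip of .class files, so you can unpack penny.jar and the netty jar into one
tree and repack them as a single archive that Hadoop then ships to the tasks.
The class files below are empty stand-ins for the demo (only
SimpleChannelHandler comes from the actual stack trace); in a real Pig tree
you would unpack the real penny.jar and netty-3.2.2.Final.jar instead.

```shell
set -e
work=$(mktemp -d)
cd "$work"

# Stand-in class trees for penny and netty. In practice these would come
# from unpacking the two real jars into the same directory.
mkdir -p merged/org/apache/pig/penny merged/org/jboss/netty/channel
touch merged/org/apache/pig/penny/Example.class
touch merged/org/jboss/netty/channel/SimpleChannelHandler.class

# Repack the combined tree into one jar. Tasks that load penny classes
# from this archive can now resolve the netty classes from it as well.
(cd merged && python3 -m zipfile -c ../penny-with-netty.jar org)

# The merged jar should list classes from both packages.
python3 -m zipfile -l penny-with-netty.jar
```

An alternative that avoids repacking might be to ship the netty jar through
Pig's pig.additional.jars property (or Hadoop's distributed cache), though I
haven't tried that with penny.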

ben

On Tue, Jul 26, 2011 at 10:02 AM, Doug Daniels <[email protected]> wrote:
> Hi,
>
> I'm trying to run the data sampler tool from the penny library, and am 
> getting a ClassNotFoundException for a netty class.  I'm using the trunk 
> version of pig, with the patch from PIG-2013 applied.
>
> I'm running a simple script that uses pig test data from 
> test/org/apache/pig/test/data/InputFiles/jsTst1.txt :
>
>    x = LOAD 'jsTst1.txt' USING PigStorage('\t');
>    x_filtered = FILTER x BY (int)$1 > 100;
>    STORE x_filtered INTO 'jsTst1Filtered';
>
> To run it, I tried the syntax from 
> https://cwiki.apache.org/confluence/display/PIG/PennyToolLibrary, but I was 
> getting a ClassNotFoundException on org.jboss.netty.channel.ChannelFactory 
> before the job even started running.  I added the netty-3.2.2.Final.jar from 
> pig's ivy libs to the -cp list, which fixed that ClassNotFoundException, but 
> left me with a new one after the job started:
>
> 11/07/26 16:44:13 WARN mapReduceLayer.Launcher: There is no log file to write to.
> 11/07/26 16:44:13 ERROR mapReduceLayer.Launcher: Backend error message
>
> Error: java.lang.ClassNotFoundException: org.jboss.netty.channel.SimpleChannelHandler
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>     at org.apache.pig.penny.impl.harnesses.MonitorAgentHarness.initialize(MonitorAgentHarness.java:229)
>     at org.apache.pig.penny.impl.pig.MonitorAgentUDF.init(MonitorAgentUDF.java:61)
>     at org.apache.pig.penny.impl.pig.MonitorAgentUDF.exec(MonitorAgentUDF.java:72)
>     at org.apache.pig.penny.impl.pig.MonitorAgentUDF.exec(MonitorAgentUDF.java:37)
>     at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POUserFunc.getNext(POUserFunc.java:216)
>     at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POUserFunc.getNext(POUserFunc.java:258)
>     at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.getNext(PhysicalOperator.java:316)
>     at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.processPlan(POForEach.java:332)
>     at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.getNext(POForEach.java:284)
>     at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:290)
>     at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POFilter.getNext(POFilter.java:95)
>     at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:290)
>     at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.getNext(POForEach.java:233)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.runPipeline(PigGenericMapBase.java:267)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:262)
>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:64)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:621)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
>     at org.apache.hadoop.mapred.Child.main(Child.java:170)
>
> Should I be running penny in a different way?
>
> Thanks,
> Doug
