The problem was my fault. I got the wrong PiggyBank version out of the
repository. I managed to get it working with the correct version this
morning.

Thanks for taking a look for me.
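[Editor's note] For anyone who hits the same ClassCastException: the fix in this thread was to rebuild piggybank.jar from the same branch as the Pig deployed on the cluster (here, 0.6) rather than from trunk. A minimal build sketch follows; the SVN URL reflects Pig's pre-TLP location under hadoop/ at the time and the paths are assumptions, so adjust them for your checkout.

```shell
# Check out the Pig 0.6 branch (pre-TLP SVN path; may have moved since 2010)
svn co http://svn.apache.org/repos/asf/hadoop/pig/branches/branch-0.6 pig-0.6
cd pig-0.6
ant jar                        # build pig.jar for the 0.6 branch
cd contrib/piggybank/java
ant                            # build piggybank.jar against that pig.jar
```

Registering the resulting contrib/piggybank/java/piggybank.jar, instead of a trunk build, resolved the ClassCastException seen in SliceWrapper.readFields below.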

On Sun, Jan 10, 2010 at 6:06 PM, Dmitriy Ryaboy <[email protected]> wrote:

> Jeff, I am unable to reproduce this error using pig compiled from the
> current top of the 0.6 branch and the script you provided. Are you
> sure 0.6 is what you are actually using? It hasn't been released yet.
> Do you know what svn revision the jar was compiled from?
>
> -D
>
> On Sat, Jan 9, 2010 at 5:07 PM, Dmitriy Ryaboy <[email protected]> wrote:
> > Jeff,
> > I'll check it out this weekend.
> >
> > -D
> >
> >> On Sat, Jan 9, 2010 at 3:47 PM, Jeff Dalton <[email protected]> wrote:
> >> I downloaded the version of PiggyBank from the 0.6 branch, compiled,
> >> and deployed it.  However, I still get the same error message:
> >>
> >> java.lang.ClassCastException: org.apache.pig.ExecType cannot be cast
> >> to org.apache.pig.impl.PigContext
> >>        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.SliceWrapper.readFields(SliceWrapper.java:168)
> >>        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:333)
> >>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
> >>        at org.apache.hadoop.mapred.Child.main(Child.java:159)
> >>
> >> I'll try again later, but if anyone has any insights, I would
> >> appreciate the help.
> >>
> >> Thanks,
> >>
> >> - Jeff
> >>
> >> On Sat, Jan 9, 2010 at 6:33 PM, Jeff Dalton <[email protected]> wrote:
> >>> Ahh, the PiggyBank version was the latest from Trunk.  I probably need
> >>> to go track down the version from the 0.6 branch.
> >>>
> >>> On Sat, Jan 9, 2010 at 6:26 PM, Dmitriy Ryaboy <[email protected]> wrote:
> >>>> When you say that the code is from SVN, do you mean trunk, or the 0.6 branch?
> >>>>
> >>>>
> >>>> On Sat, Jan 9, 2010 at 3:22 PM, Jeff Dalton <[email protected]> wrote:
> >>>>> A cluster I'm using was recently upgraded to PIG 0.6.  Since then,
> >>>>> I've been having problems with scripts that use PiggyBank functions.
> >>>>> All the map jobs for the script fail with:
> >>>>> WARN org.apache.hadoop.mapred.Child: Error running child
> >>>>> java.lang.ClassCastException: org.apache.pig.ExecType cannot be cast
> >>>>> to org.apache.pig.impl.PigContext
> >>>>>        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.SliceWrapper.readFields(SliceWrapper.java:168)
> >>>>>        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:333)
> >>>>>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
> >>>>>        at org.apache.hadoop.mapred.Child.main(Child.java:159)
> >>>>> INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task
> >>>>>
> >>>>> I compiled the PiggyBank jar using the latest code from SVN (as of
> >>>>> Jan 9) and Pig 0.6.  Below I've included a simple example program
> >>>>> that triggers the error: it reads a text file of words and
> >>>>> lowercases them.
> >>>>>
> >>>>> register ./piggybank.jar
> >>>>> DEFINE ToLower org.apache.pig.piggybank.evaluation.string.LOWER();
> >>>>> words = LOAD './data/headwords_sample' USING PigStorage() as (word:charArray);
> >>>>> lowerCaseWords = FOREACH words GENERATE ToLower(word) as word;
> >>>>> STORE lowerCaseWords into './tmp/cooc3' USING PigStorage();
> >>>>>
> >>>>> The Hadoop error isn't very informative about what is going on.  Am I
> >>>>> using a compatible version of PiggyBank?  What should I be doing
> >>>>> differently?
> >>>>>
> >>>>> Thanks,
> >>>>>
> >>>>> - Jeff
> >>>>>
> >>>>
> >>>
> >>
> >
>
