0.23 has the json-simple jar in its Hadoop installation, so there is no
problem there. When you move to Hadoop 2.x you will hit the same problem,
as json-simple is not in the 2.x Hadoop installation. I actually hit the
error you mentioned with 1.x when I was trying 2.x.
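As a minimal sketch of the PIG_CLASSPATH workaround from earlier in the thread (the jar path is an assumption; point it at wherever json-simple-1.1.jar actually lives on your machine):

```shell
# Assumed location of the jar -- adjust for your install.
JSON_SIMPLE_JAR=/usr/lib/pig/lib/json-simple-1.1.jar

# Put the jar on Pig's own classpath in addition to REGISTERing it in the
# script, so the front-end classloader can also resolve ParseException.
export PIG_CLASSPATH="${PIG_CLASSPATH}:${JSON_SIMPLE_JAR}"
echo "$PIG_CLASSPATH"
```

The script itself still needs `REGISTER json-simple-1.1.jar;` -- the export only fixes the front-end classloader, not the registered-jar classloader.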

Regards,
Rohini


On Mon, Sep 30, 2013 at 6:09 PM, j.barrett Strausser <
j.barrett.straus...@gmail.com> wrote:

> I ended up just using the 0.23.9 Hadoop release without any issue.
>
> On Mon, Sep 30, 2013 at 8:54 PM, Rohini Palaniswamy <rohini.adi...@gmail.com> wrote:
>
> >  It hits this error when json-simple-1.1.jar is not in the classpath.
> > You can get around that by adding it to PIG_CLASSPATH apart from
> > registering the jar. The problem is with Java classloading, which fails
> > to load the exception class (ParseException) thrown by a constructor of
> > the class (AvroStorage) from the custom classloader that includes the
> > registered jars. Can you file a JIRA for this? I had spent some time
> > earlier trying to find a solution but couldn't. If fixing the
> > classloading is not possible, an easy fix would be to change the
> > AvroStorage constructor to throw RuntimeException instead of
> > ParseException.
> >
> > Regards,
> > Rohini
> >
> >
> > On Thu, Sep 19, 2013 at 1:10 PM, j.barrett Strausser <
> > j.barrett.straus...@gmail.com> wrote:
> >
> > > Not my day I guess.
> > >
> > > Trying with Hadoop 1.2.x
> > >
> > > Getting :
> > >
> > >
> > > Caused by: java.lang.RuntimeException: could not instantiate
> > > 'org.apache.pig.piggybank.storage.avro.AvroStorage' with arguments 'null'
> > >         at org.apache.pig.impl.PigContext.instantiateFuncFromSpec(PigContext.java:618)
> > >
> > > Caused by: java.lang.NoClassDefFoundError: org/json/simple/parser/ParseException
> > >         at java.lang.Class.getDeclaredConstructors0(Native Method)
> > >
> > >
> > >
> > > When I attempt to load the relation using:
> > >
> > > records = LOAD 'path' USING org.apache.pig.piggybank.storage.avro.AvroStorage();
> > >
> > > I've registered: json-simple-1.1.jar
> > >
> > >
> > > On Thu, Sep 19, 2013 at 3:14 PM, j.barrett Strausser <
> > > j.barrett.straus...@gmail.com> wrote:
> > >
> > > > Are the releases from the download page not compatible with 0.23.x or
> > > > 2.x? It says they are:
> > > >
> > > > http://pig.apache.org/releases.html#1+April%2C+2013%3A+release+0.11.1+available
> > > >
> > > > In any case, I tried it with 0.23.9 and received a different error:
> > > >
> > > > 2013-09-19 15:13:56,044 [main] WARN
> > > > org.apache.pig.backend.hadoop20.PigJobControl - falling back to default
> > > > JobControl (not using hadoop 0.20 ?)
> > > > java.lang.NoSuchFieldException: runnerState
> > > >     at java.lang.Class.getDeclaredField(Class.java:1938)
> > > >
> > > > On Thu, Sep 19, 2013 at 2:24 PM, Mark Wagner <wagner.mar...@gmail.com> wrote:
> > > >
> > > >> It sounds like you're using a version of Pig that wasn't compiled for
> > > >> Hadoop 2.x/.23. Try recompiling with 'ant clean jar
> > > >> -Dhadoopversion=23'.
> > > >>
> > > >> -Mark
> > > >>
> > > >> On Thu, Sep 19, 2013 at 9:23 AM, j.barrett Strausser
> > > >> <j.barrett.straus...@gmail.com> wrote:
> > > >> > Running
> > > >> >
> > > >> > Hadoop-2.1.0-Beta
> > > >> > Pig-0.11.1
> > > >> > Hive-0.11.1
> > > >> >
> > > >> > 1. Created Avro backed table in Hive.
> > > >> > 2. Loaded the table in Pig - records = Load '/path' USING
> > > >> > org.apache.pig.piggybank.storage.avro.AvroStorage();
> > > >> > 3. Can successfully describe the relation.
> > > >> >
> > > >> > I registered the following on pig start :
> > > >> > REGISTER piggybank.jar
> > > >> > REGISTER avro-*.jar
> > > >> > REGISTER jackson-core-asl-1.8.8.jar
> > > >> > REGISTER jackson-mapper-asl-1.8.8.jar
> > > >> > REGISTER json-simple-1.1.jar
> > > >> > REGISTER snappy-java-1.0.3.2.jar
> > > >> >
> > > >> > The avro tools are at 1.7.5
> > > >> >
> > > >> > Running Dump produces the following:
> > > >> >
> > > >> > 2013-09-19 12:08:21,639 [JobControl] ERROR
> > > >> > org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl - Error while
> > > >> > trying to run jobs.
> > > >> > java.lang.IncompatibleClassChangeError: Found interface
> > > >> > org.apache.hadoop.mapreduce.JobContext, but class was expected
> > > >> >     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
> > > >> >     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
> > > >> >     at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:441)
> > > >> >     at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:340)
> > > >> >
> > > >> >
> > > >> >
> > > >> > 2013-09-19 12:08:21,651 [main] ERROR
> > > >> > org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to
> > > >> > recreate exception from backend error: Unexpected System Error Occured:
> > > >> > java.lang.IncompatibleClassChangeError: Found interface
> > > >> > org.apache.hadoop.mapreduce.JobContext, but class was expected
> > > >> >     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
> > > >> >     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
> > > >> >     at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:441)
> > > >> >     at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:340)
> > > >> >
> > > >> > Running illustrate:
> > > >> >
> > > >> > Pig Stack Trace
> > > >> > ---------------
> > > >> > ERROR 1070: Could not resolve
> > > >> > org.apache.pig.piggybank.storage.avro.AvroStorage using imports:
> > > >> > [, org.apache.pig.builtin., org.apache.pig.impl.builtin.]
> > > >> >
> > > >> > Pig Stack Trace
> > > >> > ---------------
> > > >> > ERROR 2998: Unhandled internal error.
> > > >> > java.lang.NoSuchMethodError:
> > > >> > org.apache.hadoop.mapreduce.Mapper$Context.<init>(Lorg/apache/hadoop/mapreduce/Mapper;Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/mapreduce/TaskAttemptID;Lorg/apach$
> > > >> >
> > > >> >
> > > >> > Any thoughts?
> > > >> >
> > > >> > --
> > > >> >
> > > >> >
> > > >> > https://github.com/bearrito
> > > >> > @deepbearrito
> > > >>
> > > >
> > > >
> > > >
> > > > --
> > > >
> > > >
> > > > https://github.com/bearrito
> > > > @deepbearrito
> > > >
> > >
> > >
> > >
> > > --
> > >
> > >
> > > https://github.com/bearrito
> > > @deepbearrito
> > >
> >
>
>
>
> --
>
>
> https://github.com/bearrito
> @deepbearrito
>
