Interesting, because I am able to reproduce your error by passing
<jar1>::<jar2> to -Dpig.additional.jars. Setting PIG_CLASSPATH itself with
:: seems fine, but passing it to -Dpig.additional.jars is not: Pig splits
that property on colons, so the empty entry between the double colons
becomes an empty path.
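
For reference, this is the minimal command I used to reproduce it (the jar
paths and script name here are just placeholders; any two local jars will
do):

    pig -Dpig.additional.jars=/tmp/a.jar::/tmp/b.jar myscript.pig

This fails with the same "Can not create a Path from an empty string"
error.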

If you remove the double colon and still run into a failure, can you
double-check whether you're seeing the same stack trace or a different
one? You shouldn't see the same error any more.
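
A quick way to guard against this is to squeeze out empty entries before
handing the list to Pig, e.g. in your startup script (a sketch; variable
names as in your mail):

    # collapse runs of colons and strip leading/trailing ones
    PIG_CLASSPATH=$(echo "$PIG_CLASSPATH" | sed 's/::*/:/g; s/^://; s/:$//')
    exec bin/pig -Dpig.additional.jars=$PIG_CLASSPATH "$@"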

Cheolsoo

On Mon, Nov 12, 2012 at 9:59 AM, Michał Czerwiński <[email protected]> wrote:

> Nice spot, but it seems that's not the problem... I am able to run this
> successfully on pig 0.9 and 0.10. Any ideas how I can debug the problem
> further?
>
> I also double checked all the variables, seems to be fine:
>
> PIG_CLASSPATH=/opt/hcat/share/hcatalog/hcatalog-0.4.0.jar:/usr/lib/hive/conf:/usr/lib/hadoop-0.20/conf
>
> /usr/lib/hive/conf and /usr/lib/hadoop-0.20/conf also exist.
>
> Thanks.
>
> On 12 November 2012 17:37, Cheolsoo Park <[email protected]> wrote:
>
> > Hi Michal,
> >
> > Caused by: java.lang.IllegalArgumentException: Can not create a Path from an empty string
> >     at org.apache.hadoop.fs.Path.checkPathArg(Path.java:82)
> >     at org.apache.hadoop.fs.Path.<init>(Path.java:90)
> >     at org.apache.hadoop.fs.Path.<init>(Path.java:45)
> > Your error message indicates that there is a typo somewhere in your
> > paths. I believe that your PIG_CLASSPATH is the problem:
> >
> > PIG_CLASSPATH=$HCAT_HOME/share/hcatalog/hcatalog-0.4.0.jar::$HIVE_HOME/conf:$HADOOP_HOME/conf
> >
> > You have a double colon (::) in the middle, and the empty entry
> > between the colons will be interpreted as an empty path.
> >
> > Thanks,
> > Cheolsoo
> >
> > On Mon, Nov 12, 2012 at 8:47 AM, Michał Czerwiński <[email protected]> wrote:
> >
> > > I am trying to use pig 0.11 and pig trunk (currently 0.12) because
> > > pig 0.10 seems to be having issues with Python UDFs...
> > >
> > > According to this
> > > http://www.mail-archive.com/[email protected]/msg05837.html
> > >
> > > " after replacing pig.jar and pig-withouthadoop.jar with the
> > > 0.11 ones from the svn trunk, they work like a charm."
> > >
> > > Well this is clearly not the case for me...
> > >
> > > The error I get is:
> > >
> > > Pig Stack Trace
> > > ---------------
> > > ERROR 2017: Internal error creating job configuration.
> > >
> > > org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias ll
> > >     at org.apache.pig.PigServer.openIterator(PigServer.java:841)
> > >     at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:696)
> > >     at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
> > >     at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
> > >     at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
> > >     at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
> > >     at org.apache.pig.Main.run(Main.java:535)
> > >     at org.apache.pig.Main.main(Main.java:154)
> > >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >     at java.lang.reflect.Method.invoke(Method.java:597)
> > >     at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
> > > Caused by: org.apache.pig.PigException: ERROR 1002: Unable to store alias ll
> > >     at org.apache.pig.PigServer.storeEx(PigServer.java:940)
> > >     at org.apache.pig.PigServer.store(PigServer.java:903)
> > >     at org.apache.pig.PigServer.openIterator(PigServer.java:816)
> > >     ... 12 more
> > > Caused by: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobCreationException: ERROR 2017: Internal error creating job configuration.
> > >     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:848)
> > >     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:294)
> > >     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:177)
> > >     at org.apache.pig.PigServer.launchPlan(PigServer.java:1269)
> > >     at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1254)
> > >     at org.apache.pig.PigServer.storeEx(PigServer.java:936)
> > >     ... 14 more
> > > Caused by: java.lang.IllegalArgumentException: Can not create a Path from an empty string
> > >     at org.apache.hadoop.fs.Path.checkPathArg(Path.java:82)
> > >     at org.apache.hadoop.fs.Path.<init>(Path.java:90)
> > >     at org.apache.hadoop.fs.Path.<init>(Path.java:45)
> > >     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.shipToHDFS(JobControlCompiler.java:1455)
> > >     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.putJarOnClassPathThroughDistributedCache(JobControlCompiler.java:1432)
> > >     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:508)
> > >     ... 19 more
> > >
> > > ================================================================================
> > >
> > > I am using the following startup script:
> > >
> > > export HADOOP_HOME=/usr/lib/hadoop-0.20
> > > export HCAT_HOME=/opt/hcat
> > > export HIVE_HOME=/usr/lib/hive
> > >
> > > PIG_CLASSPATH=$HCAT_HOME/share/hcatalog/hcatalog-0.4.0.jar::$HIVE_HOME/conf:$HADOOP_HOME/conf
> > >
> > > for file in $HIVE_HOME/lib/*.jar; do
> > >     echo "==> Adding $file"
> > >     PIG_CLASSPATH=$PIG_CLASSPATH:$file
> > > done
> > >
> > > export PIG_OPTS=-Dhive.metastore.uris=thrift://appserver.hadoop.staging.qutics.com:10002
> > > export JAVA_HOME=$(readlink -f /usr/bin/java | sed "s:bin/java::")
> > > exec bin/pig -Dpig.additional.jars=$PIG_CLASSPATH "$@"
> > >
> > > Any clues?
> > > Thank you!
> > >
> >
>
