[ https://issues.apache.org/jira/browse/PIG-343?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12620931#action_12620931 ]

Shravan Matthur Narayanamurthy commented on PIG-343:
----------------------------------------------------

This is a small problem in the logical-to-physical translator: POSplit assumes 
that the temporary storage file to which tuples will be dumped has been 
specified. I have a fix, and the above script works in my updated local copy of 
the trunk. However, I see that the unit tests on the trunk fail. Should I be 
using a different set of unit tests? I have corrected some golden files and am 
attaching the patch. The main fix is in LogToPhyTranslationVisitor; the other 
changes are just to get some unit tests passing.
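The failure mode described above can be sketched in miniature. This is a hypothetical, self-contained illustration, not actual Pig source: the class and field names (`SplitOperator`, `splitStore`, `translateFixed`, `getJobConf`) are stand-ins for POSplit's intermediate-store field, the translation step in LogToPhyTranslationVisitor, and the dereference in JobControlCompiler.getJobConf that throws the NullPointerException in the stack trace below.

```java
// Hypothetical sketch of the bug and fix (names are illustrative,
// not the real Pig APIs).
class FileSpec {
    final String fileName;
    FileSpec(String fileName) { this.fileName = fileName; }
}

class SplitOperator {
    // Where the split's intermediate tuples are dumped; the buggy
    // translation leaves this null.
    FileSpec splitStore;
}

class SplitTranslationSketch {
    // Buggy translation: never assigns the temporary store.
    static SplitOperator translateBuggy() {
        return new SplitOperator();
    }

    // Fixed translation: assign a temporary storage file during
    // logical-to-physical translation, as the comment describes.
    static SplitOperator translateFixed(String tmpPath) {
        SplitOperator op = new SplitOperator();
        op.splitStore = new FileSpec(tmpPath);
        return op;
    }

    // Stand-in for the job-compilation step that dereferences the spec.
    static String getJobConf(SplitOperator op) {
        return op.splitStore.fileName;   // NPE if splitStore was never set
    }

    public static void main(String[] args) {
        try {
            getJobConf(translateBuggy());
        } catch (NullPointerException e) {
            System.out.println("NPE: split store was never assigned");
        }
        System.out.println(getJobConf(translateFixed("/tmp/temp-split-1")));
    }
}
```

Running the sketch first hits the NullPointerException on the untranslated operator, then succeeds once the temporary file is assigned, mirroring the before/after behavior of the patch.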

> Simple script with SPLIT fails
> ------------------------------
>
>                 Key: PIG-343
>                 URL: https://issues.apache.org/jira/browse/PIG-343
>             Project: Pig
>          Issue Type: Bug
>    Affects Versions: types_branch
>            Reporter: Olga Natkovich
>            Assignee: Shravan Matthur Narayanamurthy
>             Fix For: types_branch
>
>         Attachments: 343.patch
>
>
> Script:
> grunt> A = load '/user/pig/tests/data/singlefile/studenttab10k' as (name, age, gpa);
> grunt> split A into X if age > 19, Y if age <= 19;
> grunt> store X into 'X';
> Stack:
> 08/07/28 11:46:28 WARN pig.PigServer: bytearray is implicitly casted to integer under LOGreaterThan Operator
> 08/07/28 11:46:29 ERROR grunt.GruntParser: java.io.IOException: Unable to store for alias: X [null]
>         at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.execute(HExecutionEngine.java:285)
>         at org.apache.pig.PigServer.execute(PigServer.java:494)
>         at org.apache.pig.PigServer.store(PigServer.java:333)
>         at org.apache.pig.PigServer.store(PigServer.java:319)
>         at org.apache.pig.tools.grunt.GruntParser.processStore(GruntParser.java:189)
>         at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:342)
>         at org.apache.pig.tools.grunt.GruntParser.parseContOnError(GruntParser.java:92)
>         at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:58)
>         at org.apache.pig.Main.main(Main.java:278)
> Caused by: org.apache.pig.backend.executionengine.ExecException
>         ... 9 more
> Caused by: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobCreationException
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:159)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:104)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:53)
>         at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.execute(HExecutionEngine.java:275)
>         ... 8 more
> Caused by: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobCreationException
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:159)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:146)
>         ... 11 more
> Caused by: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobCreationException
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJobConf(JobControlCompiler.java:291)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:131)
>         ... 12 more
> Caused by: java.lang.NullPointerException
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJobConf(JobControlCompiler.java:243)
>         ... 13 more

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.