[ https://issues.apache.org/jira/browse/PIG-1636?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Daniel Dai resolved PIG-1636.
-----------------------------

    Hadoop Flags: [Reviewed]
      Resolution: Fixed

Patch committed to both trunk and 0.8 branch.

> Scalar fail if the scalar variable is generated by limit
> --------------------------------------------------------
>
>                 Key: PIG-1636
>                 URL: https://issues.apache.org/jira/browse/PIG-1636
>             Project: Pig
>          Issue Type: Bug
>          Components: impl
>    Affects Versions: 0.8.0
>            Reporter: Daniel Dai
>            Assignee: Daniel Dai
>             Fix For: 0.8.0
>
>         Attachments: PIG-1636-1.patch
>
>
> The following script fails:
> {code}
> a = load 'studenttab10k' as (name: chararray, age: int, gpa: float);
> b = group a all;
> c = foreach b generate SUM(a.age) as total;
> c1= limit c 1;
> d = foreach a generate name, age/(double)c1.total as d_sum;
> store d into '111';
> {code}
> The problem is that d holds a scalar reference to c1. The optimizer pushes the
> limit ahead of the foreach, but d still references the limit operator, so we
> end up with the wrong schema for the scalar.
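>
> Note: the following is only a sketch of a possible workaround, not part of the
> committed patch or the original report. Since b is grouped with "all", c already
> contains a single row, so the scalar can reference c directly and the limit can
> be dropped entirely:
> {code}
> a = load 'studenttab10k' as (name: chararray, age: int, gpa: float);
> b = group a all;
> c = foreach b generate SUM(a.age) as total;
> -- c has exactly one row, so c.total is a valid scalar reference
> d = foreach a generate name, age/(double)c.total as d_sum;
> store d into '111';
> {code}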

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
