GitHub user kevinyu98 opened a pull request:

    https://github.com/apache/spark/pull/12893

    [SPARK-15051] [SQL] Create a TypedColumn alias

    ## What changes were proposed in this pull request?
    
    Currently, creating an alias on an aggregator TypedColumn falls back to the
alias function inherited from Column, which returns a plain Column wrapping a
TypedAggregateExpression. That expression is unresolved because its
inputDeserializer is not defined, and the aggregation path only injects the
inputDeserializer when the column is a TypedColumn. As a result the
TypedAggregateExpression stays unresolved, which causes the problem reported in
[SPARK-15051](https://issues.apache.org/jira/browse/SPARK-15051?jql=project%20%3D%20SPARK).
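
    For concreteness, here is a minimal sketch of the kind of query that hits
this; the aggregator, session setup, and all names are illustrative and not
taken from this PR:

    ```scala
    import org.apache.spark.sql.{Encoder, Encoders, SparkSession}
    import org.apache.spark.sql.expressions.Aggregator

    // Hypothetical aggregator, used only to illustrate the failure mode.
    object SumValue extends Aggregator[(String, Int), Long, Long] {
      def zero: Long = 0L
      def reduce(b: Long, a: (String, Int)): Long = b + a._2
      def merge(b1: Long, b2: Long): Long = b1 + b2
      def finish(r: Long): Long = r
      def bufferEncoder: Encoder[Long] = Encoders.scalaLong
      def outputEncoder: Encoder[Long] = Encoders.scalaLong
    }

    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    val ds = Seq(("a", 1), ("a", 2), ("b", 3)).toDS()

    // Before this patch, .name(...) is Column's alias and returns a plain
    // Column, so the select path cannot inject the inputDeserializer and
    // analysis fails with an unresolved TypedAggregateExpression.
    ds.select(SumValue.toColumn.name("total")).show()
    ```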
    
    
    This PR proposes giving TypedColumn its own alias function that returns a
TypedColumn, so that when the aliased column is used in an aggregation the
aggregator path can still inject the inputDeserializer.
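
    Roughly, the shape of the change is an override inside TypedColumn; the
snippet below is a sketch of the idea, not the exact patch (see the diff for
the real change):

    ```scala
    // Sketch only, inside org.apache.spark.sql.TypedColumn (Column.scala),
    // where `expr` and the private[sql] `encoder` are in scope and catalyst's
    // Alias is already imported. Aliasing keeps the TypedColumn type (and its
    // encoder) instead of degrading to a plain Column, so the typed
    // aggregation path can still inject the inputDeserializer later.
    override def name(alias: String): TypedColumn[T, U] =
      new TypedColumn[T, U](Alias(expr, alias)(), encoder)
    ```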
     
    ## How was this patch tested?
    
    
    Added test cases to DatasetAggregatorSuite.scala and ran the related SQL
queries against this patch.
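
    The added coverage is along these lines (an illustrative shape reusing the
hypothetical SumValue aggregator from above, not the verbatim test):

    ```scala
    test("SPARK-15051: alias on a typed aggregate column") {
      val ds = Seq(("a", 1), ("a", 2), ("b", 3)).toDS()
      val aliased = ds.select(SumValue.toColumn.name("total"))
      // With the TypedColumn-preserving alias the plan resolves and the
      // output column keeps the alias.
      assert(aliased.columns.toSeq === Seq("total"))
      assert(aliased.collect().toSeq === Seq(6L))
    }
    ```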


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/kevinyu98/spark spark-15051

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/12893.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #12893
    
----
commit 3b44c5978bd44db986621d3e8511e9165b66926b
Author: Kevin Yu <[email protected]>
Date:   2016-04-20T18:06:30Z

    adding testcase

commit 18b4a31c687b264b50aa5f5a74455956911f738a
Author: Kevin Yu <[email protected]>
Date:   2016-04-22T21:48:00Z

    Merge remote-tracking branch 'upstream/master'

commit 4f4d1c8f2801b1e662304ab2b33351173e71b427
Author: Kevin Yu <[email protected]>
Date:   2016-04-23T16:50:19Z

    Merge remote-tracking branch 'upstream/master'
    get latest code from upstream

commit f5f0cbed1eb5754c04c36933b374c3b3d2ae4f4e
Author: Kevin Yu <[email protected]>
Date:   2016-04-23T22:20:53Z

    Merge remote-tracking branch 'upstream/master'
    adding trim characters support

commit d8b2edbd13ee9a4f057bca7dcb0c0940e8e867b8
Author: Kevin Yu <[email protected]>
Date:   2016-04-25T20:24:33Z

    Merge remote-tracking branch 'upstream/master'
    get latest code for pr12646

commit 196b6c66b0d55232f427c860c0e7c6876c216a67
Author: Kevin Yu <[email protected]>
Date:   2016-04-25T23:45:57Z

    Merge remote-tracking branch 'upstream/master'
    merge latest code

commit f37a01e005f3e27ae2be056462d6eb6730933ba5
Author: Kevin Yu <[email protected]>
Date:   2016-04-27T14:15:06Z

    Merge remote-tracking branch 'upstream/master'
    merge upstream/master

commit bb5b01fd3abeea1b03315eccf26762fcc23f80c0
Author: Kevin Yu <[email protected]>
Date:   2016-04-30T23:49:31Z

    Merge remote-tracking branch 'upstream/master'

commit 99027fa9cfd3e968bd5dc3808e8af7f8456e1f2d
Author: Kevin Yu <[email protected]>
Date:   2016-05-04T03:51:36Z

    fix

commit bde5820a181cf84e0879038ad8c4cebac63c1e24
Author: Kevin Yu <[email protected]>
Date:   2016-05-04T03:52:31Z

    Merge remote-tracking branch 'upstream/master'

commit cc8f34006c916d3a5deb50d3def9d6029b514683
Author: Kevin Yu <[email protected]>
Date:   2016-05-04T03:53:53Z

    Merge branch 'testing-jira' into spark-15051

commit 0a348415e708464ba101fb0eafa0306c01f23aee
Author: Kevin Yu <[email protected]>
Date:   2016-05-04T07:54:00Z

    fixing the typeColumn

----


