GitHub user biglobster opened a pull request:
https://github.com/apache/spark/pull/14374
Keliang
## What changes were proposed in this pull request?
In Spark 2.0, float literals are parsed as decimals. However, this
introduces a side effect, which is described below.
Before:
spark-sql> select map(0.1,0.01, 0.2,0.033);
Error in query: cannot resolve 'map(CAST(0.1 AS DECIMAL(1,1)),
CAST(0.01 AS DECIMAL(2,2)), CAST(0.2 AS DECIMAL(1,1)), CAST(0.033 AS
DECIMAL(3,3)))' due to data type mismatch: The given values of function map
should all be the same type, but they are [decimal(2,2), decimal(3,3)]; line 1
pos 7
After:
spark-sql> select map(0.1,0.01, 0.2,0.033);
{0.1:0.010,0.2:0.033}
Time taken: 2.448 seconds, Fetched 1 row(s)
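For context, the widened result type here follows the usual rule for
combining two decimal types: keep the larger integral part and the larger
fractional part. Below is a minimal standalone sketch of that rule;
`DecimalSpec` is a stand-in for Catalyst's `DecimalType`, and the real
change belongs in Spark's type-coercion rules, not in application code.

```scala
// Minimal sketch of the decimal-widening rule that makes this query resolve.
// DecimalSpec is a stand-in for Catalyst's DecimalType.
case class DecimalSpec(precision: Int, scale: Int)

object WidenDecimalSketch {
  // Widest type that can hold both operands without loss: keep the larger
  // integral part (precision - scale) and the larger fractional part (scale).
  def widerDecimal(a: DecimalSpec, b: DecimalSpec): DecimalSpec = {
    val integral = math.max(a.precision - a.scale, b.precision - b.scale)
    val scale = math.max(a.scale, b.scale)
    DecimalSpec(integral + scale, scale)
  }

  def main(args: Array[String]): Unit = {
    // The map values above are decimal(2,2) and decimal(3,3); widening them
    // gives decimal(3,3), which is why 0.01 is displayed as 0.010.
    println(widerDecimal(DecimalSpec(2, 2), DecimalSpec(3, 3)))
  }
}
```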
## How was this patch tested?
The change passes run-tests with a new test case covering this scenario.
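A regression test for this behavior could look like the sketch below; the
suite name, placement, and exact expected decimal representation are
assumptions for illustration, not taken verbatim from the patch.

```scala
import org.apache.spark.sql.{QueryTest, Row}
import org.apache.spark.sql.test.SharedSQLContext

// Hypothetical suite name; the patch's actual test may live elsewhere.
class DecimalMapLiteralSuite extends QueryTest with SharedSQLContext {
  test("SPARK-16735: map with decimal literals of different precision/scale") {
    // Values decimal(2,2) and decimal(3,3) should widen to decimal(3,3)
    // instead of failing analysis with a type-mismatch error.
    checkAnswer(
      sql("select map(0.1, 0.01, 0.2, 0.033)"),
      Row(Map(BigDecimal("0.1") -> BigDecimal("0.010"),
              BigDecimal("0.2") -> BigDecimal("0.033"))))
  }
}
```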
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/biglobster/spark keliang
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/14374.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #14374
----
commit c9651fd8c6b3f2e6b5017610b3335b62a7d984ee
Author: keliang <[email protected]>
Date: 2016-07-26T11:32:35Z
Summary: Fail to create a map containing decimal type with literals having
different inferred precisions and scales
JIRA_ID: SPARK-16735
Description:
In Spark 2.0, float literals are parsed as decimals. However, this
introduces a side effect, which is described below.
spark-sql> select map(0.1,0.01, 0.2,0.033);
Error in query: cannot resolve 'map(CAST(0.1 AS DECIMAL(1,1)), CAST(0.01 AS
DECIMAL(2,2)), CAST(0.2 AS DECIMAL(1,1)), CAST(0.033 AS DECIMAL(3,3)))' due to
data type mismatch: The given values of function map should all be the same
type, but they are [decimal(2,2), decimal(3,3)]; line 1 pos 7
Test:
spark-sql> select map(0.1,0.01, 0.2,0.033);
{0.1:0.010,0.2:0.033}
Time taken: 2.448 seconds, Fetched 1 row(s)
commit ba2560e25eebac7cfbc7e3cd65ae94ca2ceae6d4
Author: keliang <[email protected]>
Date: 2016-07-26T13:31:09Z
Summary: Fail to create a map containing decimal type with literals having
different inferred precisions and scales
JIRA_ID: SPARK-16735
Description:
In Spark 2.0, float literals are parsed as decimals. However, this
introduces a side effect, which is described below.
spark-sql> select map(0.1,0.01, 0.2,0.033);
Error in query: cannot resolve 'map(CAST(0.1 AS DECIMAL(1,1)), CAST(0.01 AS
DECIMAL(2,2)), CAST(0.2 AS DECIMAL(1,1)), CAST(0.033 AS DECIMAL(3,3)))' due to
data type mismatch: The given values of function map should all be the same
type, but they are [decimal(2,2), decimal(3,3)]; line 1 pos 7
Test:
spark-sql> select map(0.1,0.01, 0.2,0.033);
{0.1:0.010,0.2:0.033}
Time taken: 2.448 seconds, Fetched 1 row(s)
commit 3fa2153dadb05762899481f8a4416741f0b4190a
Author: keliang <[email protected]>
Date: 2016-07-26T22:20:48Z
Summary: Fail to create a map containing decimal type with literals having
different inferred precisions and scales
JIRA_ID: SPARK-16735
Description: In Spark 2.0, float literals are parsed as decimals.
However, this introduces a side effect, which is described below.
spark-sql> select map(0.1,0.01, 0.2,0.033);
Error in query: cannot resolve 'map(CAST(0.1 AS DECIMAL(1,1)), CAST(0.01 AS
DECIMAL(2,2)), CAST(0.2 AS DECIMAL(1,1)), CAST(0.033 AS DECIMAL(3,3)))' due to
data type mismatch: The given values of function map should all be the same
type, but they are [decimal(2,2), decimal(3,3)]; line 1 pos 7
Test:
spark-sql> select map(0.1,0.01, 0.2,0.033);
{0.1:0.010,0.2:0.033}
Time taken: 2.448 seconds, Fetched 1 row(s)
----