[ https://issues.apache.org/jira/browse/SPARK-16041?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiao Li updated SPARK-16041:
----------------------------
Description:
Duplicate columns should not be allowed in `partitionBy`, `bucketBy`, or `sortBy` in
DataFrameWriter. Duplicate columns can lead to unpredictable results, for example a
resolution failure during analysis.
We should detect the duplicates and throw exceptions with appropriate messages.
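A minimal reproduction sketch, assuming a local SparkSession and a throwaway output path and table name (all just for illustration); today the duplicated column names are accepted by the builder and any failure only shows up later:
{code:scala}
import org.apache.spark.sql.SparkSession

// Hypothetical setup; the app name, path, and table name are placeholders.
val spark = SparkSession.builder().master("local[*]").appName("dup-cols").getOrCreate()
import spark.implicits._

val df = Seq((1, "a"), (2, "b")).toDF("i", "j")

// Duplicate partition column "i": DataFrameWriter does not reject this eagerly,
// so the behavior is only decided later during analysis/resolution.
df.write.partitionBy("i", "i").mode("overwrite").parquet("/tmp/dup_cols_demo")

// bucketBy and sortBy take the same kind of column lists and have the same gap
// (bucketed writes must go through saveAsTable).
df.write.bucketBy(8, "i", "i").sortBy("j", "j").saveAsTable("dup_cols_bucketed")
{code}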
was:
Duplicate columns are not allowed in `partitionBy`, `blockBy`, `sortBy` in .
The duplicate columns could cause unpredictable results. For example, the
resolution failure.
We should detect the duplicates and issue exceptions with appropriate messages.
> Disallow Duplicate Columns in `partitionBy`, `bucketBy` and `sortBy` in
> DataFrameWriter
> --------------------------------------------------------------------------------------
>
> Key: SPARK-16041
> URL: https://issues.apache.org/jira/browse/SPARK-16041
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Xiao Li
>
> Duplicate columns should not be allowed in `partitionBy`, `bucketBy`, or `sortBy` in
> DataFrameWriter. Duplicate columns can lead to unpredictable results, for
> example a resolution failure during analysis.
> We should detect the duplicates and throw exceptions with appropriate
> messages.
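> A minimal sketch of the eager check this asks for (not the actual patch; the helper name and the exception type are placeholders):
> {code:scala}
> // Detect duplicate names in the column list passed to partitionBy/bucketBy/sortBy
> // and fail immediately with a message that names the offending columns.
> def assertNoDuplicateColumns(columns: Seq[String], api: String): Unit = {
>   val duplicates = columns.groupBy(identity).collect { case (c, occ) if occ.size > 1 => c }
>   if (duplicates.nonEmpty) {
>     // The real fix would presumably throw an AnalysisException; a plain
>     // exception keeps this sketch self-contained.
>     throw new IllegalArgumentException(
>       s"Found duplicate column(s) in $api: ${duplicates.mkString(", ")}")
>   }
> }
>
> // For example, partitionBy("i", "i") would now fail fast with a clear message:
> assertNoDuplicateColumns(Seq("i", "i"), "partitionBy")
> {code}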