[ https://issues.apache.org/jira/browse/SPARK-2890?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust reopened SPARK-2890:
-------------------------------------

      Assignee: Michael Armbrust

There are cases where this actually makes it impossible to read data: for
example, when you are trying to read a schema that becomes ambiguous under case
insensitive resolution. So I think we should do something here.
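
A sketch of the kind of data that becomes unreadable (the file path and field
names here are hypothetical, not taken from a real report):

    // Parquet data written with two fields that differ only by case:
    //   root
    //    |-- ID: integer
    //    |-- id: string
    // Under case insensitive resolution both fields resolve to the same
    // name, so neither column can be referenced unambiguously.
    val data = sqlContext.parquetFile("/path/to/mixed-case.parquet")
    data.registerTempTable("t")
    sqlContext.sql("SELECT id FROM t").collect()  // ambiguous reference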

> Spark SQL should allow SELECT with duplicated columns
> -----------------------------------------------------
>
>                 Key: SPARK-2890
>                 URL: https://issues.apache.org/jira/browse/SPARK-2890
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0
>            Reporter: Jianshi Huang
>            Assignee: Michael Armbrust
>
> Spark reports a java.lang.IllegalArgumentException with the following message:
> java.lang.IllegalArgumentException: requirement failed: Found fields with the same name.
>         at scala.Predef$.require(Predef.scala:233)
>         at org.apache.spark.sql.catalyst.types.StructType.<init>(dataTypes.scala:317)
>         at org.apache.spark.sql.catalyst.types.StructType$.fromAttributes(dataTypes.scala:310)
>         at org.apache.spark.sql.parquet.ParquetTypesConverter$.convertToString(ParquetTypes.scala:306)
>         at org.apache.spark.sql.parquet.ParquetTableScan.execute(ParquetTableOperations.scala:83)
>         at org.apache.spark.sql.execution.Filter.execute(basicOperators.scala:57)
>         at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:85)
>         at org.apache.spark.sql.SchemaRDD.collect(SchemaRDD.scala:433)
> After some trial and error, it appears to be caused by duplicated columns in
> my SELECT clause; a minimal sketch of the failing pattern follows.
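> (The table name t and column a below are placeholders; any Parquet-backed
> table should reproduce it.)
>     // Duplicating a column in the projection fails once the Parquet scan
>     // tries to serialize the result schema as a StructType.
>     val rdd = sqlContext.parquetFile("/path/to/data.parquet")
>     rdd.registerTempTable("t")
>     sqlContext.sql("SELECT a, a FROM t WHERE a > 0").collect()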
> I made the duplication on purpose so that my code parses correctly. I think
> we should allow users to specify duplicated columns as return values.
> Jianshi



