[
https://issues.apache.org/jira/browse/SPARK-34751?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17302573#comment-17302573
]
Takeshi Yamamuro commented on SPARK-34751:
------------------------------------------
Could you try a newer Spark version, e.g., 2.4.7, 3.0.2, or 3.1.1?
> Parquet with invalid chars on column name reads double as null when a clean
> schema is applied
> ---------------------------------------------------------------------------------------------
>
> Key: SPARK-34751
> URL: https://issues.apache.org/jira/browse/SPARK-34751
> Project: Spark
> Issue Type: Bug
> Components: Input/Output
> Affects Versions: 2.4.3
> Environment: Pyspark 2.4.3
> AWS Glue Dev Endpoint EMR
> Reporter: Nivas Umapathy
> Priority: Major
> Fix For: 2.4.8
>
> Attachments: invalid_columns_double.parquet
>
>
> I have a parquet file whose column names contain invalid characters (see
> SPARK-27442: https://issues.apache.org/jira/browse/SPARK-27442 for reference).
> The file is attached to this ticket.
> I tried to load this file with
> {{df = glue_context.read.parquet('invalid_columns_double.parquet')}}
> {{df = df.withColumnRenamed('COL 1', 'COL_1')}}
> {{df = df.withColumnRenamed('COL,2', 'COL_2')}}
> {{df = df.withColumnRenamed('COL;3', 'COL_3')}}
> and so on.
> Now if I call
> {{df.show()}}
> it throws this exception, which still points to the old column name:
> {{pyspark.sql.utils.AnalysisException: 'Attribute name "COL 1" contains
> invalid character(s) among " ,;{}()\n\t=". Please use alias to rename it.;'}}
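> 
> For completeness, here are the same steps as a single runnable snippet (a
> plain SparkSession stands in for glue_context here, and the path is assumed
> to point at the attached file):
> {code:python}
> from pyspark.sql import SparkSession
> 
> spark = SparkSession.builder.getOrCreate()
> 
> # The attached file has columns named "COL 1", "COL,2", "COL;3", ...
> df = spark.read.parquet('invalid_columns_double.parquet')
> df = df.withColumnRenamed('COL 1', 'COL_1')
> df = df.withColumnRenamed('COL,2', 'COL_2')
> df = df.withColumnRenamed('COL;3', 'COL_3')
> 
> # On 2.4.3 this still raises the AnalysisException quoted above,
> # complaining about the original column name "COL 1".
> df.show()
> {code}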
>
> When I read about it in some blogs, there was a suggestion to re-read the same
> parquet with the new schema applied. So I did
> {{df = glue_context.read.schema(df.schema).parquet('invalid_columns_double.parquet')}}
>
> and it works, but all the data in the DataFrame is null. The same approach
> works correctly for String datatypes.
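> 
> A possible workaround (an illustrative sketch only; it assumes pandas and
> pyarrow are available on the endpoint, that the file is small enough to
> rewrite in memory, and the output path is just an example) would be to
> rewrite the file with clean column names outside of Spark and read the
> cleaned copy instead:
> {code:python}
> import pandas as pd  # requires pyarrow (or fastparquet) for parquet support
> 
> # Load the file outside of Spark, so the invalid-character check never runs.
> pdf = pd.read_parquet('invalid_columns_double.parquet')
> 
> # Replace the characters Spark rejects (" ,;{}()\n\t=") in every column name.
> pdf.columns = [''.join('_' if ch in ' ,;{}()\n\t=' else ch for ch in c)
>                for c in pdf.columns]
> 
> # Write a cleaned copy (hypothetical path) and read that with Spark instead.
> pdf.to_parquet('clean_columns_double.parquet', index=False)
> df = glue_context.read.parquet('clean_columns_double.parquet')
> df.show()
> {code}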
>