HeartSaVioR edited a comment on issue #23854: [SPARK-22000][SQL] Use 
String.valueOf in generated code to assign value to String type of field in 
Java Bean Encoder
URL: https://github.com/apache/spark/pull/23854#issuecomment-466269976
 
 
   Sure, I think that sounds much better, but from the viewpoint of a newcomer to Spark SQL it doesn't seem trivial.
   
   > Probably, it seems we can add cast exprs before collecting data?
   > 
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala#L3360
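
   Just to check I'm reading the suggestion right, the rough shape I have in mind at the logical-plan level is something like the sketch below. `castBeforeDeserialize` and `expectedTypes` are made-up names for illustration only, not existing Spark code:

   ```scala
   import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, Cast}
   import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, Project}
   import org.apache.spark.sql.types.DataType

   object CastSketch {
     // Hypothetical sketch: before the deserializer runs, project the child's
     // output through casts so that each column already carries the DataType
     // the deserializer expects.
     def castBeforeDeserialize(
         child: LogicalPlan,
         expectedTypes: Map[Attribute, DataType]): LogicalPlan = {
       val projectList = child.output.map { attr =>
         expectedTypes.get(attr) match {
           // Cast only when the desired type differs, keeping the original name.
           case Some(dt) if dt != attr.dataType => Alias(Cast(attr, dt), attr.name)()
           case _ => attr
         }
       }
       Project(projectList, child)
     }
   }
   ```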
   
   I guess we also need to handle `Dataset.rdd`, and both paths look like they go through `CatalystSerde.deserialize` to create a `DeserializeToObject`. So I looked into how to deal with the types of the logical plan and the deserializer, but the `deserializer` exposes its field types as Java types (like `ObjectType(java.lang.String)` for a String field), which we would need to match and reconcile against the Spark SQL type. Does Spark have some existing code to match Spark SQL `DataType`s against Java `ObjectType`s, or even to handle the cast?
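
   To illustrate what I mean by matching, here is a minimal sketch; `maybeCastStringField` is a name I made up for this comment, not an existing Spark helper:

   ```scala
   import org.apache.spark.sql.catalyst.expressions.{Cast, Expression}
   import org.apache.spark.sql.types.{DataType, ObjectType, StringType}

   object StringFieldCastSketch {
     // Hypothetical sketch: the deserializer reports the bean field's type as an
     // ObjectType wrapping the Java class, so deciding whether a cast is needed
     // means comparing that class against the Catalyst type of the input column.
     def maybeCastStringField(input: Expression, fieldType: DataType): Expression =
       fieldType match {
         case ObjectType(cls) if cls == classOf[java.lang.String] && input.dataType != StringType =>
           Cast(input, StringType) // coerce to StringType before the value reaches the bean setter
         case _ =>
           input
       }
   }
   ```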
   

