[ https://issues.apache.org/jira/browse/SPARK-11597?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16014993#comment-16014993 ]

Min Shen commented on SPARK-11597:
----------------------------------

Is there any further update on this ticket?
We have recently seen a scenario where we use spark-avro to read Avro 
files containing only a single record whose array field holds 135K 
elements.
While it took only 1-2 seconds for Avro to read the file and convert the 
GenericRecord into a Row, it took RowEncoder ~10 min to convert that Row 
into an InternalRow.

[~cloud_fan], do you think the patch you created will help improve the 
situation here?
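For reference, a minimal sketch of how the slow conversion can be measured in isolation, assuming Spark 2.x on the classpath. The schema shape, field names, and element count are illustrative stand-ins for the Avro file described above, not taken from it; `RowEncoder(schema).toRow` is the catalyst API that performs the Row -> InternalRow conversion in that Spark version:

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.encoders.RowEncoder
import org.apache.spark.sql.types._

object RowEncoderTiming {
  def main(args: Array[String]): Unit = {
    // Hypothetical schema shaped like the problematic record:
    // a single field holding a large array of structs.
    val schema = StructType(Seq(
      StructField("values", ArrayType(StructType(Seq(
        StructField("id", LongType),
        StructField("name", StringType)
      ))))
    ))

    // One Row whose array field holds 135K elements,
    // mimicking the single-record Avro file.
    val bigArray = (1 to 135000).map(i => Row(i.toLong, s"elem-$i"))
    val row = Row(bigArray)

    val encoder = RowEncoder(schema)
    val start = System.nanoTime()
    val internalRow = encoder.toRow(row) // Row -> InternalRow conversion
    val elapsedMs = (System.nanoTime() - start) / 1e6
    println(s"RowEncoder.toRow over ${bigArray.size} elements: $elapsedMs ms")
  }
}
```

This only exercises the encoder path, so any slowdown it shows is independent of spark-avro's read and GenericRecord-to-Row steps.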

> improve performance of array and map encoder
> --------------------------------------------
>
>                 Key: SPARK-11597
>                 URL: https://issues.apache.org/jira/browse/SPARK-11597
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Wenchen Fan
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
