[jira] [Updated] (SPARK-5738) Reuse mutable row for each record at jsonStringToRow
[ https://issues.apache.org/jira/browse/SPARK-5738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-5738:
-----------------------------
    Assignee: Yanbo Liang

> Reuse mutable row for each record at jsonStringToRow
> ----------------------------------------------------
>
>                 Key: SPARK-5738
>                 URL: https://issues.apache.org/jira/browse/SPARK-5738
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 1.3.0
>            Reporter: Yanbo Liang
>            Assignee: Yanbo Liang
>
> Other table-scan operations, such as ParquetTableScan and HiveTableScan, use
> a reusable mutable row during serialization to reduce garbage. We should apply
> the same optimization to JSONRelation#buildScan().
> When serializing a JSON string to a row, reuse one mutable row for each
> record and for each inner nested structure, instead of creating a new row
> every time.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-5738) Reuse mutable row for each record at jsonStringToRow
[ https://issues.apache.org/jira/browse/SPARK-5738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yin Huai updated SPARK-5738:
----------------------------
    Summary: Reuse mutable row for each record at jsonStringToRow  (was: [SQL] Reuse mutable row for each record at jsonStringToRow)

> Reuse mutable row for each record at jsonStringToRow
> ----------------------------------------------------
>
>                 Key: SPARK-5738
>                 URL: https://issues.apache.org/jira/browse/SPARK-5738
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 1.3.0
>            Reporter: Yanbo Liang
>
> Other table-scan operations, such as ParquetTableScan and HiveTableScan, use
> a reusable mutable row during serialization to reduce garbage. We should apply
> the same optimization to JSONRelation#buildScan().
> When serializing a JSON string to a row, reuse one mutable row for each
> record and for each inner nested structure, instead of creating a new row
> every time.
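The optimization the ticket describes can be sketched as follows. This is a minimal, self-contained illustration of the row-reuse pattern, not Spark's actual implementation: the names `MutableRow`, `parseInto`, and `scan` are hypothetical stand-ins for Spark's internal mutable-row and JSON-scan machinery.

```scala
// Hypothetical sketch of per-record mutable-row reuse (illustrative names,
// not Spark's real API): allocate one row per scan and overwrite it for
// every record, instead of allocating a fresh row object per record.

final class MutableRow(arity: Int) {
  private val values = new Array[Any](arity)
  def update(i: Int, v: Any): Unit = values(i) = v
  def apply(i: Int): Any = values(i)
}

object JsonScanSketch {
  // Overwrite the shared row with one record's values; missing fields
  // become null so stale values from the previous record never leak.
  def parseInto(record: Map[String, Any],
                fields: Seq[String],
                row: MutableRow): MutableRow = {
    var i = 0
    while (i < fields.length) {
      row(i) = record.getOrElse(fields(i), null)
      i += 1
    }
    row
  }

  def scan(records: Iterator[Map[String, Any]],
           fields: Seq[String]): Iterator[MutableRow] = {
    val row = new MutableRow(fields.length) // one allocation per scan
    records.map(parseInto(_, fields, row))  // every element is the SAME row
  }
}
```

The trade-off is the usual one for this pattern: because every element of the iterator is the same object, a downstream consumer that wants to retain a row past the next `next()` call must copy it first.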