[ 
https://issues.apache.org/jira/browse/SPARK-55224?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot updated SPARK-55224:
-----------------------------------
    Labels: pull-request-available  (was: )

> Use Spark DataType as ground truth in Pandas-Arrow serialization
> ----------------------------------------------------------------
>
>                 Key: SPARK-55224
>                 URL: https://issues.apache.org/jira/browse/SPARK-55224
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 4.2.0
>            Reporter: Yicong Huang
>            Priority: Major
>              Labels: pull-request-available
>
> Currently, PySpark's Pandas serializers pass both {{spark_type}} and 
> {{arrow_type}} through the serialization pipeline, which creates confusion 
> about which type is authoritative. Analysis shows that {{spark_type}} (the 
> user's UDF return type) is always the source, and {{arrow_type}} is derived 
> from it via {{to_arrow_type()}}, which is lightweight.
>
> This ticket proposes refactoring {{_create_batch}} and {{_create_array}} to 
> take only {{spark_type}} and derive {{arrow_type}} internally.
> {code:python}
> # Before: caller pre-computes arrow_type
> arrow_return_type = to_arrow_type(return_type, ...)
> yield (result, arrow_return_type, return_type)
>
> # After: only pass spark_type
> yield (result, return_type)
> {code}
> {code:python}
> # Before: _create_array takes arrow_type as primary
> def _create_array(self, series, arrow_type, spark_type=None, arrow_cast=False):
>     dt = spark_type or from_arrow_type(arrow_type)  # confusing fallback
>
> # After: spark_type is ground truth
> def _create_array(self, series, spark_type, *, arrow_cast=False, prefers_large_types=False):
>     arrow_type = to_arrow_type(spark_type, ...) if spark_type else None
> {code}
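> The "single source of truth" pattern above can be sketched standalone. The 
> following is a minimal illustration, not the actual PySpark code: the 
> {{_SPARK_TO_ARROW}} mapping, {{to_arrow_type}}, and {{create_array}} names 
> here are hypothetical stand-ins for the real helpers.

```python
# Sketch of the "spark_type is ground truth" refactor (hypothetical names,
# not the real PySpark implementation).

# Hypothetical stand-in for the Spark -> Arrow type mapping that the real
# to_arrow_type() performs.
_SPARK_TO_ARROW = {
    "integer": "int32",
    "long": "int64",
    "double": "float64",
    "string": "string",
}


def to_arrow_type(spark_type):
    # Cheap, deterministic derivation: because this lookup is lightweight,
    # the serializer can derive arrow_type internally instead of having
    # every caller pre-compute and thread it through the pipeline.
    return _SPARK_TO_ARROW[spark_type]


def create_array(values, spark_type, *, arrow_cast=False):
    # spark_type is the only authoritative type; arrow_type is derived here,
    # so the two can never disagree.
    arrow_type = to_arrow_type(spark_type)
    return {"values": list(values), "arrow_type": arrow_type}


batch = create_array([1, 2, 3], "long")
```

> With the derivation moved inside, callers only ever hand over the Spark 
> type, which removes the "which type wins?" ambiguity the ticket describes.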



--
This message was sent by Atlassian Jira
(v8.20.10#820010)