HyukjinKwon commented on PR #46298:
URL: https://github.com/apache/spark/pull/46298#issuecomment-2089586686

   Doctest:
   
   ```
   File "/home/runner/work/spark/spark-35/python/pyspark/sql/connect/dataframe.py", line 1057, in pyspark.sql.connect.dataframe.DataFrame.union
   Failed example:
       df3.show()
   Exception raised:
       Traceback (most recent call last):
         File "/opt/hostedtoolcache/Python/3.11.9/x64/lib/python3.11/doctest.py", line 1355, in __run
           exec(compile(example.source, filename, "single",
         File "<doctest pyspark.sql.connect.dataframe.DataFrame.union[10]>", line 1, in <module>
           df3.show()
         File "/home/runner/work/spark/spark-35/python/pyspark/sql/connect/dataframe.py", line 996, in show
           print(self._show_string(n, truncate, vertical))
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
         File "/home/runner/work/spark/spark-35/python/pyspark/sql/connect/dataframe.py", line 753, in _show_string
           ).toPandas()
             ^^^^^^^^^^
         File "/home/runner/work/spark/spark-35/python/pyspark/sql/connect/dataframe.py", line 1663, in toPandas
           return self._session.client.to_pandas(query)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
         File "/home/runner/work/spark/spark-35/python/pyspark/sql/connect/client/core.py", line 873, in to_pandas
           table, schema, metrics, observed_metrics, _ = self._execute_and_fetch(
                                                         ^^^^^^^^^^^^^^^^^^^^^^^^
         File "/home/runner/work/spark/spark-35/python/pyspark/sql/connect/client/core.py", line 1283, in _execute_and_fetch
           for response in self._execute_and_fetch_as_iterator(req):
         File "/home/runner/work/spark/spark-35/python/pyspark/sql/connect/client/core.py", line 1264, in _execute_and_fetch_as_iterator
           self._handle_error(error)
         File "/home/runner/work/spark/spark-35/python/pyspark/sql/connect/client/core.py", line 1503, in _handle_error
           self._handle_rpc_error(error)
         File "/home/runner/work/spark/spark-35/python/pyspark/sql/connect/client/core.py", line 1539, in _handle_rpc_error
           raise convert_exception(info, status.message) from None
       pyspark.errors.exceptions.connect.NumberFormatException: [CAST_INVALID_INPUT] The value 'Alice' of the type "STRING" cannot be cast to "BIGINT" because it is malformed. Correct the value as per the syntax, or change its target type. Use `try_cast` to tolerate malformed input and return NULL instead. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error. SQLSTATE: 22018
       JVM stacktrace:
       org.apache.spark.SparkNumberFormatException: [CAST_INVALID_INPUT] The value 'Alice' of the type "STRING" cannot be cast to "BIGINT" because it is malformed. Correct the value as per the syntax, or change its target type. Use `try_cast` to tolerate malformed input and return NULL instead. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error. SQLSTATE: 22018
        at org.apache.spark.sql.errors.QueryExecutionErrors$.invalidInputInCastToNumberError(QueryExecutionErrors.scala:145)
        at org.apache.spark.sql.catalyst.util.UTF8StringUtils$.withException(UTF8StringUtils.scala:51)
        at org.apache.spark.sql.catalyst.util.UTF8StringUtils$.toLongExact(UTF8StringUtils.scala:31)
        at org.apache.spark.sql.catalyst.expressions.Cast.$anonfun$castToLong$2(Cast.scala:770)
        at org.apache.spark.sql.catalyst.expressions.Cast.$anonfun$castToLong$2$adapted(Cast.scala:770)
        at org.apache.spark.sql.catalyst.expressions.Cast.buildCast(Cast.scala:565)
        at org.apache.spark.sql.catalyst.expressions.Cast.$anonfun$castToLong...
   ```
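   For context, the failure mode the error message points at can be reproduced outside the doctest. This is a hedged sketch, not the actual doctest setup: the `df1`/`df2` frames below are illustrative, assuming a local PySpark session. `union()` resolves columns by position, so if two frames list the same columns in different orders, a string value like `'Alice'` ends up under a `BIGINT` column, and the implicit cast fails once `spark.sql.ansi.enabled` is `true`:

   ```python
   # Illustrative sketch (hypothetical df1/df2, local Spark session assumed).
   from pyspark.sql import SparkSession

   spark = SparkSession.builder.master("local[1]").getOrCreate()

   df1 = spark.createDataFrame([(1, "Alice")], ["age", "name"])
   df2 = spark.createDataFrame([("Bob", 2)], ["name", "age"])  # columns swapped

   # union() matches columns by POSITION, so "Bob" lands in the BIGINT `age`
   # slot; with spark.sql.ansi.enabled=true this raises CAST_INVALID_INPUT:
   # df1.union(df2).show()

   # unionByName() resolves columns by NAME and avoids the bad cast:
   df1.unionByName(df2).show()

   spark.stop()
   ```

   The error message also suggests `try_cast` (returns NULL for malformed input) or setting `spark.sql.ansi.enabled` to `false` as workarounds, but for the union-by-position case, aligning the schemas (or using `unionByName`) addresses the root cause.
   
   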


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

