dongjoon-hyun commented on PR #43184:
URL: https://github.com/apache/spark/pull/43184#issuecomment-1741849856
You are right that it should work that way. However, the technical
problem is that the Apache PySpark code doesn't properly guard this with a
conditional check in SparkSession. Here is the error message when neither the
pandas nor the packaging package is installed.
```
$ bin/pyspark
Python 3.12.0rc2 (main, Sep 21 2023, 21:22:29) [Clang 14.0.0 (clang-1400.0.28.1)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
  File "/Users/dongjoon/PRS/SPARK-44120/python/pyspark/shell.py", line 31, in <module>
    import pyspark
  File "/Users/dongjoon/PRS/SPARK-44120/python/pyspark/__init__.py", line 148, in <module>
    from pyspark.sql import SQLContext, HiveContext, Row  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/dongjoon/PRS/SPARK-44120/python/pyspark/sql/__init__.py", line 43, in <module>
    from pyspark.sql.context import SQLContext, HiveContext, UDFRegistration, UDTFRegistration
  File "/Users/dongjoon/PRS/SPARK-44120/python/pyspark/sql/context.py", line 39, in <module>
    from pyspark.sql.session import _monkey_patch_RDD, SparkSession
  File "/Users/dongjoon/PRS/SPARK-44120/python/pyspark/sql/session.py", line 47, in <module>
    from pyspark.sql.dataframe import DataFrame
  File "/Users/dongjoon/PRS/SPARK-44120/python/pyspark/sql/dataframe.py", line 64, in <module>
    from pyspark.sql.pandas.conversion import PandasConversionMixin
  File "/Users/dongjoon/PRS/SPARK-44120/python/pyspark/sql/pandas/conversion.py", line 29, in <module>
    from packaging.version import Version
ModuleNotFoundError: No module named 'packaging'
```
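For illustration, a minimal sketch of the kind of conditional guard that would avoid this hard failure at `import pyspark` time. This is not the actual PySpark code or the proposed patch; the helper name `_check_pandas_version` and the minimum version string are hypothetical, chosen only to show the pattern of deferring the error until the optional dependency is actually needed:

```python
# Illustrative sketch only (not the actual patch): guard the optional import
# so that merely importing the package does not fail when 'packaging' is
# absent, and raise a clear error only when the check is actually used.
try:
    from packaging.version import Version  # optional dependency

    _has_packaging = True
except ImportError:
    Version = None  # type: ignore[assignment]
    _has_packaging = False


def _check_pandas_version(installed: str, minimum: str = "1.0.5") -> bool:
    """Return True if the installed pandas version meets the minimum.

    The minimum version here is a placeholder, not PySpark's real
    requirement. Raises ImportError lazily, with an actionable message,
    instead of failing during module import.
    """
    if not _has_packaging:
        raise ImportError(
            "The 'packaging' package is required for pandas version checks; "
            "install it with `pip install packaging`."
        )
    return Version(installed) >= Version(minimum)
```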
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]