dongjoon-hyun commented on code in PR #47402:
URL: https://github.com/apache/spark/pull/47402#discussion_r1683164754
##########
bin/spark-shell:
##########
@@ -34,7 +34,7 @@ fi
export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]
-Scala REPL options:
+Scala REPL options (Spark Classic only):
Review Comment:
This seems to be the first `Spark Classic` wording outside `python`.
```
$ git grep 'Spark Classic'
python/pyspark/errors/error-conditions.json:    "Calling property or member '<member>' is not supported in PySpark Classic, please use Spark Connect instead."
python/pyspark/sql/classic/__init__.py:"""Spark Classic specific"""
python/pyspark/sql/column.py:    # Spark Classic Column by default. This is NOT an API, and NOT supposed to
python/pyspark/sql/connect/expressions.py:    # Column<'CAST(a AS BIGINT)'> <- Spark Classic
python/pyspark/sql/dataframe.py:    # Spark Classic DataFrame by default. This is NOT an API, and NOT supposed to
python/pyspark/sql/session.py:    In Spark Classic, a temporary view referenced in `spark.sql` is resolved immediately,
python/pyspark/sql/session.py:    In Spark Classic, a temporary view referenced in `spark.table` is resolved immediately,
python/pyspark/sql/tests/connect/test_connect_dataframe_property.py:    # Using this temp env to properly invoke mapInPandas in PySpark Classic.
python/pyspark/sql/tests/connect/test_parity_types.py:    @unittest.skip("This test is dedicated for PySpark Classic.")
python/pyspark/sql/tests/connect/test_parity_types.py:    @unittest.skip("This test is dedicated for PySpark Classic.")
```
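To check the reviewer's claim directly, `git grep` accepts a negative pathspec that excludes a directory, so occurrences outside `python/` can be listed in one command. The sketch below demonstrates `':!python'` in a throwaway repository (the file names and contents are made up for illustration, not taken from the Spark tree); against a real apache/spark checkout the equivalent command would be `git grep -n 'Spark Classic' -- ':!python'`.

```shell
set -e
# Build a scratch repo with one match inside python/ and one outside it.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
mkdir -p python bin
echo 'Spark Classic specific' > python/note.txt
echo 'Scala REPL options (Spark Classic only):' > bin/spark-shell.txt
git add -A
git -c user.email=a@b -c user.name=a commit -qm init
# ':!python' is a negative pathspec: it excludes the python/ tree,
# so only matches outside python/ are reported.
git grep -l 'Spark Classic' -- ':!python'
```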
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]