HyukjinKwon commented on code in PR #39224:
URL: https://github.com/apache/spark/pull/39224#discussion_r1057436437
##########
python/pyspark/sql/catalog.py:
##########
@@ -273,15 +273,15 @@ def databaseExists(self, dbName: str) -> bool:
>>> spark.catalog.databaseExists("test_new_database")
False
- >>> _ = spark.sql("CREATE DATABASE test_new_database")
+ >>> _ = spark.sql("CREATE DATABASE test_new_database").collect()
Review Comment:
Hmmm ... this is an orthogonal issue, but just to make sure we don't forget: I
think we should make the `sql` method in Spark Connect analyze the query on
every call. Otherwise, we can't make it compatible with the existing PySpark
usage. I changed the existing doctest here for now to make the tests pass, but
ideally we should change this back.
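To illustrate the eager-vs-lazy distinction being discussed, here is a toy model (plain Python, not actual Spark code; the class names are hypothetical): classic PySpark analyzes and runs a command like `CREATE DATABASE` at the moment `spark.sql()` is called, while a lazy client would defer the side effect until an action such as `collect()` is invoked, which is why the doctest above needed the extra `.collect()`.

```python
# Toy model (NOT real Spark internals) of eager vs. lazy sql() execution.

class EagerSession:
    """Mimics classic PySpark: sql() runs the command immediately."""
    def __init__(self):
        self.databases = set()

    def sql(self, query):
        # Side effect happens at call time.
        if query.startswith("CREATE DATABASE "):
            self.databases.add(query.removeprefix("CREATE DATABASE "))
        return []  # result rows


class LazyResult:
    """Holds an unexecuted plan; collect() triggers execution."""
    def __init__(self, session, query):
        self.session, self.query = session, query

    def collect(self):
        if self.query.startswith("CREATE DATABASE "):
            self.session.databases.add(self.query.removeprefix("CREATE DATABASE "))
        return []


class LazySession:
    """Mimics a lazy client: sql() only builds a plan."""
    def __init__(self):
        self.databases = set()

    def sql(self, query):
        return LazyResult(self, query)


eager = EagerSession()
_ = eager.sql("CREATE DATABASE test_new_database")
print("test_new_database" in eager.databases)  # True: ran at call time

lazy = LazySession()
_ = lazy.sql("CREATE DATABASE test_new_database")
print("test_new_database" in lazy.databases)   # False: nothing ran yet
_ = lazy.sql("CREATE DATABASE test_new_database").collect()
print("test_new_database" in lazy.databases)   # True: collect() ran it
```

Under the lazy model, the original doctest line `_ = spark.sql("CREATE DATABASE test_new_database")` would silently do nothing, which is the incompatibility the comment points out.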
##########
python/pyspark/sql/catalog.py:
##########
@@ -273,15 +273,15 @@ def databaseExists(self, dbName: str) -> bool:
>>> spark.catalog.databaseExists("test_new_database")
False
- >>> _ = spark.sql("CREATE DATABASE test_new_database")
+ >>> _ = spark.sql("CREATE DATABASE test_new_database").collect()
Review Comment:
cc @hvanhovell @grundprinzip FYI
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]