[GitHub] [spark] nija-at commented on a diff in pull request #41138: [SPARK-43457][CONNECT][PYTHON] Augument user agent with OS, Python and Spark versions
nija-at commented on code in PR #41138:
URL: https://github.com/apache/spark/pull/41138#discussion_r1193456123

## python/pyspark/sql/connect/client.py:

@@ -299,7 +301,12 @@ def userAgent(self) -> str:
             raise SparkConnectException(
                 f"'user_agent' parameter should not exceed 2048 characters, found {len(user_agent)} characters."
             )
-        return user_agent
+        return " ".join([

Review Comment:
   This is the IETF standard format for a user agent.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
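The "IETF standard format" the reviewer refers to is the RFC 7231 User-Agent grammar: a space-separated list of `product/version` tokens. A minimal sketch of the idea under discussion; `augmented_user_agent`, its `spark_version` parameter, and the token names are illustrative assumptions, not the actual `pyspark.sql.connect` implementation:

```python
import platform

def augmented_user_agent(user_agent: str, spark_version: str) -> str:
    # Hypothetical helper (not the real pyspark method): append OS,
    # Python, and Spark versions to a caller-supplied user agent as
    # space-separated "product/version" tokens, per RFC 7231.
    if len(user_agent) > 2048:
        # Mirrors the length check visible in the quoted diff.
        raise ValueError(
            f"'user_agent' parameter should not exceed 2048 characters, "
            f"found {len(user_agent)} characters."
        )
    return " ".join([
        user_agent,
        f"spark/{spark_version}",
        f"os/{platform.uname().system.lower()}",
        f"python/{platform.python_version()}",
    ])
```

For example, `augmented_user_agent("my-app/1.0", "3.5.0")` would yield something like `my-app/1.0 spark/3.5.0 os/linux python/3.11.4`, with the OS and Python tokens depending on the host.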
[GitHub] [spark] nija-at commented on a diff in pull request #41138: [SPARK-43457][CONNECT][PYTHON] Augument user agent with OS, Python and Spark versions
nija-at commented on code in PR #41138:
URL: https://github.com/apache/spark/pull/41138#discussion_r1192194721

## python/pyspark/sql/connect/client.py:

@@ -299,7 +300,11 @@ def userAgent(self) -> str:
             raise SparkConnectException(
                 f"'user_agent' parameter should not exceed 2048 characters, found {len(user_agent)} characters."
             )
-        return user_agent
+        return " ".join([
+            user_agent,

Review Comment:
   > In addition, we should put the spark version here as well

   Done