[
https://issues.apache.org/jira/browse/SPARK-54210?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18038391#comment-18038391
]
Cheng Pan commented on SPARK-54210:
-----------------------------------
[~dongjoon] I might have overlooked this previously; the current user agent of
the Connect JDBC driver is actually fine. SPARK-45485 always appends the
scala/jvm/os info to the user-provided agent string, so there is no need to
construct it again in the driver.
I checked the Spark UI. The PySpark user agent is:
client_type: "_SPARK_CONNECT_PYTHON spark/4.2.0.dev0 os/darwin python/3.11.9"
The spark-shell (Connect Shell) user agent is:
client_type: "Spark Connect REPL spark/4.2.0-SNAPSHOT scala/2.13.17 jvm/17.0.13+11-LTS os/darwin"
The Connect JDBC driver user agent is:
client_type: "Spark Connect JDBC spark/4.2.0-SNAPSHOT scala/2.13.17 jvm/17.0.13+11-LTS os/darwin"
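For reference, a minimal sketch of how a caller could still supply its own agent string with the Scala Connect client (assuming the connection string's user_agent option; "my-app" is a hypothetical name), with the client appending the suffix per SPARK-45485:

    import org.apache.spark.sql.SparkSession

    // Pass a custom agent via the (assumed) user_agent connection-string option;
    // the Connect client appends the spark/scala/jvm/os suffix itself, so the
    // server would report something like:
    //   client_type: "my-app spark/4.2.0-SNAPSHOT scala/2.13.17 jvm/... os/darwin"
    val spark = SparkSession
      .builder()
      .remote("sc://localhost:15002/;user_agent=my-app")
      .getOrCreate()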
I think this is fine, so I will close this issue as Not A Problem.
> Canonicalize user agent of Connect JDBC driver
> ----------------------------------------------
>
> Key: SPARK-54210
> URL: https://issues.apache.org/jira/browse/SPARK-54210
> Project: Spark
> Issue Type: Sub-task
> Components: Connect
> Affects Versions: 4.1.0
> Reporter: Cheng Pan
> Priority: Major
>