[ https://issues.apache.org/jira/browse/SPARK-32500?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17173633#comment-17173633 ]
JinxinTang commented on SPARK-32500:
------------------------------------

I have found the root cause: `org.apache.spark.SparkContext#localProperties` is a thread-local. The `spark.job.description` property is set by the stream execution thread, but when we save from Python, the call goes through the Py4J server thread, which belongs to the main thread group, so `org.apache.spark.sql.execution.ui.SparkListenerSQLExecutionStart$#apply` cannot read the `desc` property from the Spark context. When we operate from Scala there is no problem, because everything runs on the stream execution thread. A minimal sketch illustrating this behaviour is included after the quoted issue below.

> Query and Batch Id not set for Structured Streaming Jobs in case of
> ForeachBatch in PySpark
> -------------------------------------------------------------------------------------------
>
>                 Key: SPARK-32500
>                 URL: https://issues.apache.org/jira/browse/SPARK-32500
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, Structured Streaming
>    Affects Versions: 2.4.6
>            Reporter: Abhishek Dixit
>            Priority: Major
>         Attachments: Screen Shot 2020-07-26 at 6.50.39 PM.png, Screen Shot
>                      2020-07-30 at 9.04.21 PM.png, image-2020-08-01-10-21-51-246.png
>
>
> Query Id and Batch Id information is not available for jobs started by a
> structured streaming query when the _foreachBatch_ API is used in PySpark.
>
> This happens only with foreachBatch in PySpark. ForeachBatch in Scala works
> fine, and other structured streaming sinks in PySpark also work fine. I am
> attaching a screenshot of the jobs pages.
>
> I think the job group is not set properly when _foreachBatch_ is used via
> PySpark. I have a framework that depends on the _queryId_ and _batchId_
> information available in the job properties, so my framework doesn't work
> for the PySpark foreachBatch use case.
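Below is a minimal, self-contained Scala sketch of the thread-local behaviour described in the comment above. It is not Spark's internal code; the object name, thread names, and the property value are made up for illustration. It shows that a property set with `setLocalProperty` on one thread is not visible from another thread that was not spawned by it, which is the situation of the Py4J server thread versus the stream execution thread.

```scala
import org.apache.spark.sql.SparkSession

object LocalPropertyDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("local-property-demo")
      .getOrCreate()
    val sc = spark.sparkContext

    // Plays the role of the stream execution thread: it sets the job description
    // and can read it back, because local properties are per-thread.
    val streamLikeThread = new Thread(new Runnable {
      override def run(): Unit = {
        sc.setLocalProperty("spark.job.description", "illustrative job description")
        println(s"stream-like thread sees: ${sc.getLocalProperty("spark.job.description")}")
      }
    }, "stream-execution-like")
    streamLikeThread.start()
    streamLikeThread.join()

    // Plays the role of the Py4J server thread: it was started from the main
    // thread, not by the stream-like thread, so the InheritableThreadLocal
    // backing SparkContext#localProperties holds nothing for it and this prints null.
    val py4jLikeThread = new Thread(new Runnable {
      override def run(): Unit = {
        println(s"py4j-like thread sees: ${sc.getLocalProperty("spark.job.description")}")
      }
    }, "py4j-server-like")
    py4jLikeThread.start()
    py4jLikeThread.join()

    spark.stop()
  }
}
```

In the Scala foreachBatch case the batch function runs on the stream execution thread itself, so the description is visible there; in the PySpark case the callback is dispatched on a Py4J server thread, which matches the second thread in the sketch.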