[
https://issues.apache.org/jira/browse/SPARK-33001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17231703#comment-17231703
]
Danny Lee commented on SPARK-33001:
-----------------------------------
[~xorz57] I'm getting the same error, right before Spark bails and throws:
{code}
20/11/13 12:41:51 ERROR MicroBatchExecution: Query [id =
32320bc7-d7ba-49b4-8a56-1166a4f2d6db, runId =
d7cc93c2-41ef-4765-aecd-9cd453c25905] terminated with error
org.apache.spark.SparkException: Job 1 cancelled because SparkContext was shut
down
{code}
It's related to line 97 in this file, but there isn't much documentation.
Executor metrics appear to be relatively new; even [~wypoon] (the last committer)
admits, "Executor metrics are new in Spark 3.0. They lack documentation."
I'll post back if I find a solution.
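For anyone else hitting this: the warning appears to come from ProcfsMetricsGetter's page-size computation, which (as far as I can tell from the Spark 3.0.x source) shells out to {{getconf PAGESIZE}} and reads from /proc, neither of which exists on Windows. When that fails, Spark logs the WARN once and stops reporting ProcessTree metrics. A minimal sketch of that gating logic; the names ({{procfsMetricsSupported}}, {{tryPageSize}}) are illustrative, not Spark's actual API:
{code:scala}
import java.nio.file.{Files, Paths}
import scala.util.Try

// Hedged sketch of the availability check ProcfsMetricsGetter appears to
// perform (Spark 3.0.x). Names here are illustrative, not Spark's API.
object ProcfsCheckSketch {

  // ProcessTree metrics need both a readable /proc and a positive page size.
  // If either is missing (as on Windows), reporting is disabled.
  def procfsMetricsSupported(procDirExists: Boolean, pageSize: Option[Long]): Boolean =
    procDirExists && pageSize.exists(_ > 0)

  // Illustrative probe: run `getconf PAGESIZE` and parse the result;
  // returns None on platforms where getconf is absent or the call fails.
  def tryPageSize(): Option[Long] =
    Try {
      val p = new ProcessBuilder("getconf", "PAGESIZE").start()
      val out = scala.io.Source.fromInputStream(p.getInputStream).mkString.trim
      p.waitFor()
      out.toLong
    }.toOption

  def procExists: Boolean = Files.isDirectory(Paths.get("/proc"))
}
{code}
On Windows the warning itself seems harmless: you just lose the ProcessTree executor metrics, and the job should otherwise run normally.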
> Why am I receiving this warning?
> --------------------------------
>
> Key: SPARK-33001
> URL: https://issues.apache.org/jira/browse/SPARK-33001
> Project: Spark
> Issue Type: Question
> Components: Spark Core
> Affects Versions: 3.0.1
> Reporter: George Fotopoulos
> Priority: Major
>
> I am running Apache Spark Core using Scala 2.12.12 on IntelliJ IDEA 2020.2
> with Docker 2.3.0.5
> I am running Windows 10 build 2004
> Can somebody explain to me why I am receiving this warning and what I can do
> about it?
> I tried googling this warning, but all I found was people asking about it and
> no answers.
> [screenshot|https://user-images.githubusercontent.com/1548352/94319642-c8102c80-ff93-11ea-9fea-f58de8da2268.png]
> {code:scala}
> WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a
> result reporting of ProcessTree metrics is stopped
> {code}
> Thanks in advance!
--
This message was sent by Atlassian Jira
(v8.3.4#803005)