Jason White created SPARK-22605:
-----------------------------------
Summary: OutputMetrics empty for DataFrame writes
Key: SPARK-22605
URL: https://issues.apache.org/jira/browse/SPARK-22605
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 2.2.0
Reporter: Jason White
Priority: Minor
I am trying to use the SparkListener interface to hook up custom monitoring
for some of our critical jobs. Among the first metrics I would like are an
output row count and an output byte size. I'm using PySpark and the Py4J
interface to implement the listener.
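For reference, the listener is wired up roughly as follows. This is a minimal sketch of the Py4J pattern, not an official Spark API: the class name, the catch-all stub for the other SparkListenerInterface callbacks, and the callback-server setup are my own approximation.

{code:python}
from py4j.java_gateway import CallbackServerParameters
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext

# Py4J needs a callback server so the JVM can invoke Python methods.
sc._gateway.start_callback_server(CallbackServerParameters())


class OutputMetricsListener(object):
    """Python object implementing SparkListenerInterface via Py4J."""

    def onTaskEnd(self, taskEnd):
        metrics = taskEnd.taskMetrics()
        inp = metrics.inputMetrics()
        out = metrics.outputMetrics()
        print("read:    %d records / %d bytes"
              % (inp.recordsRead(), inp.bytesRead()))
        print("written: %d records / %d bytes"
              % (out.recordsWritten(), out.bytesWritten()))

    def __getattr__(self, name):
        # No-op stub for the many other SparkListenerInterface callbacks;
        # __getattr__ only fires for methods not defined above.
        return lambda *args: None

    class Java:
        implements = ["org.apache.spark.scheduler.SparkListenerInterface"]


sc._jsc.sc().addSparkListener(OutputMetricsListener())
{code}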
I am able to see the recordsRead and bytesRead metrics via the
taskEnd.taskMetrics().inputMetrics().recordsRead() and .bytesRead() methods.
However, taskEnd.taskMetrics().outputMetrics().recordsWritten() and
.bytesWritten() are always 0. I see the same behavior if I use the
stageCompleted event instead.
To trigger execution, I am using df.write.parquet(path). If I use
df.rdd.saveAsTextFile(path) instead, the record and byte counts are correct.
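A minimal reproduction exercising both code paths (the output paths are placeholders):

{code:python}
df = spark.range(0, 100000)

# DataFrame write path: the listener above sees
# recordsWritten/bytesWritten == 0 for every task.
df.write.mode("overwrite").parquet("/tmp/metrics-test-parquet")

# RDD write path: the same listener reports correct counts and bytes.
df.rdd.saveAsTextFile("/tmp/metrics-test-text")
{code}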
Another clue that this bug lies deeper in Spark SQL: the Spark Application
Master doesn't show the Output Size / Records column with df.write.parquet or
df.write.text, but does with df.rdd.saveAsTextFile. Since the Spark Application
Master also gets its metrics via the listener interface, this seems related.
There is a related issue, SPARK-21882
(https://issues.apache.org/jira/browse/SPARK-21882), but I believe this to be
a distinct problem.