Github user zsxwing commented on the pull request:
https://github.com/apache/spark/pull/7774#issuecomment-127468687
Review status: 7 of 22 files reviewed at latest revision, 67 unresolved
discussions, all commit checks successful.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/execution/basicOperators.scala,
line 112
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQehEcEYVaSyCrk-r4-112)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/execution/basicOperators.scala#L112)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/execution/SparkPlan.scala,
line 71
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvpjDe0mLMQg8-alRGn)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkPlan.scala#L71)):</sup>
Changed them to `private[sql]`.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SparkPlanGraph.scala,
line 52
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvpkizmZ3pVvaR_iuiG)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SparkPlanGraph.scala#L52)):</sup>
I need to return the next node id from `buildSparkPlanGraphNode` so that it can
be passed to the `buildSparkPlanGraphNode` call for the next child, but the
method already returns a `SparkPlanGraphNode`. So I use an `AtomicLong` to
carry the id instead.
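To illustrate the pattern (a minimal sketch with hypothetical, simplified class and method names, not the actual code in `SparkPlanGraph.scala`): since the recursive builder's return value is already taken by the constructed graph node, the id counter is threaded through as a mutable `AtomicLong` that each call advances.

```scala
import java.util.concurrent.atomic.AtomicLong
import scala.collection.mutable.ArrayBuffer

// Hypothetical, simplified shapes standing in for the real plan/graph types.
case class PlanNode(name: String, children: Seq[PlanNode])
case class GraphNode(id: Long, name: String)

// The builder must hand out a fresh id for every node it creates, but its
// return value is already used for the constructed graph node, so the
// counter travels through the recursion as an AtomicLong instead.
def buildGraphNode(
    plan: PlanNode,
    nextId: AtomicLong,
    nodes: ArrayBuffer[GraphNode]): GraphNode = {
  val node = GraphNode(nextId.getAndIncrement(), plan.name)
  nodes += node
  plan.children.foreach(child => buildGraphNode(child, nextId, nodes))
  node
}
```

Each recursive call sees the counter already advanced by its siblings, so ids stay unique across the whole tree without the builder having to return them.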
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 39
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQh1eM3HWsvMBR7-r4-39)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L39)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 64
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQh1eM3HWsvMBR7-r4-64)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L64)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 117
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQh1eM3HWsvMBR7-r4-117)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L117)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 142
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-Jvpm5Q5pWW4XPL9JV6S)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L142)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 157
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQh1eM3HWsvMBR7-r4-157)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L157)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 159
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQh1eM3HWsvMBR7-r4-159)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L159)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 172
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQh1eM3HWsvMBR7-r4-172)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L172)):</sup>
We don't update the accumulators in all cases; I just added an empty else
branch below.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 177
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQh1eM3HWsvMBR7-r4-177)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L177)):</sup>
I realized my previous code was confusing, so I added an empty else branch and
moved the comment there to clarify this race condition.
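The pattern being described can be sketched like this (names are hypothetical, not the actual `SQLListener` code): the empty else branch exists only so the comment explaining the race condition sits next to the branch it concerns.

```scala
// Minimal sketch with hypothetical names. Returns whether the update was
// applied, so the two branches are observable.
def updateTaskMetrics(knownExecutions: Set[Long], executionId: Long): Boolean = {
  if (knownExecutions.contains(executionId)) {
    // Normal path: record the accumulator updates for this execution.
    true
  } else {
    // Do nothing: the execution may already have finished and been cleaned
    // up before this late task-end event arrived, so there is nothing to
    // update. The branch is kept (rather than dropped) so this race
    // condition is documented where a reader will look for it.
    false
  }
}
```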
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 198
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvpnK83TUW9ZFVHe00-)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L198)):</sup>
`metrics` only contains the accumulator definitions. All SparkPlan nodes in a
tree should be unique, and each one registers its own accumulators, so
`toMap` is safe. I renamed it to `sqlPlanMetrics`, which is clearer.
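A minimal sketch of why `toMap` is safe here (hypothetical, simplified types, not the actual `SQLListener` code): each plan node owns metrics keyed by a globally unique accumulator id, so flattening the whole tree into a map cannot silently collapse two entries onto the same key.

```scala
// Hypothetical shapes: a metric is keyed by a unique accumulator id.
case class Metric(accumulatorId: Long, name: String)
case class Plan(metrics: Seq[Metric], children: Seq[Plan])

// Collect every (id, name) pair in the tree, then flatten to a map.
// toMap would keep only the last entry for a duplicated key, but since
// every node registers its own accumulators, the ids never collide.
def allMetrics(plan: Plan): Seq[(Long, String)] =
  plan.metrics.map(m => m.accumulatorId -> m.name) ++
    plan.children.flatMap(allMetrics)

def sqlPlanMetrics(plan: Plan): Map[Long, String] =
  allMetrics(plan).toMap
```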
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 221
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQh1eM3HWsvMBR7-r4-221)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L221)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 249
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQh1eM3HWsvMBR7-r4-249)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L249)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 252
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvppYcO2JgB6qKiIEyF)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L252)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala,
line 282
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQh1eM3HWsvMBR7-r4-282)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLListener.scala#L282)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLTab.scala, line
27
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQfuM98ojw4pcJu-r4-27)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLTab.scala#L27)):</sup>
Done.
---
<sup>**[sql/core/src/test/scala/org/apache/spark/sql/ui/SQLListenerSuite.scala,
line 231
\[r4\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQh1eM3HWsvMBR8-r4-231)**
([raw
file](https://github.com/apache/spark/blob/94065929603633714929c5ecbd43c2a65182552a/sql/core/src/test/scala/org/apache/spark/sql/ui/SQLListenerSuite.scala#L231)):</sup>
This won't be flaky because it calls SQLListener APIs directly.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSQLExecution.scala,
line 0
\[r1\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQfuM98ojw4pcJq-r1-38)**
([raw
file](https://github.com/apache/spark/blob/23abf73cafac3af0363486bdae91d737e235a197/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSQLExecution.scala#L0)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSQLExecution.scala,
line 0
\[r1\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQfuM98ojw4pcJq-r1-26)**
([raw
file](https://github.com/apache/spark/blob/23abf73cafac3af0363486bdae91d737e235a197/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSQLExecution.scala#L0)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLSparkListener.scala,
line 0
\[r1\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQfuM98ojw4pcJt-r1-28)**
([raw
file](https://github.com/apache/spark/blob/23abf73cafac3af0363486bdae91d737e235a197/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLSparkListener.scala#L0)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLSparkListener.scala,
line 0
\[r1\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQfuM98ojw4pcJt-r1-252)**
([raw
file](https://github.com/apache/spark/blob/23abf73cafac3af0363486bdae91d737e235a197/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLSparkListener.scala#L0)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLSparkListener.scala,
line 0
\[r1\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQfuM98ojw4pcJt-r1-246)**
([raw
file](https://github.com/apache/spark/blob/23abf73cafac3af0363486bdae91d737e235a197/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLSparkListener.scala#L0)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLSparkListener.scala,
line 0
\[r1\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQfuM98ojw4pcJt-r1-234)**
([raw
file](https://github.com/apache/spark/blob/23abf73cafac3af0363486bdae91d737e235a197/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLSparkListener.scala#L0)):</sup>
Done.
---
<sup>**[sql/core/src/main/scala/org/apache/spark/sql/ui/SQLSparkListener.scala,
line 0
\[r1\]](https://reviewable.io:443/reviews/apache/spark/7774#-JvokJQfuM98ojw4pcJt-r1-121)**
([raw
file](https://github.com/apache/spark/blob/23abf73cafac3af0363486bdae91d737e235a197/sql/core/src/main/scala/org/apache/spark/sql/ui/SQLSparkListener.scala#L0)):</sup>
Done.
---
Comments from the [review on
Reviewable.io](https://reviewable.io:443/reviews/apache/spark/7774)
<!-- Sent from Reviewable.io -->