GitHub user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/7770#issuecomment-127071752
    
    Review status: 31 of 49 files reviewed at latest revision, 16 unresolved discussions, some commit checks failed.
    
    ---
    
    
    <sup>**[core/src/main/java/org/apache/spark/shuffle/unsafe/UnsafeShuffleExternalSorter.java, line 89 \[r1\]](https://reviewable.io:443/reviews/apache/spark/7770#-JvVHimfV9ZktQGXtvJZ-r1-89)** ([raw file](https://github.com/apache/spark/blob/5b5e6f36b8a0e37f1953e12c438e01c58872e5fa/core/src/main/java/org/apache/spark/shuffle/unsafe/UnsafeShuffleExternalSorter.java#L89)):</sup>
    fixed.
    
    ---
    
    
    <sup>**[core/src/main/java/org/apache/spark/shuffle/unsafe/UnsafeShuffleWriter.java, line 455 \[r11\]](https://reviewable.io:443/reviews/apache/spark/7770#-JvjyFnipw8r_uM6llmJ)** ([raw file](https://github.com/apache/spark/blob/6aa2f7a8c2f4eb1de6281593326dce5a92d5c1e3/core/src/main/java/org/apache/spark/shuffle/unsafe/UnsafeShuffleWriter.java#L455)):</sup>
    UnsafeShuffleWriterSuite. I can add that to the comment.
    
    ---
    
    
    <sup>**[core/src/main/java/org/apache/spark/shuffle/unsafe/UnsafeShuffleWriter.java, line 459 \[r11\]](https://reviewable.io:443/reviews/apache/spark/7770#-JvjySxUYG3yq_glTmT-)** ([raw file](https://github.com/apache/spark/blob/6aa2f7a8c2f4eb1de6281593326dce5a92d5c1e3/core/src/main/java/org/apache/spark/shuffle/unsafe/UnsafeShuffleWriter.java#L459)):</sup>
    It didn't work when I tried, but I can see if there's a way to make it work again.
    
    ---
    
    <sup>**[core/src/main/scala/org/apache/spark/Accumulators.scala, line 157 \[r11\]](https://reviewable.io:443/reviews/apache/spark/7770#-Jvjz0QAGo9oQR2jYhzP)** ([raw file](https://github.com/apache/spark/blob/6aa2f7a8c2f4eb1de6281593326dce5a92d5c1e3/core/src/main/scala/org/apache/spark/Accumulators.scala#L157)):</sup>
    sounds good
    
    ---
    
    <sup>**[core/src/main/scala/org/apache/spark/Accumulators.scala, line 264 \[r11\]](https://reviewable.io:443/reviews/apache/spark/7770#-JvjzLl6KJl6UVVA1CmA)** ([raw file](https://github.com/apache/spark/blob/6aa2f7a8c2f4eb1de6281593326dce5a92d5c1e3/core/src/main/scala/org/apache/spark/Accumulators.scala#L264)):</sup>
    ok
    
    ---
    
    <sup>**[core/src/main/scala/org/apache/spark/Accumulators.scala, line 268 \[r11\]](https://reviewable.io:443/reviews/apache/spark/7770#-JvjzK9DHuqJ_ogH2Wg4)** ([raw file](https://github.com/apache/spark/blob/6aa2f7a8c2f4eb1de6281593326dce5a92d5c1e3/core/src/main/scala/org/apache/spark/Accumulators.scala#L268)):</sup>
    k
    
    ---
    
    <sup>**[core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala, line 791 \[r1\]](https://reviewable.io:443/reviews/apache/spark/7770#-JvVHinFI4nT0TEsx8sk-r1-791)** ([raw file](https://github.com/apache/spark/blob/5b5e6f36b8a0e37f1953e12c438e01c58872e5fa/core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala#L791)):</sup>
    This only deals with internal accumulators, which are new, right? What old behavior are you referring to?
    
    ---
    
    <sup>**[core/src/main/scala/org/apache/spark/scheduler/Stage.scala, line 78 \[r11\]](https://reviewable.io:443/reviews/apache/spark/7770#-Jvk-IpC7oolKQcwsJeO)** ([raw file](https://github.com/apache/spark/blob/6aa2f7a8c2f4eb1de6281593326dce5a92d5c1e3/core/src/main/scala/org/apache/spark/scheduler/Stage.scala#L78)):</sup>
    sure
    
    ---
    
    <sup>**[core/src/main/scala/org/apache/spark/TaskContext.scala, line 65 \[r11\]](https://reviewable.io:443/reviews/apache/spark/7770#-Jvk-YWmoTLAUs2KfDN3)** ([raw file](https://github.com/apache/spark/blob/6aa2f7a8c2f4eb1de6281593326dce5a92d5c1e3/core/src/main/scala/org/apache/spark/TaskContext.scala#L65)):</sup>
    ok
    
    ---
    
    <sup>**[core/src/main/scala/org/apache/spark/TaskContext.scala, line 67 \[r11\]](https://reviewable.io:443/reviews/apache/spark/7770#-Jvk-blSJZ0ZULb49BEx)** ([raw file](https://github.com/apache/spark/blob/6aa2f7a8c2f4eb1de6281593326dce5a92d5c1e3/core/src/main/scala/org/apache/spark/TaskContext.scala#L67)):</sup>
    I tried it. I couldn't get it to compile. It was complaining that `TaskContext` has no member `empty()` or something.
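    
    For reference, a factory of the kind being discussed might look roughly like the sketch below. This is a hypothetical illustration, not the actual `TaskContext` API at this revision; the class name, fields, and defaults are all assumptions.
    
    ```scala
    // Hypothetical sketch of a test-only factory that builds a task context
    // with dummy values. A method like this would have to exist on the
    // companion object for a call such as TaskContext.empty() to compile;
    // the field names and defaults here are illustrative assumptions, not
    // the real TaskContextImpl signature.
    case class DummyTaskContext(
        stageId: Int = 0,
        partitionId: Int = 0,
        taskAttemptId: Long = 0L,
        attemptNumber: Int = 0)
    
    object DummyTaskContext {
      // Tests could then obtain a throwaway context via DummyTaskContext.empty().
      def empty(): DummyTaskContext = DummyTaskContext()
    }
    ```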
    
    ---
    
    <sup>**[core/src/test/scala/org/apache/spark/CacheManagerSuite.scala, line 89 \[r11\]](https://reviewable.io:443/reviews/apache/spark/7770#-Jvk1DKhxW_LXgEZcwle)** ([raw file](https://github.com/apache/spark/blob/6aa2f7a8c2f4eb1de6281593326dce5a92d5c1e3/core/src/test/scala/org/apache/spark/CacheManagerSuite.scala#L89)):</sup>
    k, let's do that outside of this patch.
    
    ---
    
    
    <sup>**[sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/UnsafeFixedWidthAggregationMap.java, line 206 \[r1\]](https://reviewable.io:443/reviews/apache/spark/7770#-JvVHinOBxo83OCW0DdG-r1-212)** ([raw file](https://github.com/apache/spark/blob/5b5e6f36b8a0e37f1953e12c438e01c58872e5fa/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/UnsafeFixedWidthAggregationMap.java#L206)):</sup>
    Done.
    
    ---
    
    <sup>**[sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala, line 1621 \[r11\]](https://reviewable.io:443/reviews/apache/spark/7770#-Jvk1l80_3dvUKbI95U6)** ([raw file](https://github.com/apache/spark/blob/6aa2f7a8c2f4eb1de6281593326dce5a92d5c1e3/sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala#L1621)):</sup>
    sure
    
    ---
    
    
    Comments from the [review on Reviewable.io](https://reviewable.io:443/reviews/apache/spark/7770)


