Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18840
@joseph-torres could you change the PR title to "[SPARK-21565]**[SS]**
Propagate metadata in attribute replacement"? We usually put the module name in
the PR title.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18840
LGTM pending tests.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18848#discussion_r131486596
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala
---
@@ -638,4 +625,28 @@ object DataSource extends Logging
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18848
cc @gatorsmile
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18848
[SPARK-21374][CORE] Fix reading globbed paths from S3 into DF with disabled
FS cache
## What changes were proposed in this pull request?
This PR replaces #18623 to do some cleanup
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18840#discussion_r131309601
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/streaming/EventTimeWatermarkSuite.scala
---
@@ -391,6 +391,30 @@ class EventTimeWatermarkSuite
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18840
ok to test
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18623#discussion_r131283705
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala
---
@@ -132,7 +132,7 @@ case class DataSource
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18822
Thanks! Merging to master and 2.2.
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18822
[SPARK-21546] dropDuplicates should ignore watermark when it's not a key
## What changes were proposed in this pull request?
When the watermark is not a column of `dropDuplicates`, right
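The fix's intent — duplicate identity is determined by the key columns alone, while the event-time column only drives state cleanup — can be illustrated with plain collections. This is a hypothetical sketch, not the Spark API:

```java
import java.util.*;

// Plain-Java illustration (not Spark's dropDuplicates): deduplicate on id
// only, keeping the first-seen event; the timestamp is not part of the key.
class DedupSketch {
    static final class Event {
        final String id;
        final long ts;
        Event(String id, long ts) { this.id = id; this.ts = ts; }
    }

    static List<Event> dropDuplicatesById(List<Event> events) {
        // LinkedHashMap preserves arrival order; putIfAbsent ignores
        // later duplicates even when their timestamps differ.
        Map<String, Event> seen = new LinkedHashMap<>();
        for (Event e : events) {
            seen.putIfAbsent(e.id, e);
        }
        return new ArrayList<>(seen.values());
    }
}
```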
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18803
Thanks! Merging to master and 2.2.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18803
retest this please
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18796
Thanks! Merging to master
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18796
LGTM
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18803
[SPARK-21597][SS]Fix a potential overflow issue in EventTimeStats
## What changes were proposed in this pull request?
This PR fixed a potential overflow issue in EventTimeStats
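Statistics like these can overflow if they accumulate a raw `long` sum of epoch-millisecond values. A minimal hypothetical sketch of the overflow-safe alternative (an incremental mean) — illustrative only, not Spark's actual `EventTimeStats` code:

```java
// Minimal sketch of an overflow-safe running aggregate over event times.
// Hypothetical stand-in for the kind of fix described above.
class RunningEventTimeStats {
    long max = Long.MIN_VALUE;
    long min = Long.MAX_VALUE;
    double avg = 0.0;   // a double mean instead of a long sum that could overflow
    long count = 0;

    void add(long eventTime) {
        max = Math.max(max, eventTime);
        min = Math.min(min, eventTime);
        count += 1;
        // Incremental mean update: never accumulates a raw sum of epoch
        // values, so adding many large timestamps cannot overflow a long.
        avg += (eventTime - avg) / count;
    }
}
```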
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18799
[SPARK-21596][SS]Ensure places calling HDFSMetadataLog.get check the return
value
## What changes were proposed in this pull request?
When I was investigating a flaky test, I realized
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18789#discussion_r130686972
--- Diff: sql/core/pom.xml ---
@@ -101,7 +101,7 @@
com.fasterxml.jackson.core
jackson-databind
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18790
ok to test
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18723
Thanks! Merging to master.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18723
LGTM
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18723
ok to test
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18388
@jinxing64 Sorry, I forgot to mention one request. Could you add a unit
test? Right now it's disabled, so the new code is not tested. It will help
avoid some obvious mistakes, such as the missing
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128882147
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/server/TransportRequestHandler.java
---
@@ -130,11 +143,25 @@ private void
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128881079
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/server/TransportRequestHandler.java
---
@@ -130,11 +143,25 @@ private void
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128879751
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/server/TransportRequestHandler.java
---
@@ -130,11 +143,25 @@ private void
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128879607
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/server/OneForOneStreamManager.java
---
@@ -96,18 +103,23 @@ public ManagedBuffer
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128879502
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/server/OneForOneStreamManager.java
---
@@ -122,6 +134,7 @@ public void
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128879315
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/server/OneForOneStreamManager.java
---
@@ -53,9 +56,13
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128879045
--- Diff: docs/configuration.md ---
@@ -1809,6 +1809,14 @@ Apart from these, the following properties are also
available, and may be useful
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128878875
--- Diff: docs/configuration.md ---
@@ -1809,6 +1809,14 @@ Apart from these, the following properties are also
available, and may be useful
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128878596
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/server/TransportRequestHandler.java
---
@@ -130,11 +143,25 @@ private void
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128878556
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/server/TransportRequestHandler.java
---
@@ -118,6 +124,13 @@ private void
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128878335
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/util/TransportConf.java
---
@@ -257,4 +257,7 @@ public Properties cryptoConf
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18707#discussion_r128876674
--- Diff: core/src/main/scala/org/apache/spark/ui/exec/ExecutorsTab.scala
---
@@ -140,6 +140,8 @@ class ExecutorsListener(storageStatusListener
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r128688973
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/util/TransportConf.java
---
@@ -257,4 +257,11 @@ public Properties cryptoConf
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18642#discussion_r128632530
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/KeyValueGroupedDataset.scala ---
@@ -242,16 +242,7 @@ class KeyValueGroupedDataset[K, V] private[sql
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18676
LGTM
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18623
ok to test
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18670
@ConeyLiu the network layer doesn't know how to serialize Throwable, or in
other words, it cannot use JavaSerializer in Spark core.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18675#discussion_r128091870
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/statefulOperators.scala
---
@@ -79,6 +79,7 @@ trait StateStoreWriter extends
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18639
LGTM
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18670
As @vanzin pointed out, this is done on purpose. RpcResponseCallback and
RpcCallContext are in different modules. RpcResponseCallback is a low-level
API, and RpcCallContext is on top
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18660
> I don't think we're actually trying to ship these values anywhere.
I see. They are static classes.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18661
LGTM pending tests
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18660
Is it safe to just ignore them? Maybe we should recover them in the
`readExternal/read` method?
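For illustration, a hedged sketch of what that recovery can look like — a hypothetical `CachedSquare` class using the standard `java.io.Externalizable` API, re-deriving a transient field in `readExternal` instead of shipping it on the wire:

```java
import java.io.*;

// Hypothetical example: only the source field is serialized; the derived
// field is recomputed during deserialization in readExternal.
class CachedSquare implements Externalizable {
    int n;
    transient long square;   // derived, intentionally not serialized

    public CachedSquare() {}                       // no-arg ctor required by Externalizable
    public CachedSquare(int n) { this.n = n; this.square = (long) n * n; }

    @Override
    public void writeExternal(ObjectOutput out) throws IOException {
        out.writeInt(n);                           // only the source field goes out
    }

    @Override
    public void readExternal(ObjectInput in) throws IOException {
        n = in.readInt();
        square = (long) n * n;                     // recover the derived field here
    }
}
```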
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18629
LGTM. Merging to master.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18638
Thanks! Merging to master.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18638
cc @marmbrus
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18638
[SPARK-21421][SS]Add the query id as a local property to allow source and
sink using it
## What changes were proposed in this pull request?
Add the query id as a local property to allow
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18629#discussion_r127519438
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/HDFSBackedStateStoreProvider.scala
---
@@ -172,7 +172,9 @@ private[state
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r127131801
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/util/TransportConf.java
---
@@ -257,4 +257,31 @@ public Properties cryptoConf
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18388#discussion_r127131671
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/util/PooledByteBufAllocatorWithMetrics.java
---
@@ -0,0 +1,70
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18357
Thanks! Merging to master.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18143
@ScrapCodes I think it should be bounded by
`spark.sql.kafkaConsumerCache.capacity`.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18357
retest this please
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18357
@devaraj-kavali Sorry, I forgot this PR. I will trigger a new run since
master has been updated a lot, and will set a reminder to merge this PR :)
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18357
retest this please
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/14718#discussion_r126761513
--- Diff: common/network-common/pom.xml ---
@@ -45,6 +45,22 @@
commons-lang3
+
+ org.fusesource.leveldbjni
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/14718#discussion_r126761181
--- Diff: common/network-common/pom.xml ---
@@ -45,6 +45,22 @@
commons-lang3
+
+ org.fusesource.leveldbjni
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18593
@vanzin it doesn't work. I added scala-library into the test scope and
changed `org.apache.spark:spark-tags` back to the compile scope, but the build
didn't fail.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18593
cc @cloud-fan @jinxing64
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18593#discussion_r126532474
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/server/OneForOneStreamManager.java
---
@@ -98,21 +96,16 @@ public ManagedBuffer
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18593#discussion_r126532371
--- Diff: common/network-common/pom.xml ---
@@ -90,7 +90,8 @@
org.apache.spark
spark-tags_${scala.binary.version
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18593
[SPARK-21369][Core]Don't use Scala Tuple in common/network-*
## What changes were proposed in this pull request?
Remove all usages of Scala Tuple from common/network-* projects
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/14718#discussion_r126521226
--- Diff: common/network-common/pom.xml ---
@@ -45,6 +45,22 @@
commons-lang3
+
+ org.fusesource.leveldbjni
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18565
@jinxing64 I submitted https://github.com/jinxing64/spark/pull/1 to your
repo to fix a potential file leak. Otherwise, this looks good to me.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18565#discussion_r126291505
--- Diff:
core/src/main/scala/org/apache/spark/network/netty/NettyBlockTransferService.scala
---
@@ -53,6 +53,7 @@ private[spark] class
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18562
Thanks! LGTM. Merging to master and 2.2.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18562#discussion_r126264346
--- Diff: docs/structured-streaming-programming-guide.md ---
@@ -547,6 +549,19 @@ Here are the details of all the sources in Spark
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18388
@jinxing64 Since
[ExternalShuffleService](https://github.com/apache/spark/blob/a0fe32a219253f0abe9d67cf178c73daf5f6fcc1/core/src/main/scala/org/apache/spark/deploy/ExternalShuffleService.scala#L55
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18565#discussion_r126242905
--- Diff:
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/OneForOneBlockFetcher.java
---
@@ -151,15 +152,27 @@ private void
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18565#discussion_r126242576
--- Diff:
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/OneForOneBlockFetcher.java
---
@@ -151,15 +152,27 @@ private void
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18565#discussion_r126259778
--- Diff:
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/OneForOneBlockFetcher.java
---
@@ -151,15 +152,27 @@ private void
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18566
@jinxing64 yeah, please also update configuration.md.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18388
> there are 200K+ connections and 3.5M blocks(FileSegmentManagedBuffer)
being fetched.
Did you use a large `spark.shuffle.io.numConnectionsPerPeer`? If not, the
number of connections se
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18509
Thanks! Merging to master.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18482
On second thought, I don't think we need this PR. We can disable
`spark.reducer.maxReqSizeShuffleToMem` by default. Let's just document that this
configuration will break old shuffle service
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18485
Thanks! Merging to master and 2.2.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18503#discussion_r126035228
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/HDFSBackedStateStoreProvider.scala
---
@@ -350,20 +350,24 @@ private
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18485
LGTM
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18357
LGTM. Pending tests. Thanks!
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18509#discussion_r125819776
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/EventTimeWatermarkExec.scala
---
@@ -81,7 +81,7 @@ class EventTimeStatsAccum
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18357#discussion_r125817628
--- Diff: core/src/main/scala/org/apache/spark/deploy/master/Master.scala
---
@@ -1037,6 +1037,7 @@ private[deploy] object Master extends Logging
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18357#discussion_r125817583
--- Diff:
core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala
---
@@ -20,29 +20,29 @@ package org.apache.spark.util
import
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18357#discussion_r125817503
--- Diff: core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
---
@@ -737,6 +737,7 @@ private[deploy] object Worker extends Logging
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18357#discussion_r125817451
--- Diff:
core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala
---
@@ -20,29 +20,29 @@ package org.apache.spark.util
import
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18461
Thanks! Merging to master.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18485#discussion_r125781354
--- Diff: docs/structured-streaming-programming-guide.md ---
@@ -1971,8 +2011,23 @@ write.stream(aggDF, "memory", outputMode =
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18509#discussion_r125346130
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/EventTimeWatermarkExec.scala
---
@@ -81,7 +81,7 @@ class EventTimeStatsAccum
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18490
I cannot reproduce this issue. Could you provide a unit test to reproduce
this?
Anyway, I suggest using `kryo.register(classOf[HighlyCompressedMapStatus],
new KryoJavaSerializer
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18485#discussion_r125128901
--- Diff: docs/structured-streaming-programming-guide.md ---
@@ -1971,8 +2011,23 @@ write.stream(aggDF, "memory", outputMode =
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18485#discussion_r125128668
--- Diff: docs/structured-streaming-programming-guide.md ---
@@ -1922,6 +1953,15 @@ Not available in R.
+### Reporting Metrics using
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18485#discussion_r125127821
--- Diff: docs/index.md ---
@@ -88,13 +89,13 @@ options for deployment:
**Programming Guides:**
* [Quick Start](quick-start.html): a quick
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18485#discussion_r125127552
--- Diff: docs/_layouts/global.html ---
@@ -69,14 +69,14 @@
Programming Guides
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18490
I don't get it. Could you point out which place serializes MapStatus using
Kryo?
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18478
Verified that both the Scala 2.10 and 2.11 builds pass locally. Since the
Jenkins PR build doesn't use Scala 2.10, I'm going to merge directly.
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18478
[SPARK-21253][Core]Fix Scala 2.10 build
## What changes were proposed in this pull request?
A follow up PR to fix Scala 2.10 build for #18472
## How was this patch tested
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/16989#discussion_r124959257
--- Diff:
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/OneForOneBlockFetcher.java
---
@@ -126,4 +150,38 @@ private void
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18472#discussion_r124955416
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/client/TransportResponseHandler.java
---
@@ -104,15 +106,31 @@ public void
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18357#discussion_r124944873
--- Diff:
core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala
---
@@ -26,27 +26,34 @@ import org.apache.spark.internal.Logging