Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18357#discussion_r124940352
--- Diff:
core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala
---
@@ -26,27 +26,34 @@ import org.apache.spark.internal.Logging
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18357#discussion_r124940265
--- Diff:
core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala
---
@@ -26,27 +26,34 @@ import org.apache.spark.internal.Logging
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18466
FYI, I'm fixing the root issue in https://github.com/apache/spark/pull/18472
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18472
cc @vanzin
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18472
[SPARK-21253][Core]Fix a bug that StreamCallback may not be notified if
network errors happen
## What changes were proposed in this pull request?
If a network error happens before
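The description above is truncated; as a rough illustration of the failure-notification pattern the title describes (the names and structure below are assumptions, not the actual patch), a response handler can track the active stream callback so a channel-level error can still notify it:

```scala
// Hedged sketch, not the actual fix in #18472: without a hook like
// channelError, a callback registered before a network failure would
// never learn that its stream died.
trait StreamCallback {
  def onComplete(streamId: String): Unit
  def onFailure(streamId: String, cause: Throwable): Unit
}

class ResponseHandler {
  // Hypothetical field; the real handler tracks all outstanding requests.
  @volatile private var active: Option[(String, StreamCallback)] = None

  def startStream(streamId: String, cb: StreamCallback): Unit =
    active = Some((streamId, cb))

  // Invoked when the underlying channel hits a network error.
  def channelError(cause: Throwable): Unit = {
    active.foreach { case (id, cb) => cb.onFailure(id, cause) }
    active = None
  }
}
```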
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18400
LGTM. Merging to master. Thanks!
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/16989#discussion_r124913946
--- Diff:
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/OneForOneBlockFetcher.java
---
@@ -126,4 +150,38 @@ private void
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/16989#discussion_r124908306
--- Diff:
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/OneForOneBlockFetcher.java
---
@@ -126,4 +150,38 @@ private void
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18466
@dongjoon-hyun I have not yet figured out the root cause of this issue. The
major reason to disable it is that this feature breaks the old shuffle service.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18466
@wangyum I submitted #18466 to disable this feature via configuration
instead. You will be the commit author when it's merged.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18467#discussion_r124886725
--- Diff: docs/configuration.md ---
@@ -529,14 +529,6 @@ Apart from these, the following properties are also
available, and may be useful
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/16989#discussion_r124886621
--- Diff:
common/network-common/src/main/java/org/apache/spark/network/server/OneForOneStreamManager.java
---
@@ -95,6 +97,25 @@ public ManagedBuffer
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18467
cc @JoshRosen As it's impossible to safely revert #16989, I just changed
the default value to Long.MaxValue.
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18467
[SPARK-21253][Core]Disable spark.reducer.maxReqSizeShuffleToMem
## What changes were proposed in this pull request?
Disable spark.reducer.maxReqSizeShuffleToMem because it breaks the old
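Combined with the earlier comment that the default was changed to Long.MaxValue rather than reverting #16989, the effective "disable" looks roughly like the following sketch (not the actual diff):

```scala
import org.apache.spark.SparkConf

// Sketch: with the threshold at Long.MaxValue, no fetch request is ever
// considered large enough to take the new shuffle-to-memory code path,
// which effectively disables the feature without reverting it.
val conf = new SparkConf()
  .set("spark.reducer.maxReqSizeShuffleToMem", Long.MaxValue.toString)
```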
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18461#discussion_r124876931
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/StreamExecution.scala
---
@@ -357,7 +357,7 @@ class StreamExecution
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18461#discussion_r124876835
--- Diff:
core/src/test/scala/org/apache/spark/util/UninterruptibleThreadSuite.scala ---
@@ -68,7 +68,6 @@ class UninterruptibleThreadSuite extends
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18461
cc @tdas
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18461#discussion_r124679045
--- Diff:
core/src/test/scala/org/apache/spark/util/UninterruptibleThreadSuite.scala ---
@@ -80,8 +79,8 @@ class UninterruptibleThreadSuite extends
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18461
[SPARK-21248][SS]The clean up codes in StreamExecution should not be
interrupted
## What changes were proposed in this pull request?
This PR uses `runUninterruptibly` to avoid
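The pattern the description points at can be sketched as follows; `UninterruptibleThread` and the clean-up steps are Spark internals, so treat the names as assumptions:

```scala
// Sketch only (Spark-internal API): StreamExecution's stream thread is an
// UninterruptibleThread. runUninterruptibly defers any Thread.interrupt()
// delivered during the block until the block completes, so stopping the
// query cannot abort its own clean-up half-way.
streamThread.runUninterruptibly {
  stopSources()          // hypothetical clean-up step: close all sources
  postTerminatedEvent()  // hypothetical clean-up step: notify listeners
}
```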
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18450#discussion_r124610327
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/ui/SQLListener.scala ---
@@ -314,6 +314,8 @@ class SQLListener(conf: SparkConf) extends
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18426
LGTM. Merging to master. Thanks!
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18426#discussion_r124171594
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/IncrementalExecution.scala
---
@@ -47,11 +47,16 @@ class IncrementalExecution
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18426#discussion_r124123607
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/IncrementalExecution.scala
---
@@ -47,11 +47,16 @@ class IncrementalExecution
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18426
nit: Could you add `[SS]` to the title?
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18426#discussion_r124122009
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/IncrementalExecution.scala
---
@@ -47,11 +47,16 @@ class IncrementalExecution
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18364
LGTM. Merging to master.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18400
retest this please
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18400
ok to test.
cc @JoshRosen
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18402
LGTM. Thanks! Merging to master.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18383
@dijingran could you fix the JIRA number in the PR title?
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18347
@lubozhan do you have a JIRA account?
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18347
LGTM. Merging to master. Thanks!
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18381
Thanks! Merging to master, 2.2 and 2.1.
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18381
[SPARK-21167][SS]Decode the path generated by File sink to handle special
characters
## What changes were proposed in this pull request?
Decode the path generated by File sink to handle
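The decoding the PR title refers to can be illustrated with plain `java.net.URI` (an illustration of percent-decoding, not the actual patch):

```scala
import java.net.URI

// A partition file whose name contains a space ends up percent-encoded
// when round-tripped through a URI; getPath returns the decoded form.
val encoded = "/tmp/output/part%20file"
val decoded = new URI(encoded).getPath
// decoded == "/tmp/output/part file"
```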
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18357
@devaraj-kavali SparkUncaughtExceptionHandler is a singleton so you cannot
get the configuration from SparkConf.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18355
retest this please
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18355
LGTM. Just one nit.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18355#discussion_r123357731
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/state/StateStoreSuite.scala
---
@@ -408,12 +413,60 @@ class StateStoreSuite
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18357
LGTM. I thought we already did it :(
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18357
ok to test
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18347
Looks pretty good. Left some comments.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18347
ok to test.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18347#discussion_r123322259
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/console.scala
---
@@ -60,5 +70,23 @@ class ConsoleSinkProvider extends
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18347#discussion_r123321846
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/console.scala
---
@@ -51,7 +53,15 @@ class ConsoleSink(options: Map[String
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18365
LGTM. Merging to master. As this is a slight behavior change (from no-op to
throwing an exception), I will merge this to branch 2.2 only if RC5 fails.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18365
Good catch. Could you also fix RateSourceProvider?
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18363
@assafmendelson Thanks! I merged it. Could you close this PR, please?
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18363
ok to test
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18347
It's better to make `ConsoleSinkProvider` implement
`CreatableRelationProvider` instead.
---
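The suggestion above can be sketched as follows; the `createRelation` signature comes from Spark's data source API, while the body is purely illustrative:

```scala
import org.apache.spark.sql.{DataFrame, SQLContext, SaveMode}
import org.apache.spark.sql.sources.{BaseRelation, CreatableRelationProvider}
import org.apache.spark.sql.types.StructType

// Sketch: implementing CreatableRelationProvider lets
// df.write.format("console") work for batch queries, not just streaming.
class ConsoleSinkProvider extends CreatableRelationProvider {
  override def createRelation(
      sqlContext: SQLContext,
      mode: SaveMode,
      parameters: Map[String, String],
      data: DataFrame): BaseRelation = {
    // "Writing" to the console just means printing the rows.
    val numRows = parameters.get("numRows").map(_.toInt).getOrElse(20)
    data.show(numRows)
    // A dummy relation is enough for a write-only source (illustrative).
    val ctx = sqlContext
    new BaseRelation {
      override def sqlContext: SQLContext = ctx
      override def schema: StructType = data.schema
    }
  }
}
```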
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18342
@assafmendelson do you have an Apache JIRA account? I'm going to assign
this ticket to you.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18342
LGTM. Merging to master and 2.2. @assafmendelson could you also submit a PR
to fix branch-2.1? Thanks!
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18306#discussion_r122092151
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
---
@@ -690,6 +690,7 @@ class SparkSession private(
* @since 2.0.0
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18306#discussion_r122091309
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/streaming/StreamingQueryManager.scala
---
@@ -321,6 +321,17 @@ class StreamingQueryManager private
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18199
I also merged this to branch 2.2 since this is a separated feature.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18199
Thanks! Merging to master.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18199
> Just noticed that we don't have a nice toString method for this source.
Can be added in a follow up.
Let me just do it now, since it's pretty easy.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18199
@brkyvz I changed it to use double to simplify the code.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18199#discussion_r121224007
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/RateSourceProvider.scala
---
@@ -0,0 +1,279 @@
+/*
+ * Licensed
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18199#discussion_r121200038
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/RateSourceSuite.scala
---
@@ -0,0 +1,195 @@
+/*
+ * Licensed
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18143
@brkyvz I think consumers and producers have different cache strategies, and
sharing the same interface seems weird. We can share the same producer across
multiple tasks at the same time, but that's
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18199#discussion_r120508750
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/RateSourceProvider.scala
---
@@ -199,13 +199,52 @@ class RateStreamSource
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18208
Thanks. Merging to master.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18199#discussion_r120243433
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/RateSourceProvider.scala
---
@@ -0,0 +1,240 @@
+/*
+ * Licensed
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18199#discussion_r120222903
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/RateSourceProvider.scala
---
@@ -0,0 +1,240 @@
+/*
+ * Licensed
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18199#discussion_r120219958
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/RateSourceProvider.scala
---
@@ -0,0 +1,240 @@
+/*
+ * Licensed
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18199#discussion_r120212469
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/RateSourceProvider.scala
---
@@ -0,0 +1,208 @@
+/*
+ * Licensed
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18199#discussion_r120200602
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/RateSourceProvider.scala
---
@@ -0,0 +1,229 @@
+/*
+ * Licensed
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18208
ok to test
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18208
LGTM. Pending tests.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18199#discussion_r120177137
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/RateSourceSuite.scala
---
@@ -0,0 +1,148 @@
+/*
+ * Licensed
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18199
retest this please
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18199
[SPARK-20979][SS]Add RateSource to generate values for tests and benchmark
## What changes were proposed in this pull request?
This PR adds RateSource for Structured Streaming so
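Usage of such a rate source looks roughly like the following (the option name is an assumption based on rate-source conventions; `spark` is a `SparkSession`):

```scala
// Sketch: a source that emits (timestamp, value) rows at a fixed rate,
// handy for tests and benchmarks because it needs no external system.
val stream = spark.readStream
  .format("rate")
  .option("rowsPerSecond", "10")  // assumed option name
  .load()
```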
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18166
Sorry. It's caused by another commit.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18166
@vanzin Some REPL tests start to fail after this patch, such as
https://spark-tests.appspot.com/builds/spark-master-test-sbt-hadoop-2.6/3045
could you take a look?
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18177
Thanks! Merging to master and 2.2.
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18180
[SPARK-20957][SS][Tests]Fix o.a.s.sql.streaming.StreamingQueryManagerSuite
listing
## What changes were proposed in this pull request?
When stopping StreamingQuery, StreamExecution
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18148
@vanzin the whole catalyst project is private.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18177
@srowen Considering Spark keeps 1000 stages in the UI, if each stage has 1
tasks, the duplicated strings will be a lot. I observed this: about 150MB in a
heap dump. Of course, it's not significantly
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18179
cc @tdas
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18179#discussion_r119752541
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/streaming/StreamSuite.scala ---
@@ -614,6 +614,25 @@ class StreamSuite extends StreamTest
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18149#discussion_r119752487
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/streaming/StreamSuite.scala ---
@@ -617,6 +617,25 @@ class StreamSuite extends StreamTest
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18179
[SPARK-20894][SS] Resolve the checkpoint location in driver and use the
resolved path in state store (branch-2.2)
## What changes were proposed in this pull request?
Backport #18149
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18166
FYI, 2.1 and 2.0 are broken now
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18177
cc @JoshRosen
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18177#discussion_r119725177
--- Diff: core/src/main/scala/org/apache/spark/ui/jobs/UIData.scala ---
@@ -155,8 +165,8 @@ private[spark] object UIData {
index
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18177
[SPARK-20955][Core]Intern "executorId" to reduce the memory usage
## What changes were proposed in this pull request?
In [this
line](https://github.com/apache/
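The deduplication trick here is plain `String.intern()`: many task rows carry the same executor ID text in distinct `String` objects, and interning collapses them into one canonical instance. A self-contained sketch:

```scala
// Two String objects with equal contents but distinct identities:
val a = new String("executor-1")
val b = new String("executor-1")
assert(a ne b)  // different objects on the heap

// intern() returns the single canonical copy from the JVM's string pool,
// so thousands of UI rows can share one object instead of one each.
assert(a.intern() eq b.intern())
```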
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18168
> Lgtm, I wonder if an Error is thrown anywhere else. Really shouldn't be.
@srowen IllegalAccessError is not used in other places. However, I didn't
search other types of err
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18168
Thanks! Merging to master, 2.2, 2.1 and 2.0.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18149
Thanks! Merging to master and 2.2.
---
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18168
cc @cloud-fan
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18168
[SPARK-20940][Core]Replace IllegalAccessError with IllegalStateException
## What changes were proposed in this pull request?
`IllegalAccessError` is a fatal error (a subclass
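The distinction matters because `IllegalAccessError` extends `Error` (via `LinkageError`), so `scala.util.control.NonFatal` and generic exception handlers let it kill the thread, while `IllegalStateException` is an ordinary `RuntimeException`. A quick demonstration:

```scala
import scala.util.control.NonFatal

// Mirrors the common catch-NonFatal pattern in Spark code.
def swallowNonFatal(body: => Unit): Boolean =
  try { body; true }
  catch { case NonFatal(_) => false }

// IllegalStateException is caught by NonFatal...
assert(!swallowNonFatal(throw new IllegalStateException("bad state")))

// ...but IllegalAccessError is a LinkageError, which NonFatal does not
// match, so it propagates past the handler.
val escaped =
  try { swallowNonFatal(throw new IllegalAccessError("fatal")); false }
  catch { case _: IllegalAccessError => true }
assert(escaped)
```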
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18107
Thanks! Merging to master.
---
GitHub user zsxwing opened a pull request:
https://github.com/apache/spark/pull/18149
[SPARK-20894][SS]Resolve the checkpoint location in driver and use the
resolved path in state store
## What changes were proposed in this pull request?
When the user runs a Structured
Github user zsxwing commented on the issue:
https://github.com/apache/spark/pull/18107
LGTM pending tests.
---
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18107#discussion_r118608662
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -828,6 +837,8 @@ class SQLConf extends Serializable with Logging
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18107#discussion_r119014976
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/statefulOperators.scala
---
@@ -273,27 +333,34 @@ case class
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18107#discussion_r119016141
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/statefulOperators.scala
---
@@ -165,54 +189,88 @@ case class StateStoreSaveExec
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18107#discussion_r118802674
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -552,6 +552,15 @@ object SQLConf {
.booleanConf
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18107#discussion_r119014965
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/statefulOperators.scala
---
@@ -253,6 +311,8 @@ case class StateStoreSaveExec
Github user zsxwing commented on a diff in the pull request:
https://github.com/apache/spark/pull/18107#discussion_r119014982
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/statefulOperators.scala
---
@@ -304,8 +371,9 @@ case class