Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/3768
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/4135
[SPARK-5205][Streaming]: Inconsistent behaviour between Streaming jobs and
others when clicking the kill link in the WebUI
The "kill" link is used to kill a stage in a job. It works in any kind of
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4008#issuecomment-70804703
@JoshRosen Since my local branch is out of date and this PR contains merge
conflicts, I will open a [new PR](https://github.com/apache/spark/pull/4135)
and close this
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/4008
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/2562
[SPARK-3712][STREAMING]: add a new UpdateDStream to update an RDD dynamically
Maybe we can achieve this by using the `foreachRDD` function, but it is
weird that way, because I need
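The `foreachRDD`-based workaround mentioned here can be illustrated without Spark. Below is a minimal Python sketch of the per-batch update pattern, where `run_stream`, `batches`, and `update` are hypothetical stand-ins for the DStream, its micro-batches, and the closure passed to `foreachRDD`; they are not Spark APIs:

```python
def run_stream(batches, update):
    """Apply a user-supplied closure to running state for each micro-batch.

    This mimics the shape of dstream.foreachRDD(closure): the closure sees
    the current batch plus whatever state it chooses to carry forward.
    """
    state = []  # stands in for the RDD being "updated" across batches
    for batch in batches:
        state = update(state, batch)  # the closure does the actual update
    return state

# Usage: fold three micro-batches into one running collection.
final = run_stream([[1, 2], [3], [4, 5]], lambda st, b: st + b)
print(final)  # [1, 2, 3, 4, 5]
```

The awkwardness the comment describes is visible even in this toy: the update logic lives inside an opaque closure rather than in a dedicated DStream type.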
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/2562#issuecomment-57083416
Test failure appears to be unrelated to my patch.
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/2562#issuecomment-57087576
@jerryshao Thanks for your comments! I want to abstract an independent
DStream to achieve this. I feel it is weird to update an RDD by passing a
closure. Maybe this
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/2562
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/2574
[SPARK-3719][CORE]: "complete/failed stages" is better to show the total
number of stages
You can merge this pull request into a Git repository by running:
$ git pull https://
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/2574#discussion_r18561760
--- Diff:
core/src/main/scala/org/apache/spark/ui/jobs/JobProgressPage.scala ---
@@ -70,11 +72,11 @@ private[ui] class JobProgressPage(parent
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/2574#discussion_r18622733
--- Diff:
core/src/main/scala/org/apache/spark/ui/jobs/JobProgressPage.scala ---
@@ -70,11 +72,11 @@ private[ui] class JobProgressPage(parent
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/2574#issuecomment-58735669
@JoshRosen Sorry for my misunderstanding, I will correct it as soon as
possible.
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/2679#issuecomment-58885039
@ankurdave I have some doubts, but not about this patch. In the [GraphX OSDI
paper](http://ankurdave.com/dl/graphx-osdi14.pdf), I find that you have
implemented a "m
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/2679#issuecomment-59037906
@ankurdave I see. And I think it is worthwhile to provide a memory-based
shuffle manager in some cases, such as sufficient memory resources or stringent
performance requirement
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-76650327
@JoshRosen Could you please take a look again, thank you.
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/4135
GitHub user uncleGen reopened a pull request:
https://github.com/apache/spark/pull/4135
[SPARK-5205][Streaming]: Inconsistent behaviour between Streaming jobs and
others when clicking the kill link in the WebUI
The "kill" link is used to kill a stage in a job. It works in any kind of
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-76663454
@JoshRosen Could you please take a look again, thank you!
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-77514182
@pwendell Could you please take a look again, thank you!
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-78000658
retest this please
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-78015735
Some timeout errors happened. Could you please take some time to check and
review it again?
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-78201163
retest this please
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-78468550
@JoshRosen, thanks for your patience. It occurred to me that we may check
when to terminate the `receiver` in `ReceiverSupervisor`. Then the condition to
stop the
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/4135
GitHub user uncleGen reopened a pull request:
https://github.com/apache/spark/pull/4135
[SPARK-5205][Streaming]: Inconsistent behaviour between Streaming jobs and
others when clicking the kill link in the WebUI
The "kill" link is used to kill a stage in a job. It works in any kind of
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-78736965
@JoshRosen OK, I will roll back to the original approach and make some
improvements :)
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/2249#issuecomment-81433144
@ankurdave This PR has gone stale. Since GraphX has graduated from alpha,
do we need to close this?
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/2249
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/4135
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-82363317
close and resolve the merge conflict.
GitHub user uncleGen reopened a pull request:
https://github.com/apache/spark/pull/4135
[SPARK-5205][Streaming]: Inconsistent behaviour between Streaming jobs and
others when clicking the kill link in the WebUI
The "kill" link is used to kill a stage in a job. It works in any kind of
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-82844745
@JoshRosen Could you please take a look again, thank you!
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-87908164
@JoshRosen Sorry for my laziness. I will update it as soon as possible
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/4135
GitHub user uncleGen reopened a pull request:
https://github.com/apache/spark/pull/4135
[SPARK-5205][Streaming]: Inconsistent behaviour between Streaming jobs and
others when clicking the kill link in the WebUI
The "kill" link is used to kill a stage in a job. It works in any kind of
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-87967820
@JoshRosen Your comments are reasonable, and I have improved the related code
just as you pointed out. For the test suite, I just check whether the state of
`Receiver` and
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/4135
GitHub user uncleGen reopened a pull request:
https://github.com/apache/spark/pull/4135
[SPARK-5205][Streaming]: Inconsistent behaviour between Streaming jobs and
others when clicking the kill link in the WebUI
The "kill" link is used to kill a stage in a job. It works in any kind of
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-88013418
wait for a moment
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-88342325
@JoshRosen Your comments are reasonable, and I have improved the relevant
code just as you pointed out. What's more, I added some unit tests about the
behavior of
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4135#issuecomment-90476533
Ha~llo~, @JoshRosen
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/3763
[SPARK-4920][UI]: the current Spark version in the UI is not prominent.
It is not convenient to see the Spark version. We can keep the same style
as the Spark website.
!https
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/3768
[SPARK-4920][UI]: backport PR-3763 to branch 1.1
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/uncleGen/spark branch-1.1-1223
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/3768#issuecomment-67926060
There are two unrelated test failures.
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/3912
[SPARK-5107][Streaming][Log]: Tricky log info for the start of Receiver
The Receiver registers itself whenever it begins to start. But it is tricky
to log the same information. Especially, at the
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/3912#discussion_r22572546
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/scheduler/ReceiverTracker.scala
---
@@ -138,7 +140,17 @@ class ReceiverTracker(ssc
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/3912#discussion_r22572535
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/receiver/ReceiverSupervisorImpl.scala
---
@@ -73,14 +73,16 @@ private[streaming] class
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/3912#discussion_r22575229
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/scheduler/ReceiverTracker.scala
---
@@ -137,8 +141,24 @@ class ReceiverTracker(ssc
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/3930
[SPARK-5131][Streaming][DOC]: There is a discrepancy between the WAL
implementation and the configuration doc.
There is a discrepancy between the WAL implementation and the configuration doc.
You can merge this pull
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/3930#issuecomment-68996611
@srowen oops, I missed that.
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/3912#discussion_r22577352
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/scheduler/ReceiverTracker.scala
---
@@ -137,8 +141,24 @@ class ReceiverTracker(ssc
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/3912
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/3365
[SPARK-4488][PySpark] Add control over map-side aggregation
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/uncleGen/spark master-clean-141119
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/3365
GitHub user uncleGen reopened a pull request:
https://github.com/apache/spark/pull/3365
[SPARK-4488][PySpark] Add control over map-side aggregation
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/uncleGen/spark master-clean
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/3365
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/3366
[SPARK-4488][PySpark] Add control over map-side aggregation
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/uncleGen/spark master-pyspark
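"Map-side aggregation" here means pre-merging values per key inside each map partition so that fewer records cross the shuffle. A Spark-free Python sketch of the idea, where `map_side_combine` and its arguments are illustrative names rather than PySpark APIs:

```python
from collections import defaultdict

def map_side_combine(partition, merge):
    """Pre-aggregate (key, value) pairs within a single map partition.

    Only one merged record per key then needs to be shuffled,
    instead of every raw pair.
    """
    grouped = defaultdict(list)
    for key, value in partition:
        grouped[key].append(value)
    return {key: merge(values) for key, values in grouped.items()}

# Usage: three raw pairs collapse to two shuffled records.
print(map_side_combine([("a", 1), ("b", 2), ("a", 3)], sum))  # {'a': 4, 'b': 2}
```

Skipping the pre-merge can be preferable when keys are mostly unique, since it then buffers data without shrinking it; that trade-off is presumably what this PR wants to expose as a user option.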
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/2574#issuecomment-63651117
@JoshRosen [[SPARK-4168][WebUI]](https://github.com/apache/spark/commit/97a466eca0a629f17e9662ca2b59eeca99142c54)
The patch solved the same problem, and I will close
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/2574
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/2249#issuecomment-63654065
@ankurdave Hi, can you review it again? Thank you!
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/3366#issuecomment-63831692
@davies Could you help review this patch? Thank you!
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4253#issuecomment-72421951
@jkbradley IMHO, we do not need to override `isCheckpointed()` in
EdgeRDDImpl and VertexRDDImpl; we can just define a normal `isCheckpointed()`.
IIUC, the func
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/4522
[SPARK-5732][CORE]: Add an option to print the Spark version in the spark scripts.
Naturally, we need an option to print the Spark version in the spark
scripts. It is pretty common in script tool
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/4522#issuecomment-73839883
An unrelated test failure in `DirectKafkaStreamSuite` introduced by
PR #4384
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/4522#discussion_r24490981
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -413,10 +413,13 @@ private[spark] class SparkSubmitArguments(args
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/4522#discussion_r24491713
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -413,10 +413,13 @@ private[spark] class SparkSubmitArguments(args
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/4522#discussion_r24495488
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -413,10 +413,13 @@ private[spark] class SparkSubmitArguments(args
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/1356#issuecomment-48691344
@pwendell yeah, this is not an elegant way to resolve the bug. My fix is a
compromise. Actually, there are no frequent get/put operations in the
blockManager when
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/1356#issuecomment-48691872
@pwendell yeah, it is not an elegant way to resolve the bug. My fix is a
compromise. Actually, it will not cause frequent put/get operations in the
blockManager when
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/1429
Bug Fix: In yarn-cluster mode, ApplicationMaster does not clean up correctly
at the end of the job if users call sc.stop manually. There are two
minor bugs:
1. At the end of a job
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/1429#issuecomment-49751553
@tgravescs ok, and any suggestions?
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/1696
Spark Shuffle: use growth rate to predict whether to spill
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/uncleGen/spark master
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/1696#issuecomment-50958278
@rxin Thanks for your attention, I have updated my JIRA:
https://issues.apache.org/jira/browse/SPARK-2773
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/4135
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/10023
[SPARK-12031][Core][BUG]: Integer overflow when doing sampling
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/uncleGen/spark 1.6-bugfix
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/10023#issuecomment-160263743
Jenkins, test this please
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/10023#issuecomment-160270490
[info] - failing to fetch classes from HTTP server should not leak
resources (SPARK-6209) *** FAILED *** (1 second, 392 milliseconds)
[info
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/10023#discussion_r46105601
--- Diff:
core/src/main/scala/org/apache/spark/util/random/SamplingUtils.scala ---
@@ -52,16 +52,21 @@ private[spark] object SamplingUtils
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/10023#issuecomment-161149946
@mengxr In my case, I sort 10TB of data in 96 partitions. When doing range
partitioning, a "java.lang.IllegalArgumentException: n must be positive" exception
was thrown
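The "n must be positive" failure is consistent with 32-bit integer overflow: Scala's `Int` wraps around, so a sample-size product derived from very large counts can come out negative before it reaches a positivity check. A small Python sketch of the wraparound, where the multiplication is a hypothetical stand-in for whatever expression overflows inside `SamplingUtils`:

```python
def to_int32(n):
    """Mimic Scala/Java 32-bit Int wraparound semantics."""
    n &= 0xFFFFFFFF                                  # keep the low 32 bits
    return n - 0x100000000 if n >= 0x80000000 else n # reinterpret as signed

# Hypothetical sample-size computation: the true product exceeds
# Int.MaxValue (2147483647), so the 32-bit result wraps negative,
# which later trips an "n must be positive" argument check.
records_per_partition = 3_000_000
partitions = 1024
n = to_int32(records_per_partition * partitions)
print(n < 0)  # True: 3_072_000_000 wraps to a negative 32-bit value
```

Doing the arithmetic in 64-bit `Long` (or clamping before the check) avoids the wraparound; the figures above are illustrative, not the exact counts from the report.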
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/10023#issuecomment-161150352
@mengxr sorry, it was 48 executors with 2 cores each, and 1024 partitions.
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/10023#issuecomment-161971911
@srowen @mengxr any problems?
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/16656
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/17463
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/17395
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/17896
[SPARK-20373][SQL][SS] Batch queries with
`Dataset/DataFrame.withWatermark()` do not execute
## What changes were proposed in this pull request?
Any Dataset/DataFrame batch query
Github user uncleGen commented on the issue:
https://github.com/apache/spark/pull/17896
cc @zsxwing and @tdas
Github user uncleGen commented on the issue:
https://github.com/apache/spark/pull/17395
@HyukjinKwon Sorry for the long absence. I will stay online for the next
period of time. Please give me some time.
Github user uncleGen commented on the issue:
https://github.com/apache/spark/pull/17896
retest this please.
Github user uncleGen commented on the issue:
https://github.com/apache/spark/pull/17896
retest this please.
GitHub user uncleGen opened a pull request:
https://github.com/apache/spark/pull/17913
[SPARK-20672][SS] Keep the `isStreaming` property in triggerLogicalPlan in
Structured Streaming
## What changes were proposed in this pull request?
In Structured Streaming, the
Github user uncleGen commented on the issue:
https://github.com/apache/spark/pull/17896
Depends upon:
[SPARK-20672](https://issues.apache.org/jira/browse/SPARK-20672)
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/17913#discussion_r115415483
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/StreamingRelation.scala
---
@@ -64,8 +64,20 @@ case class
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/17896#discussion_r115415668
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
---
@@ -2457,6 +2457,19 @@ object CleanupAliases extends Rule
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/17896#discussion_r115415803
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
---
@@ -2457,6 +2457,19 @@ object CleanupAliases extends Rule
Github user uncleGen commented on the issue:
https://github.com/apache/spark/pull/17913
retest this please.
Github user uncleGen commented on the issue:
https://github.com/apache/spark/pull/17913
cc @zsxwing
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/17913#discussion_r115659132
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/StreamingRelation.scala
---
@@ -48,7 +48,7 @@ case class StreamingRelation
Github user uncleGen commented on a diff in the pull request:
https://github.com/apache/spark/pull/17917#discussion_r115659920
--- Diff:
external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaRelation.scala
---
@@ -143,4 +143,6 @@ private[kafka010] class
Github user uncleGen commented on the issue:
https://github.com/apache/spark/pull/17913
@zsxwing Great! I will close this PR then.
Github user uncleGen closed the pull request at:
https://github.com/apache/spark/pull/17913