pralabhkumar commented on code in PR #36701:
URL: https://github.com/apache/spark/pull/36701#discussion_r888632502
##
python/pyspark/tests/test_shuffle.py:
##
@@ -54,6 +63,49 @@ def test_medium_dataset(self):
self.assertTrue(m.spills >= 1)
self.assertEqual(sum(
mridulm commented on code in PR #36734:
URL: https://github.com/apache/spark/pull/36734#discussion_r888654676
##
core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala:
##
@@ -1885,6 +1885,16 @@ private[spark] class DAGScheduler(
mapOutputTracker.
MaxGekk closed pull request #36708: [SPARK-37623][SQL] Support ANSI Aggregate
Function: regr_intercept
URL: https://github.com/apache/spark/pull/36708
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
MaxGekk commented on PR #36708:
URL: https://github.com/apache/spark/pull/36708#issuecomment-1145630828
+1, LGTM. Merging to master.
Thank you, @beliefer and @cloud-fan for review.
MaxGekk closed pull request #36752: [SPARK-39259][SQL][3.3] Evaluate timestamps
consistently in subqueries
URL: https://github.com/apache/spark/pull/36752
MaxGekk commented on PR #36752:
URL: https://github.com/apache/spark/pull/36752#issuecomment-1145620433
+1, LGTM. Merging to 3.3.
Thank you, @olaky.
xuanyuanking opened a new pull request, #36757:
URL: https://github.com/apache/spark/pull/36757
### What changes were proposed in this pull request?
Compare the 3.3.0 API doc with the latest release version 3.2.1. Fix the
following issues:
* Add missing Since annotation
sadikovi commented on code in PR #36726:
URL: https://github.com/apache/spark/pull/36726#discussion_r887638039
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala:
##
@@ -150,6 +150,9 @@ object JdbcUtils extends Logging with SQLConfHelper {
MaxGekk closed pull request #36714: [SPARK-39320][SQL] Support aggregate
function `MEDIAN`
URL: https://github.com/apache/spark/pull/36714
AmplabJenkins commented on PR #36740:
URL: https://github.com/apache/spark/pull/36740#issuecomment-1145590291
Can one of the admins verify this patch?
AmplabJenkins commented on PR #36741:
URL: https://github.com/apache/spark/pull/36741#issuecomment-1145590268
Can one of the admins verify this patch?
AmplabJenkins commented on PR #36745:
URL: https://github.com/apache/spark/pull/36745#issuecomment-1145590247
Can one of the admins verify this patch?
sadikovi commented on code in PR #36745:
URL: https://github.com/apache/spark/pull/36745#discussion_r888606358
##
sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:
##
@@ -41,21 +41,28 @@ import org.apache.spark.sql.types.{MetadataBuilder
HyukjinKwon commented on code in PR #36660:
URL: https://github.com/apache/spark/pull/36660#discussion_r888603854
##
python/pyspark/pandas/groupby.py:
##
@@ -759,6 +759,99 @@ def skew(scol: Column) -> Column:
bool_to_numeric=True,
)
+# TODO: 'axis', '
sadikovi commented on code in PR #36745:
URL: https://github.com/apache/spark/pull/36745#discussion_r888603145
##
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala:
##
@@ -122,7 +122,7 @@ abstract class SessionCatalogSuite extends Analys
HyukjinKwon commented on code in PR #36660:
URL: https://github.com/apache/spark/pull/36660#discussion_r888603267
##
python/pyspark/pandas/groupby.py:
##
@@ -805,7 +874,7 @@ def all(self, skipna: bool = True) -> FrameLike:
5 False
"""
groupkey_names =
wangyum commented on PR #36750:
URL: https://github.com/apache/spark/pull/36750#issuecomment-1145534550
Merged to master.
wangyum closed pull request #36750: [SPARK-29260][SQL] Support `ALTER DATABASE
SET LOCATION` if HMS supports
URL: https://github.com/apache/spark/pull/36750
HyukjinKwon closed pull request #36736: [SPARK-39351][SQL] SHOW CREATE TABLE
should redact properties
URL: https://github.com/apache/spark/pull/36736
HyukjinKwon commented on PR #36736:
URL: https://github.com/apache/spark/pull/36736#issuecomment-1145530299
Merged to master.
HyukjinKwon opened a new pull request, #36756:
URL: https://github.com/apache/spark/pull/36756
### What changes were proposed in this pull request?
https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark/builds/43740704
AppVeyor build is being failed because of the lack
wangyum commented on code in PR #36755:
URL: https://github.com/apache/spark/pull/36755#discussion_r888564235
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/InjectRuntimeFilter.scala:
##
@@ -288,7 +288,13 @@ object InjectRuntimeFilter extends Rule[Logical
AngersZh commented on PR #36736:
URL: https://github.com/apache/spark/pull/36736#issuecomment-1145528214
ping @cloud-fan
AngersZh commented on code in PR #36754:
URL: https://github.com/apache/spark/pull/36754#discussion_r888563938
##
sql/catalyst/src/main/java/org/apache/spark/sql/util/NumericHistogram.java:
##
@@ -44,10 +44,14 @@
* 4. In Hive's code, the method [[merge()] pass a serializ
sigmod commented on code in PR #36755:
URL: https://github.com/apache/spark/pull/36755#discussion_r888560970
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/InjectRuntimeFilter.scala:
##
@@ -288,7 +288,13 @@ object InjectRuntimeFilter extends Rule[LogicalP
HyukjinKwon commented on PR #36683:
URL: https://github.com/apache/spark/pull/36683#issuecomment-1145517326
Gentle ping for a review :-). I know it has some trade-offs, but I believe
this addresses more common cases and benefits more users.
AmplabJenkins commented on PR #36752:
URL: https://github.com/apache/spark/pull/36752#issuecomment-1145517157
Can one of the admins verify this patch?
AmplabJenkins commented on PR #36753:
URL: https://github.com/apache/spark/pull/36753#issuecomment-1145517135
Can one of the admins verify this patch?
beliefer commented on code in PR #36714:
URL: https://github.com/apache/spark/pull/36714#discussion_r888554738
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/percentiles.scala:
##
@@ -359,6 +359,32 @@ case class Percentile(
)
}
+// scala
wangyum commented on PR #36755:
URL: https://github.com/apache/spark/pull/36755#issuecomment-1145503284
cc @sigmod @cloud-fan
wangyum opened a new pull request, #36755:
URL: https://github.com/apache/spark/pull/36755
### What changes were proposed in this pull request?
This PR moves `RewritePredicateSubquery` into `InjectRuntimeFilter`.
### Why are the changes needed?
Reduce the number of `Rewri
dongjoon-hyun commented on PR #36697:
URL: https://github.com/apache/spark/pull/36697#issuecomment-1145480808
Thank you, @pan3793 , @sunchao , @cloud-fan !
HyukjinKwon commented on PR #36701:
URL: https://github.com/apache/spark/pull/36701#issuecomment-114542
LGTM otherwise.
HyukjinKwon commented on code in PR #36701:
URL: https://github.com/apache/spark/pull/36701#discussion_r888519926
##
python/pyspark/tests/test_shuffle.py:
##
@@ -117,6 +169,37 @@ def legit_merge_combiners(x, y):
m.mergeCombiners(map(lambda x_y1: (x_y1[0], [x_y1[1]])
HyukjinKwon commented on code in PR #36701:
URL: https://github.com/apache/spark/pull/36701#discussion_r888519237
##
python/pyspark/tests/test_shuffle.py:
##
@@ -54,6 +63,49 @@ def test_medium_dataset(self):
self.assertTrue(m.spills >= 1)
self.assertEqual(sum(s
HyukjinKwon closed pull request #36754: [SPARK-39367][DOCS][SQL] Review and fix
issues in Scala/Java API docs of SQL module
URL: https://github.com/apache/spark/pull/36754
HyukjinKwon commented on PR #36754:
URL: https://github.com/apache/spark/pull/36754#issuecomment-1145475718
Merged to master and branch-3.3.
sunchao commented on PR #36750:
URL: https://github.com/apache/spark/pull/36750#issuecomment-1145471603
> Lastly, could you make the PR description up-to-date? For example, the
following seems to need some changes.
>
> > This PR removes the check so that the command works as long as t
github-actions[bot] closed pull request #35329: [SPARK-33326][SQL] Update
Partition statistic parameters after ANALYZE TABLE ... PARTITION()
URL: https://github.com/apache/spark/pull/35329
dongjoon-hyun commented on code in PR #36750:
URL: https://github.com/apache/spark/pull/36750#discussion_r888508050
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala:
##
@@ -355,14 +355,17 @@ private[hive] class HiveClientImpl(
}
override d
HyukjinKwon commented on code in PR #36693:
URL: https://github.com/apache/spark/pull/36693#discussion_r888508847
##
sql/catalyst/src/main/scala/org/apache/spark/sql/AnalysisException.scala:
##
@@ -54,10 +72,34 @@ class AnalysisException protected[sql] (
messageParameters
wangyum commented on code in PR #36750:
URL: https://github.com/apache/spark/pull/36750#discussion_r888507803
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala:
##
@@ -355,14 +355,17 @@ private[hive] class HiveClientImpl(
}
override def alt
viirya commented on code in PR #36750:
URL: https://github.com/apache/spark/pull/36750#discussion_r888506874
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala:
##
@@ -355,14 +355,17 @@ private[hive] class HiveClientImpl(
}
override def alte
viirya commented on code in PR #36750:
URL: https://github.com/apache/spark/pull/36750#discussion_r888506364
##
sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala:
##
@@ -1628,8 +1628,8 @@ object QueryCompilationErrors extends QueryErrorsBase {
dtenedor commented on PR #36745:
URL: https://github.com/apache/spark/pull/36745#issuecomment-1145458556
@sadikovi thanks for your review, these are helpful ideas! Please look again
when you have time.
dtenedor commented on code in PR #36745:
URL: https://github.com/apache/spark/pull/36745#discussion_r888505739
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala:
##
@@ -427,6 +428,7 @@ class SessionCatalog(
tableDefinition.copy(iden
dtenedor commented on code in PR #36745:
URL: https://github.com/apache/spark/pull/36745#discussion_r888505435
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -231,4 +232,18 @@ object ResolveDefaultColumns {
}
dongjoon-hyun commented on PR #36750:
URL: https://github.com/apache/spark/pull/36750#issuecomment-1145456039
Lastly, could you make the PR description up-to-date? For example, the
following?
> This PR removes the check so that the command works as long as the Hive
version used by the HM
sunchao commented on code in PR #36750:
URL: https://github.com/apache/spark/pull/36750#discussion_r888490510
##
sql/hive/src/test/scala/org/apache/spark/sql/hive/client/HiveClientSuite.scala:
##
@@ -165,19 +165,19 @@ class HiveClientSuite(version: String, allVersions:
Seq[Stri
dongjoon-hyun commented on code in PR #36750:
URL: https://github.com/apache/spark/pull/36750#discussion_r888477684
##
sql/hive/src/test/scala/org/apache/spark/sql/hive/client/HiveClientSuite.scala:
##
@@ -165,19 +165,19 @@ class HiveClientSuite(version: String, allVersions:
Se
dongjoon-hyun commented on code in PR #36750:
URL: https://github.com/apache/spark/pull/36750#discussion_r888482892
##
sql/hive/src/test/scala/org/apache/spark/sql/hive/client/HiveClientSuite.scala:
##
@@ -165,19 +165,19 @@ class HiveClientSuite(version: String, allVersions:
Se
sunchao commented on code in PR #36750:
URL: https://github.com/apache/spark/pull/36750#discussion_r888482028
##
sql/hive/src/test/scala/org/apache/spark/sql/hive/client/HiveClientSuite.scala:
##
@@ -165,19 +165,19 @@ class HiveClientSuite(version: String, allVersions:
Seq[Stri
dongjoon-hyun commented on code in PR #36750:
URL: https://github.com/apache/spark/pull/36750#discussion_r888478943
##
sql/hive/src/test/scala/org/apache/spark/sql/hive/client/HiveClientSuite.scala:
##
@@ -165,19 +165,19 @@ class HiveClientSuite(version: String, allVersions:
Se
sunchao commented on code in PR #36750:
URL: https://github.com/apache/spark/pull/36750#discussion_r888477757
##
sql/hive/src/test/scala/org/apache/spark/sql/hive/client/HiveClientSuite.scala:
##
@@ -165,19 +165,19 @@ class HiveClientSuite(version: String, allVersions:
Seq[Stri
dongjoon-hyun commented on code in PR #36750:
URL: https://github.com/apache/spark/pull/36750#discussion_r888476932
##
sql/hive/src/test/scala/org/apache/spark/sql/hive/client/HiveClientSuite.scala:
##
@@ -165,19 +165,19 @@ class HiveClientSuite(version: String, allVersions:
Se
sunchao commented on PR #36750:
URL: https://github.com/apache/spark/pull/36750#issuecomment-1145425002
The `ALTER DATABASE SET LOCATION` command will change the default location
for new tables created afterwards. So in step 2) above, if table location is
not explicitly specified, the new t
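The behavior sunchao describes can be modeled with a small sketch (plain Python, not Spark code; the helper name and paths are hypothetical): a table created without an explicit `LOCATION` inherits the database's default location as of creation time, so `ALTER DATABASE ... SET LOCATION` only affects tables created afterwards.

```python
from posixpath import join
from typing import Optional

def resolve_table_location(db_location: str, table: str,
                           explicit: Optional[str] = None) -> str:
    """Toy model (hypothetical helper, not Spark API): a new table without
    an explicit LOCATION lands under the database's *current* default
    location; an explicit LOCATION always wins."""
    return explicit if explicit is not None else join(db_location, table)

# A table created before ALTER DATABASE ... SET LOCATION keeps the old path:
before = resolve_table_location("/warehouse/db.db", "t1")
# Only tables created after the command pick up the new default location:
after = resolve_table_location("/new_warehouse/db.db", "t2")
# An explicit LOCATION is unaffected either way:
pinned = resolve_table_location("/new_warehouse/db.db", "t3", explicit="/data/t3")
print(before, after, pinned)
```

Existing tables keep whatever location they were created with; the command only rewrites the database's default for future `CREATE TABLE` statements.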
sunchao commented on PR #36750:
URL: https://github.com/apache/spark/pull/36750#issuecomment-1145407388
Fixed. @viirya pls take another look, thanks.
holdenk commented on PR #36434:
URL: https://github.com/apache/spark/pull/36434#issuecomment-1145371331
Update: with the change for increased resilience it passes integration tests
on my machine.
amaliujia commented on code in PR #36586:
URL: https://github.com/apache/spark/pull/36586#discussion_r888383953
##
sql/core/src/test/scala/org/apache/spark/sql/internal/CatalogSuite.scala:
##
@@ -299,15 +313,18 @@ class CatalogSuite extends SharedSparkSession with
AnalysisTest
amaliujia commented on code in PR #36586:
URL: https://github.com/apache/spark/pull/36586#discussion_r888374320
##
sql/core/src/main/scala/org/apache/spark/sql/catalog/interface.scala:
##
@@ -64,12 +64,26 @@ class Database(
@Stable
class Table(
val name: String,
-@Nul
JoshRosen commented on PR #36751:
URL: https://github.com/apache/spark/pull/36751#issuecomment-1145291954
If I recall, I think the original motivation for this "release all locks at
the end of the task" code was to prevent indefinite "pin leaks" if tasks fail
to properly release locks (e.g.
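The "release all locks at the end of the task" safety net JoshRosen describes can be sketched as a toy registry (pure Python with hypothetical names; the real logic lives in Spark's `BlockInfoManager`): track which blocks each task has pinned, then force-release everything the task still holds when it ends, so a task that dies without unlocking cannot leak pins indefinitely.

```python
from collections import defaultdict

class ToyBlockLockRegistry:
    """Toy model of per-task block lock tracking (hypothetical sketch,
    not Spark's implementation). Records which blocks each task holds
    so they can all be dropped at task end."""

    def __init__(self):
        self._locks_by_task = defaultdict(set)

    def lock(self, task_id, block_id):
        self._locks_by_task[task_id].add(block_id)

    def unlock(self, task_id, block_id):
        self._locks_by_task[task_id].discard(block_id)

    def release_all_for_task(self, task_id):
        # The safety net: drop every lock the task still holds, preventing
        # indefinite "pin leaks" from tasks that fail before unlocking.
        return self._locks_by_task.pop(task_id, set())

reg = ToyBlockLockRegistry()
reg.lock(1, "rdd_0_0")
reg.lock(1, "rdd_0_1")
reg.unlock(1, "rdd_0_0")
# Task 1 ends without ever unlocking rdd_0_1 -- the registry reclaims it:
leaked = reg.release_all_for_task(1)
print(sorted(leaked))
```

The trade-off under discussion in #36751 is whether write locks should also be swept by this blanket release, or whether doing so hides bugs that should surface instead.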
gengliangwang commented on code in PR #36754:
URL: https://github.com/apache/spark/pull/36754#discussion_r888368841
##
sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala:
##
@@ -46,7 +46,7 @@ import org.apache.spark.sql.types._
* As commands a
gengliangwang commented on code in PR #36754:
URL: https://github.com/apache/spark/pull/36754#discussion_r888303216
##
sql/catalyst/src/main/java/org/apache/spark/sql/util/NumericHistogram.java:
##
@@ -44,10 +44,14 @@
* 4. In Hive's code, the method [[merge()] pass a seriali
gengliangwang opened a new pull request, #36754:
URL: https://github.com/apache/spark/pull/36754
### What changes were proposed in this pull request?
Compare the 3.3.0 API doc with the latest release version 3.2.1. Fix the
following issues:
* Add missing Since annotatio
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r888299878
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -655,6 +744,156 @@ public void registerExecutor(String app
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r888299709
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -655,6 +744,156 @@ public void registerExecutor(String app
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r888299473
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -655,6 +744,156 @@ public void registerExecutor(String app
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r888299188
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -992,6 +1233,45 @@ AppShufflePartitionInfo getPartitionInf
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r888298796
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -576,6 +661,7 @@ public MergeStatuses
finalizeShuffleMerg
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r888298391
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -576,6 +661,7 @@ public MergeStatuses
finalizeShuffleMerg
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r888298248
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -209,9 +246,16 @@ private AppShufflePartitionInfo
getOrCr
zhouyejoe commented on PR #35906:
URL: https://github.com/apache/spark/pull/35906#issuecomment-1145216328
>
Added a flag in closeAndDeletePartitionFilesIfNeeded to check whether DB
cleanup is needed or not.
MaxGekk commented on code in PR #36714:
URL: https://github.com/apache/spark/pull/36714#discussion_r888289663
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/percentiles.scala:
##
@@ -359,6 +359,32 @@ case class Percentile(
)
}
+// scalas
olaky opened a new pull request, #36753:
URL: https://github.com/apache/spark/pull/36753
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How was t
olaky opened a new pull request, #36752:
URL: https://github.com/apache/spark/pull/36752
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How was t
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r888286186
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -536,9 +619,11 @@ public MergeStatuses
finalizeShuffleMer
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r888284976
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -342,6 +389,29 @@ void closeAndDeletePartitionFilesIfNeede
dtenedor commented on PR #36672:
URL: https://github.com/apache/spark/pull/36672#issuecomment-1145200667
@HyukjinKwon the CI passes now :)
MaxGekk closed pull request #36749: [SPARK-39295][DOCS][PYTHON][3.3] Improve
documentation of pandas API supported list
URL: https://github.com/apache/spark/pull/36749
MaxGekk commented on PR #36749:
URL: https://github.com/apache/spark/pull/36749#issuecomment-1145199197
+1, LGTM. Merging to 3.3.
Thank you, @beobest2 and @HyukjinKwon @Yikun for review.
MaxGekk commented on PR #36654:
URL: https://github.com/apache/spark/pull/36654#issuecomment-1145197543
@olaky Could you open separate PRs with backports to branch-3.3 and
branch-3.2 (according to SPARK-39259, 3.2 has this issue).
Congratulations on the first contribution to Apach
MaxGekk closed pull request #36654: [SPARK-39259][SQL] Evaluate timestamps
consistently in subqueries
URL: https://github.com/apache/spark/pull/36654
MaxGekk commented on PR #36654:
URL: https://github.com/apache/spark/pull/36654#issuecomment-1145192594
+1, LGTM. Merging to master, 3.3, 3.2.
Thank you, @olaky.
akpatnam25 commented on code in PR #36734:
URL: https://github.com/apache/spark/pull/36734#discussion_r888273505
##
core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala:
##
@@ -4402,12 +4501,20 @@ object DAGSchedulerSuite {
def makeMapStatus(host: String, re
viirya commented on PR #36750:
URL: https://github.com/apache/spark/pull/36750#issuecomment-1145188518
`AlterNamespaceSetLocationSuite` seems failed.
attilapiros commented on code in PR #36512:
URL: https://github.com/apache/spark/pull/36512#discussion_r888265229
##
core/src/main/scala/org/apache/spark/storage/BlockManager.scala:
##
@@ -933,46 +935,56 @@ private[spark] class BlockManager(
})
Some(new Blo
otterc commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r888251830
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -342,6 +389,29 @@ void closeAndDeletePartitionFilesIfNeeded(
otterc commented on code in PR #36734:
URL: https://github.com/apache/spark/pull/36734#discussion_r888231614
##
core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala:
##
@@ -4402,12 +4501,20 @@ object DAGSchedulerSuite {
def makeMapStatus(host: String, reduce
hvanhovell commented on PR #36751:
URL: https://github.com/apache/spark/pull/36751#issuecomment-1145135863
This is still a WIP. If we think this is the right thing to do, then I will
add some tests.
hvanhovell opened a new pull request, #36751:
URL: https://github.com/apache/spark/pull/36751
### What changes were proposed in this pull request?
This PR removes the unlocking of write locks on task end from the
`BlockInfoManager`.
### Why are the changes needed?
The `BlockInfo
dongjoon-hyun commented on PR #36750:
URL: https://github.com/apache/spark/pull/36750#issuecomment-1145110352
Thank you for pinging me, @sunchao