wangyum commented on code in PR #39512:
URL: https://github.com/apache/spark/pull/39512#discussion_r1067797166
##
sql/core/src/test/scala/org/apache/spark/sql/CachedTableSuite.scala:
##
@@ -541,7 +541,8 @@ class CachedTableSuite extends QueryTest with SQLTestUtils
viirya commented on PR #39508:
URL: https://github.com/apache/spark/pull/39508#issuecomment-1379904716
There are some conflicts.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
mridulm commented on PR #37638:
URL: https://github.com/apache/spark/pull/37638#issuecomment-1379889561
Merged to master.
Thanks for working on this @rmcyang!
Thanks for the reviews @zhouyejoe, @otterc :-)
asfgit closed pull request #37638: [SPARK-33573][SHUFFLE][YARN] Shuffle server
side metrics for Push-based shuffle
URL: https://github.com/apache/spark/pull/37638
techaddict commented on PR #39451:
URL: https://github.com/apache/spark/pull/39451#issuecomment-1379881170
Cc @amaliujia @HyukjinKwon @zhengruifeng can you review this PR? I think
it's a straightforward change.
HyukjinKwon closed pull request #39522: [SPARK-41998][CONNECT][TESTS] Reuse
pyspark.sql.tests.test_readwriter test cases
URL: https://github.com/apache/spark/pull/39522
HyukjinKwon closed pull request #39521:
[SPARK-41887][CONNECT][TESTS][FOLLOW-UP] Enable test_extended_hint_types test
case
URL: https://github.com/apache/spark/pull/39521
HyukjinKwon commented on PR #39522:
URL: https://github.com/apache/spark/pull/39522#issuecomment-1379874535
Merged to master.
HyukjinKwon commented on PR #39521:
URL: https://github.com/apache/spark/pull/39521#issuecomment-1379874451
Merged to master.
LuciferYang commented on PR #39435:
URL: https://github.com/apache/spark/pull/39435#issuecomment-1379870165
Manually checked the core module with this PR as follows:
```
gh pr checkout 39435
export LIVE_UI_LOCAL_STORE_DIR=/Users/yangjie01/SourceCode/spark-ui
build/mvn clean
```
kuwii commented on PR #39190:
URL: https://github.com/apache/spark/pull/39190#issuecomment-1379868894
> Hi. this impacts Jobs API so this is a user facing change right?
@VindhyaG Thanks for the comment. I've updated the PR description.
LuciferYang commented on PR #39487:
URL: https://github.com/apache/spark/pull/39487#issuecomment-1379868522
Thanks @gengliangwang
gengliangwang closed pull request #39487: [SPARK-41968][CORE][SQL] Refactor
`ProtobufSerDe` to `ProtobufSerDe[T]`
URL: https://github.com/apache/spark/pull/39487
gengliangwang commented on PR #39487:
URL: https://github.com/apache/spark/pull/39487#issuecomment-1379867822
@LuciferYang Thanks for the work.
Merging to master
mridulm commented on code in PR #37922:
URL: https://github.com/apache/spark/pull/37922#discussion_r1067741652
##
core/src/main/scala/org/apache/spark/storage/BlockManagerMasterEndpoint.scala:
##
@@ -321,6 +321,12 @@ class BlockManagerMasterEndpoint(
}
private def
gengliangwang commented on code in PR #39487:
URL: https://github.com/apache/spark/pull/39487#discussion_r1067741516
##
core/src/main/scala/org/apache/spark/status/protobuf/KVStoreProtobufSerializer.scala:
##
@@ -40,10 +41,16 @@ private[spark] class KVStoreProtobufSerializer
HyukjinKwon opened a new pull request, #39529:
URL: https://github.com/apache/spark/pull/39529
### What changes were proposed in this pull request?
This PR reuses PySpark `pyspark.sql.tests.test_types` tests in Spark Connect
that pass for now.
### Why are the changes needed?
mridulm commented on code in PR #37922:
URL: https://github.com/apache/spark/pull/37922#discussion_r1067733179
##
core/src/main/scala/org/apache/spark/storage/BlockManagerMasterEndpoint.scala:
##
@@ -321,6 +321,12 @@ class BlockManagerMasterEndpoint(
}
private def
HyukjinKwon opened a new pull request, #39528:
URL: https://github.com/apache/spark/pull/39528
### What changes were proposed in this pull request?
This PR reuses PySpark `pyspark.sql.tests.test_column` tests in Spark
Connect that pass for now.
### Why are the changes needed?
HyukjinKwon opened a new pull request, #39527:
URL: https://github.com/apache/spark/pull/39527
### What changes were proposed in this pull request?
This PR reuses PySpark `pyspark.sql.tests.test_serde` tests in Spark Connect
that pass for now.
### Why are the changes needed?
HyukjinKwon opened a new pull request, #39526:
URL: https://github.com/apache/spark/pull/39526
### What changes were proposed in this pull request?
This PR reuses PySpark `pyspark.sql.tests.test_datasources` tests in Spark
Connect that pass for now.
### Why are the changes
HyukjinKwon opened a new pull request, #39525:
URL: https://github.com/apache/spark/pull/39525
### What changes were proposed in this pull request?
This PR reuses PySpark `pyspark.sql.tests.test_group` tests in Spark Connect
that pass for now.
### Why are the changes needed?
wankunde commented on code in PR #37922:
URL: https://github.com/apache/spark/pull/37922#discussion_r1067694549
##
core/src/main/scala/org/apache/spark/storage/BlockManagerMasterEndpoint.scala:
##
@@ -321,6 +321,12 @@ class BlockManagerMasterEndpoint(
}
private def
cloud-fan commented on code in PR #39037:
URL: https://github.com/apache/spark/pull/39037#discussion_r1067691706
##
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala:
##
@@ -2693,6 +2694,21 @@ class AdaptiveQueryExecSuite
HeartSaVioR closed pull request #39520: [SPARK-41996][SQL][SS] Fix kafka test
to verify lost partitions to account for slow Kafka operations
URL: https://github.com/apache/spark/pull/39520
HeartSaVioR commented on PR #39520:
URL: https://github.com/apache/spark/pull/39520#issuecomment-1379775649
Thanks! Merging to master.
itholic commented on PR #39496:
URL: https://github.com/apache/spark/pull/39496#issuecomment-1379771335
@srielau Yeah, that makes sense. Just created [OSS
ticket](https://issues.apache.org/jira/browse/SPARK-42004) to handle this.
Thanks!!
cloud-fan commented on code in PR #39517:
URL: https://github.com/apache/spark/pull/39517#discussion_r1067668533
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/encoders/AgnosticEncoder.scala:
##
@@ -46,35 +46,42 @@ object AgnosticEncoders {
override val
cloud-fan commented on code in PR #39523:
URL: https://github.com/apache/spark/pull/39523#discussion_r1067665831
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveGroupByAll.scala:
##
@@ -47,25 +47,40 @@ object ResolveGroupByAll extends
gengliangwang commented on code in PR #39509:
URL: https://github.com/apache/spark/pull/39509#discussion_r1067663382
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveGroupByAll.scala:
##
@@ -93,8 +93,9 @@ object ResolveGroupByAll extends
panbingkun opened a new pull request, #39524:
URL: https://github.com/apache/spark/pull/39524
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch
gengliangwang opened a new pull request, #39523:
URL: https://github.com/apache/spark/pull/39523
### What changes were proposed in this pull request?
Reduce duplicate code in ResolveGroupByAll by moving the group by expression
inference into a new method.
### Why are
rangareddy commented on PR #39515:
URL: https://github.com/apache/spark/pull/39515#issuecomment-1379755267
Hi @maxgekk
Could you please review this PR?
cloud-fan commented on code in PR #39517:
URL: https://github.com/apache/spark/pull/39517#discussion_r1067657414
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala:
##
@@ -377,27 +408,96 @@ object ScalaReflection extends ScalaReflection {
cloud-fan commented on code in PR #39517:
URL: https://github.com/apache/spark/pull/39517#discussion_r1067657104
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala:
##
@@ -155,11 +169,19 @@ object ScalaReflection extends ScalaReflection {
cloud-fan commented on code in PR #39517:
URL: https://github.com/apache/spark/pull/39517#discussion_r1067656569
##
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/encoders/RowEncoderSuite.scala:
##
@@ -125,7 +125,7 @@ class RowEncoderSuite extends
cloud-fan commented on code in PR #39512:
URL: https://github.com/apache/spark/pull/39512#discussion_r1067653257
##
sql/core/src/main/scala/org/apache/spark/sql/execution/exchange/EnsureRequirements.scala:
##
@@ -76,13 +76,17 @@ case class EnsureRequirements(
case _ =>
cloud-fan commented on code in PR #39512:
URL: https://github.com/apache/spark/pull/39512#discussion_r1067652958
##
sql/core/src/test/scala/org/apache/spark/sql/CachedTableSuite.scala:
##
@@ -541,7 +541,8 @@ class CachedTableSuite extends QueryTest with SQLTestUtils
LuciferYang commented on code in PR #39487:
URL: https://github.com/apache/spark/pull/39487#discussion_r1067649883
##
core/src/main/scala/org/apache/spark/status/protobuf/KVStoreProtobufSerializer.scala:
##
@@ -40,10 +41,16 @@ private[spark] class KVStoreProtobufSerializer
HyukjinKwon opened a new pull request, #39522:
URL: https://github.com/apache/spark/pull/39522
### What changes were proposed in this pull request?
This PR reuses PySpark `pyspark.sql.tests.test_readwriter` tests in Spark
Connect that pass for now.
### Why are the changes
LuciferYang commented on code in PR #39487:
URL: https://github.com/apache/spark/pull/39487#discussion_r1067650529
##
core/src/main/scala/org/apache/spark/status/protobuf/KVStoreProtobufSerializer.scala:
##
@@ -40,10 +41,16 @@ private[spark] class KVStoreProtobufSerializer
LuciferYang commented on code in PR #39487:
URL: https://github.com/apache/spark/pull/39487#discussion_r1067647694
##
core/src/main/scala/org/apache/spark/status/protobuf/KVStoreProtobufSerializer.scala:
##
@@ -40,10 +41,16 @@ private[spark] class KVStoreProtobufSerializer
itholic commented on PR #39505:
URL: https://github.com/apache/spark/pull/39505#issuecomment-1379739624
Let me fix the related tests as well while we're here.
Will update the PR description.
rmcyang commented on code in PR #37638:
URL: https://github.com/apache/spark/pull/37638#discussion_r1067643931
##
docs/monitoring.md:
##
@@ -1421,6 +1421,21 @@ Note: applies to the shuffle service
- shuffle-server.usedDirectMemory
- shuffle-server.usedHeapMemory
+Note:
mridulm commented on code in PR #37638:
URL: https://github.com/apache/spark/pull/37638#discussion_r1067641509
##
docs/monitoring.md:
##
@@ -1421,6 +1421,21 @@ Note: applies to the shuffle service
- shuffle-server.usedDirectMemory
- shuffle-server.usedHeapMemory
+Note:
cloud-fan commented on code in PR #39479:
URL: https://github.com/apache/spark/pull/39479#discussion_r1067639729
##
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/OptimizeOneRowRelationSubquerySuite.scala:
##
@@ -177,4 +177,27 @@ class
HyukjinKwon opened a new pull request, #39521:
URL: https://github.com/apache/spark/pull/39521
### What changes were proposed in this pull request?
This PR is a followup of https://github.com/apache/spark/pull/39491 that
enables `test_extended_hint_types` test back by avoiding `_jdf`
cloud-fan commented on code in PR #39479:
URL: https://github.com/apache/spark/pull/39479#discussion_r1067638085
##
sql/core/src/test/resources/sql-tests/inputs/join-lateral.sql:
##
@@ -177,6 +177,25 @@ SELECT * FROM t3 JOIN LATERAL (SELECT EXPLODE_OUTER(c2));
SELECT * FROM t3
mridulm commented on code in PR #37922:
URL: https://github.com/apache/spark/pull/37922#discussion_r1067637314
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalBlockStoreClient.java:
##
@@ -256,6 +256,22 @@ public void onFailure(Throwable e) {
cloud-fan commented on code in PR #39479:
URL: https://github.com/apache/spark/pull/39479#discussion_r1067637188
##
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/OptimizeOneRowRelationSubquerySuite.scala:
##
@@ -177,4 +177,27 @@ class
anishshri-db commented on PR #39520:
URL: https://github.com/apache/spark/pull/39520#issuecomment-1379722527
@HeartSaVioR - please take a look. Thx
cloud-fan closed pull request #39509: [SPARK-41635][SQL] Fix group by all error
reporting
URL: https://github.com/apache/spark/pull/39509
cloud-fan commented on PR #39509:
URL: https://github.com/apache/spark/pull/39509#issuecomment-1379720548
thanks for review, merging to master!
cloud-fan commented on code in PR #39509:
URL: https://github.com/apache/spark/pull/39509#discussion_r1067633765
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveGroupByAll.scala:
##
@@ -93,8 +93,9 @@ object ResolveGroupByAll extends
anishshri-db opened a new pull request, #39520:
URL: https://github.com/apache/spark/pull/39520
### What changes were proposed in this pull request?
Fix kafka test to verify lost partitions to account for slow Kafka operations
Basically, it's possible that Kafka operations around
gengliangwang commented on PR #39435:
URL: https://github.com/apache/spark/pull/39435#issuecomment-1379718544
cc @LuciferYang @panbingkun @techaddict as well.
I tried hard-coding a RocksDB backend path before commit
HyukjinKwon closed pull request #39500: [SPARK-41980][CONNECT][TESTS] Enable
test_functions_broadcast in functions parity test
URL: https://github.com/apache/spark/pull/39500
HyukjinKwon commented on PR #39500:
URL: https://github.com/apache/spark/pull/39500#issuecomment-1379696692
All related tests passed.
Merged to master.
eric-maynard commented on PR #39519:
URL: https://github.com/apache/spark/pull/39519#issuecomment-1379685811
I see. Where should we make the change to add this to the other APIs? It looks
like SchemaOfJson is used under the hood by the Scala API.
erenavsarogullari commented on PR #39037:
URL: https://github.com/apache/spark/pull/39037#issuecomment-1379684356
Thanks @ulysses-you for this fix.
wangyum commented on code in PR #39512:
URL: https://github.com/apache/spark/pull/39512#discussion_r1067605737
##
sql/core/src/main/scala/org/apache/spark/sql/execution/exchange/EnsureRequirements.scala:
##
@@ -76,13 +76,17 @@ case class EnsureRequirements(
case _ =>
HyukjinKwon commented on PR #39499:
URL: https://github.com/apache/spark/pull/39499#issuecomment-1379677495
Yeah, so technically it should only take `int`s as that's what the method
wants. There are a lot of cases like that in PySpark (e.g.,
`DataFrameReader.jdbc`), and a lot of
zhengruifeng commented on PR #39499:
URL: https://github.com/apache/spark/pull/39499#issuecomment-1379668005
@dongjoon-hyun
good question. `range` in Connect has the same signature as PySpark's,
which should only accept integers.
But PySpark's implementation doesn't
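The signature-level constraint being discussed can be sketched in plain Python. This is a hypothetical validator for illustration only, not PySpark's actual code:

```python
def validate_range_args(start, end=None, step=1):
    """Accept only real ints for range-style arguments.

    Hypothetical helper mirroring the constraint discussed above; bool is
    rejected explicitly because it subclasses int in Python.
    """
    checks = [("start", start), ("step", step)]
    if end is not None:
        checks.append(("end", end))
    for name, value in checks:
        if not isinstance(value, int) or isinstance(value, bool):
            raise TypeError(
                f"{name} must be an int, got {type(value).__name__}")
    # With a single argument, treat it as the exclusive end, like range(n).
    if end is None:
        start, end = 0, start
    return start, end, step
```

A caller passing `1.5` or `True` would get a `TypeError` up front instead of an implementation-dependent behavior later.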
zhouyejoe commented on code in PR #37638:
URL: https://github.com/apache/spark/pull/37638#discussion_r1067596021
##
docs/monitoring.md:
##
@@ -1421,6 +1421,21 @@ Note: applies to the shuffle service
- shuffle-server.usedDirectMemory
- shuffle-server.usedHeapMemory
+Note:
HyukjinKwon commented on PR #39518:
URL: https://github.com/apache/spark/pull/39518#issuecomment-1379655842
cc @rednaxelafx @cloud-fan FYI
HyukjinKwon commented on PR #39516:
URL: https://github.com/apache/spark/pull/39516#issuecomment-1379647638
Thanks for the fix @soxofaan. I think it was a mistake, cc @itholic
HyukjinKwon commented on code in PR #39516:
URL: https://github.com/apache/spark/pull/39516#discussion_r1067585079
##
python/pyspark/pandas/__init__.py:
##
@@ -49,7 +49,8 @@
):
import logging
-logging.warning(
+logger = logging.getLogger(__name__)
+
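The diff above swaps a call on the root logger for a named, module-level logger. The general pattern looks like this (a generic sketch, not the exact pyspark.pandas code):

```python
import logging

# A module-level logger named after the module, instead of calling
# logging.warning(...) directly, which goes through the root logger.
# A named logger lets downstream users silence or reroute these messages
# via logging.getLogger("<module name>").
logger = logging.getLogger(__name__)

def warn_deprecated(message: str) -> None:
    # All warnings from this module flow through the shared module logger.
    logger.warning(message)
```

The design benefit: applications embedding the library can configure that one logger (level, handlers, propagation) without touching global logging state.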
HyukjinKwon commented on PR #39519:
URL: https://github.com/apache/spark/pull/39519#issuecomment-1379641861
Would be great if we can add it to the Scala, Python, and R APIs. I don't mind
doing that in a separate PR.
HyukjinKwon commented on code in PR #39519:
URL: https://github.com/apache/spark/pull/39519#discussion_r1067581392
##
sql/core/src/test/scala/org/apache/spark/sql/JsonFunctionsSuite.scala:
##
@@ -583,6 +583,17 @@ class JsonFunctionsSuite extends QueryTest with
HyukjinKwon commented on code in PR #39519:
URL: https://github.com/apache/spark/pull/39519#discussion_r1067580794
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/jsonExpressions.scala:
##
@@ -793,27 +796,17 @@ case class SchemaOfJson(
@transient
rithwik-db commented on code in PR #39267:
URL: https://github.com/apache/spark/pull/39267#discussion_r1067576354
##
python/pyspark/ml/torch/distributor.py:
##
@@ -407,13 +418,6 @@ def _run_local_training(
try:
if self.use_gpu:
gpus_owned
HyukjinKwon closed pull request #39188: [SPARK-41591][PYTHON][ML] Training
PyTorch Files on Single Node Multi GPU
URL: https://github.com/apache/spark/pull/39188
HyukjinKwon commented on PR #39188:
URL: https://github.com/apache/spark/pull/39188#issuecomment-1379623516
Merged to master.
gengliangwang commented on code in PR #39487:
URL: https://github.com/apache/spark/pull/39487#discussion_r1067558519
##
core/src/main/scala/org/apache/spark/status/protobuf/KVStoreProtobufSerializer.scala:
##
@@ -40,10 +41,16 @@ private[spark] class KVStoreProtobufSerializer
gengliangwang commented on code in PR #39487:
URL: https://github.com/apache/spark/pull/39487#discussion_r1067558195
##
core/src/main/scala/org/apache/spark/status/protobuf/KVStoreProtobufSerializer.scala:
##
@@ -40,10 +41,16 @@ private[spark] class KVStoreProtobufSerializer
eric-maynard opened a new pull request, #39519:
URL: https://github.com/apache/spark/pull/39519
### What changes were proposed in this pull request?
Presently, only foldable expressions can be passed in to schema_of_json,
e.g. `SCHEMA_OF_JSON(CONCAT('', ''))`.
With this change, we
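For readers outside Catalyst: "foldable" means the expression can be reduced to a constant before execution, which is why `SCHEMA_OF_JSON(CONCAT('', ''))` is accepted today. A toy model of that rule (illustrative classes, not Spark's actual ones):

```python
class Expr:
    """Minimal expression node with a Catalyst-style foldable flag."""
    foldable = False

class Literal(Expr):
    # A literal is always a compile-time constant.
    foldable = True
    def __init__(self, value):
        self.value = value

class Concat(Expr):
    def __init__(self, *children):
        self.children = children
    @property
    def foldable(self):
        # Concat of constants is itself a constant.
        return all(c.foldable for c in self.children)

class ColumnRef(Expr):
    # A column reference depends on row data, so it can never fold.
    foldable = False

def accepts_schema_of_json(expr):
    # Mirrors the pre-change restriction: only foldable input is allowed.
    return expr.foldable
```

Under this model, `CONCAT` of two literals folds to a constant and is accepted, while any expression touching a column would be rejected by the current check.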
mridulm commented on code in PR #37922:
URL: https://github.com/apache/spark/pull/37922#discussion_r1067554329
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/NoOpMergedShuffleFileManager.java:
##
@@ -84,4 +85,9 @@ public MergedBlockMeta
mridulm commented on code in PR #37922:
URL: https://github.com/apache/spark/pull/37922#discussion_r1067551485
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalBlockStoreClient.java:
##
@@ -256,6 +256,22 @@ public void onFailure(Throwable e) {
mridulm commented on code in PR #37922:
URL: https://github.com/apache/spark/pull/37922#discussion_r1067548067
##
core/src/main/scala/org/apache/spark/storage/BlockManagerMasterEndpoint.scala:
##
@@ -321,6 +321,12 @@ class BlockManagerMasterEndpoint(
}
private def
allisonwang-db commented on PR #39479:
URL: https://github.com/apache/spark/pull/39479#issuecomment-1379590363
cc @cloud-fan
rmcyang commented on code in PR #37638:
URL: https://github.com/apache/spark/pull/37638#discussion_r1067536500
##
docs/monitoring.md:
##
@@ -1421,6 +1421,21 @@ Note: applies to the shuffle service
- shuffle-server.usedDirectMemory
- shuffle-server.usedHeapMemory
+Note:
rmcyang commented on code in PR #37638:
URL: https://github.com/apache/spark/pull/37638#discussion_r1067534717
##
docs/monitoring.md:
##
@@ -1421,6 +1421,21 @@ Note: applies to the shuffle service
- shuffle-server.usedDirectMemory
- shuffle-server.usedHeapMemory
+Note:
srowen closed pull request #39511: [SPARK-41047][SQL] Improve docs for round
URL: https://github.com/apache/spark/pull/39511
panbingkun commented on PR #39511:
URL: https://github.com/apache/spark/pull/39511#issuecomment-1379568848
Done
mridulm commented on code in PR #37638:
URL: https://github.com/apache/spark/pull/37638#discussion_r1067505986
##
docs/monitoring.md:
##
@@ -1421,6 +1421,21 @@ Note: applies to the shuffle service
- shuffle-server.usedDirectMemory
- shuffle-server.usedHeapMemory
+Note:
akpatnam25 commented on code in PR #38959:
URL: https://github.com/apache/spark/pull/38959#discussion_r1067498072
##
core/src/main/scala/org/apache/spark/shuffle/ShuffleBlockPusher.scala:
##
@@ -251,6 +251,10 @@ private[spark] class ShuffleBlockPusher(conf: SparkConf)
extends
bersprockets opened a new pull request, #39518:
URL: https://github.com/apache/spark/pull/39518
### What changes were proposed in this pull request?
Change `CheckOverflowInTableInsert` to accept a `Cast` wrapped by an
`ExpressionProxy` as a child.
### Why are the changes
hvanhovell commented on code in PR #39517:
URL: https://github.com/apache/spark/pull/39517#discussion_r1067435036
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala:
##
@@ -377,27 +408,96 @@ object ScalaReflection extends ScalaReflection {
hvanhovell commented on PR #39517:
URL: https://github.com/apache/spark/pull/39517#issuecomment-1379453290
A note for the reviewers. I know that Catalyst tests pass. I have not run
other tests, so there might still be a few things to iron out.
hvanhovell commented on code in PR #39517:
URL: https://github.com/apache/spark/pull/39517#discussion_r1067433716
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala:
##
@@ -306,7 +330,7 @@ object ScalaReflection extends ScalaReflection {
*