grundprinzip commented on code in PR #39016:
URL: https://github.com/apache/spark/pull/39016#discussion_r1045017515
##
python/pyspark/sql/connect/column.py:
##
@@ -180,7 +180,12 @@ def to_plan(self, session: "SparkConnectClient") -> "proto.Expression":
elif
MaxGekk commented on PR #38972:
URL: https://github.com/apache/spark/pull/38972#issuecomment-1345173649
@panbingkun Could you fix the test failure? It seems to be related to your
changes:
```
sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException:
"[COLUMN_NOT_FOUND] The
MaxGekk commented on code in PR #38998:
URL: https://github.com/apache/spark/pull/38998#discussion_r1045013373
##
core/src/test/scala/org/apache/spark/SparkThrowableSuite.scala:
##
@@ -147,6 +147,18 @@ class SparkThrowableSuite extends SparkFunSuite {
MaxGekk closed pull request #38954: [SPARK-41417][CORE][SQL] Rename
`_LEGACY_ERROR_TEMP_0019` to `INVALID_TYPED_LITERAL`
URL: https://github.com/apache/spark/pull/38954
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
MaxGekk commented on PR #38954:
URL: https://github.com/apache/spark/pull/38954#issuecomment-1345157464
+1, LGTM. Merging to master.
Thank you, @LuciferYang.
AmplabJenkins commented on PR #39012:
URL: https://github.com/apache/spark/pull/39012#issuecomment-1345155848
Can one of the admins verify this patch?
AmplabJenkins commented on PR #39013:
URL: https://github.com/apache/spark/pull/39013#issuecomment-1345155835
Can one of the admins verify this patch?
dongjoon-hyun commented on PR #39005:
URL: https://github.com/apache/spark/pull/39005#issuecomment-1345151502
All tests passed. Merged to master. Thank you, @panbingkun and all.
dongjoon-hyun closed pull request #39005: [SPARK-41467][BUILD] Upgrade
httpclient from 4.5.13 to 4.5.14
URL: https://github.com/apache/spark/pull/39005
dongjoon-hyun commented on PR #39015:
URL: https://github.com/apache/spark/pull/39015#issuecomment-1345151056
Thank you, @HyukjinKwon !
zhengruifeng opened a new pull request, #39016:
URL: https://github.com/apache/spark/pull/39016
### What changes were proposed in this pull request?
check the bounds of integer and choose the correct datatypes
### Why are the changes needed?
to match pyspark:
```
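The bounds check described in this PR can be sketched as follows. This is a hypothetical illustration only (the constant names and the function are mine, not the PR's actual code), showing how a Python int literal might be mapped to the narrowest matching integral datatype:

```python
# Hypothetical sketch of the bounds check proposed in PR #39016:
# pick the narrowest integral datatype that can hold a Python int,
# mirroring PySpark's behavior. Names below are illustrative only.
JVM_INT_MIN, JVM_INT_MAX = -(2**31), 2**31 - 1
JVM_LONG_MIN, JVM_LONG_MAX = -(2**63), 2**63 - 1

def infer_integral_type(value: int) -> str:
    """Return 'integer' or 'long' depending on the literal's bounds."""
    if JVM_INT_MIN <= value <= JVM_INT_MAX:
        return "integer"
    if JVM_LONG_MIN <= value <= JVM_LONG_MAX:
        return "long"
    raise ValueError(f"integer literal {value} is out of range for a long")
```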
dongjoon-hyun commented on PR #39015:
URL: https://github.com/apache/spark/pull/39015#issuecomment-1345146063
BTW, if you don't mind, I'd like to land this to all live branches to reduce
the community resources. WDYT, @HyukjinKwon and @gengliangwang ?
dongjoon-hyun closed pull request #38991: [SPARK-41457][PYTHON][TESTS] Refactor
type annotations and dependency checks in tests
URL: https://github.com/apache/spark/pull/38991
dongjoon-hyun commented on PR #38991:
URL: https://github.com/apache/spark/pull/38991#issuecomment-1345145538
All python and linter tests passed. Merged to master.
dongjoon-hyun closed pull request #39012: [SPARK-41475][CONNECT] Fix lint-scala
command error and typo
URL: https://github.com/apache/spark/pull/39012
dongjoon-hyun commented on code in PR #39015:
URL: https://github.com/apache/spark/pull/39015#discussion_r1044989412
##
dev/sparktestsupport/utils.py:
##
@@ -34,19 +34,22 @@ def determine_modules_for_files(filenames):
Given a list of filenames, return the set of modules
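The quoted docstring is truncated; as a rough sketch of what such a filename-to-module mapping could look like (the prefix table and the root-module fallback are assumptions, not the real dev/sparktestsupport structure):

```python
def determine_modules_for_files_sketch(filenames, module_prefixes):
    """Map changed file paths to the set of test modules they touch.

    module_prefixes: dict of module name -> source path prefix (an assumed
    shape; the real utility matches against richer Module objects).
    """
    changed = set()
    for filename in filenames:
        matched = {name for name, prefix in module_prefixes.items()
                   if filename.startswith(prefix)}
        # a file owned by no module falls back to the root module,
        # which forces a full test run
        changed |= matched or {"root"}
    return changed
```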
dongjoon-hyun commented on code in PR #39015:
URL: https://github.com/apache/spark/pull/39015#discussion_r1044988801
##
dev/sparktestsupport/utils.py:
##
@@ -84,7 +84,7 @@ def identify_changed_files_from_git_commits(patch_sha, target_branch=None, targe
["git", "diff",
shuyouZZ commented on code in PR #38983:
URL: https://github.com/apache/spark/pull/38983#discussion_r1044983924
##
core/src/test/scala/org/apache/spark/deploy/history/FsHistoryProviderSuite.scala:
##
@@ -1705,6 +1705,47 @@ abstract class FsHistoryProviderSuite extends
viirya commented on PR #38993:
URL: https://github.com/apache/spark/pull/38993#issuecomment-1345117121
> BTW, seems you change the PR description template by mistake. Can you
restore the template?
Can you restore it to the standard template?
viirya commented on code in PR #38993:
URL: https://github.com/apache/spark/pull/38993#discussion_r1044982387
##
sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/LogDivertAppender.java:
##
@@ -276,12 +276,19 @@ private static StringLayout
idealspark commented on code in PR #38993:
URL: https://github.com/apache/spark/pull/38993#discussion_r1044970950
##
sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/LogDivertAppender.java:
##
@@ -276,12 +276,19 @@ private static StringLayout
idealspark commented on code in PR #38993:
URL: https://github.com/apache/spark/pull/38993#discussion_r1044969362
##
sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/LogDivertAppender.java:
##
@@ -276,12 +276,19 @@ private static StringLayout
dengziming commented on code in PR #38984:
URL: https://github.com/apache/spark/pull/38984#discussion_r1044965012
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -305,7 +305,11 @@ class
idealspark commented on code in PR #38993:
URL: https://github.com/apache/spark/pull/38993#discussion_r1044964246
##
sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/LogDivertAppender.java:
##
@@ -276,12 +276,19 @@ private static StringLayout
panbingkun commented on code in PR #38972:
URL: https://github.com/apache/spark/pull/38972#discussion_r1044960581
##
core/src/main/resources/error/error-classes.json:
##
@@ -109,6 +109,11 @@
"The column already exists. Consider to choose another name
or rename the
HyukjinKwon commented on code in PR #38984:
URL: https://github.com/apache/spark/pull/38984#discussion_r1044958569
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -305,7 +305,11 @@ class
HyukjinKwon commented on code in PR #38984:
URL: https://github.com/apache/spark/pull/38984#discussion_r1044958569
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -305,7 +305,11 @@ class
Ngone51 commented on code in PR #38876:
URL: https://github.com/apache/spark/pull/38876#discussion_r1044955782
##
core/src/main/scala/org/apache/spark/storage/BlockManager.scala:
##
@@ -637,9 +637,11 @@ private[spark] class BlockManager(
def reregister(): Unit = {
//
HyukjinKwon commented on code in PR #38991:
URL: https://github.com/apache/spark/pull/38991#discussion_r1044955147
##
dev/lint-python:
##
@@ -104,7 +104,7 @@ function mypy_data_test {
-c dev/pyproject.toml \
--rootdir python \
--mypy-only-local-stub \
-
HyukjinKwon commented on code in PR #39015:
URL: https://github.com/apache/spark/pull/39015#discussion_r1044952894
##
dev/sparktestsupport/utils.py:
##
@@ -84,7 +84,7 @@ def identify_changed_files_from_git_commits(patch_sha, target_branch=None, targe
["git", "diff",
dongjoon-hyun commented on PR #39014:
URL: https://github.com/apache/spark/pull/39014#issuecomment-1344990963
All tests passed. Merged to master for Apache Spark 3.4.0.
dongjoon-hyun closed pull request #39014: [SPARK-41474][PROTOBUF][BUILD]
Exclude `proto` files from `spark-protobuf` jar
URL: https://github.com/apache/spark/pull/39014
dongjoon-hyun commented on PR #39015:
URL: https://github.com/apache/spark/pull/39015#issuecomment-1344988102
Thank you so much, @gengliangwang !
dongjoon-hyun commented on PR #39015:
URL: https://github.com/apache/spark/pull/39015#issuecomment-1344987151
Could you review this too please, @gengliangwang ?
SandishKumarHN commented on code in PR #38922:
URL: https://github.com/apache/spark/pull/38922#discussion_r1044857157
##
connector/protobuf/src/main/scala/org/apache/spark/sql/protobuf/utils/ProtobufOptions.scala:
##
@@ -38,6 +38,12 @@ private[sql] class ProtobufOptions(
dongjoon-hyun opened a new pull request, #39015:
URL: https://github.com/apache/spark/pull/39015
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
###
dongjoon-hyun commented on PR #39014:
URL: https://github.com/apache/spark/pull/39014#issuecomment-1344983516
Thank you, @gengliangwang !
dongjoon-hyun commented on PR #39014:
URL: https://github.com/apache/spark/pull/39014#issuecomment-1344977841
Thank you, @SandishKumarHN .
SandishKumarHN commented on PR #39014:
URL: https://github.com/apache/spark/pull/39014#issuecomment-1344977544
@dongjoon-hyun LGTM, thanks for the PR.
dongjoon-hyun commented on PR #39014:
URL: https://github.com/apache/spark/pull/39014#issuecomment-1344964855
Could you review this, @SandishKumarHN and @gengliangwang ?
zhengruifeng commented on PR #38946:
URL: https://github.com/apache/spark/pull/38946#issuecomment-1344961425
merged into master
zhengruifeng closed pull request #38946: [SPARK-41414][CONNECT][PYTHON]
Implement date/timestamp functions
URL: https://github.com/apache/spark/pull/38946
srowen commented on code in PR #38996:
URL: https://github.com/apache/spark/pull/38996#discussion_r1044938891
##
docs/mllib-isotonic-regression.md:
##
@@ -43,7 +43,17 @@ best fitting the original data points.
which uses an approach to
[parallelizing isotonic
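For background on the algorithm the isotonic-regression doc page describes, here is a minimal sequential pool-adjacent-violators (PAVA) sketch for unweighted points; MLlib parallelizes this, and the code below is only an illustration, not Spark's implementation:

```python
def pava(y):
    """Pool Adjacent Violators: non-decreasing fit minimizing squared error."""
    # each block is [mean, weight]; merge blocks while monotonicity is violated
    blocks = []
    for v in y:
        blocks.append([float(v), 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    # expand blocks back to one fitted value per input point
    out = []
    for m, w in blocks:
        out.extend([m] * w)
    return out
```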
dengziming commented on code in PR #39012:
URL: https://github.com/apache/spark/pull/39012#discussion_r1044935825
##
dev/lint-scala:
##
@@ -29,14 +29,14 @@ ERRORS=$(./build/mvn \
-Dscalafmt.skip=false \
-Dscalafmt.validateOnly=true \
-Dscalafmt.changedOnly=false
dengziming commented on code in PR #39012:
URL: https://github.com/apache/spark/pull/39012#discussion_r1044932932
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -322,8 +322,7 @@ class SparkConnectPlanner(session:
dongjoon-hyun commented on PR #38994:
URL: https://github.com/apache/spark/pull/38994#issuecomment-1344949254
No problem at all. If you checked locally, that's more than enough. :)
dongjoon-hyun opened a new pull request, #39014:
URL: https://github.com/apache/spark/pull/39014
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
###
HyukjinKwon commented on PR #38994:
URL: https://github.com/apache/spark/pull/38994#issuecomment-1344946168
oh yeah, the tests passed but the linter failed. I just removed one unused
ignore in the linter, and I checked it locally.
But sorry, I should have waited for the test
dongjoon-hyun commented on PR #38994:
URL: https://github.com/apache/spark/pull/38994#issuecomment-1344945736
Ur, does this pass CI?
HyukjinKwon closed pull request #38994: [SPARK-41329][CONNECT] Resolve circular
imports in Spark Connect
URL: https://github.com/apache/spark/pull/38994
HyukjinKwon commented on PR #38994:
URL: https://github.com/apache/spark/pull/38994#issuecomment-1344945421
Merged to master.
HyukjinKwon commented on code in PR #38994:
URL: https://github.com/apache/spark/pull/38994#discussion_r1044928308
##
python/pyspark/sql/connect/column.py:
##
@@ -706,28 +705,30 @@ def substr(self, startPos: Union[int, "Column"], length: Union[int, "Column"]) -
gengliangwang commented on code in PR #39000:
URL: https://github.com/apache/spark/pull/39000#discussion_r1044927630
##
core/pom.xml:
##
@@ -713,6 +693,71 @@
+    <profile>
+      <id>default-protoc</id>
+      <activation>
+        <property>
+          <name>!skipDefaultProtoc</name>
+        </property>
+      </activation>
github-actions[bot] closed pull request #36921: [SPARK-39481][SQL] Do not push
down complex filter condition
URL: https://github.com/apache/spark/pull/36921
dongjoon-hyun commented on PR #38985:
URL: https://github.com/apache/spark/pull/38985#issuecomment-1344930220
Thank you for closing, @pan3793 .
gengliangwang closed pull request #38988: [SPARK-41456][SQL] Improve the
performance of try_cast
URL: https://github.com/apache/spark/pull/38988
gengliangwang commented on PR #38988:
URL: https://github.com/apache/spark/pull/38988#issuecomment-1344894973
@cloud-fan @HyukjinKwon @LuciferYang Thanks for the review.
Merging to master
xinrong-meng commented on PR #39009:
URL: https://github.com/apache/spark/pull/39009#issuecomment-1344864352
Merged to master, thanks!
xinrong-meng closed pull request #39009:
[SPARK-41225][CONNECT][PYTHON][FOLLOW-UP] Disable unsupported functions.
URL: https://github.com/apache/spark/pull/39009
xinrong-meng opened a new pull request, #39013:
URL: https://github.com/apache/spark/pull/39013
### What changes were proposed in this pull request?
Implement the rest of string/binary functions. The first commit is
https://github.com/apache/spark/pull/38921.
### Why are the
dongjoon-hyun closed pull request #38982: [SPARK-41376][CORE][3.2] Correct the
Netty preferDirectBufs check logic on executor start
URL: https://github.com/apache/spark/pull/38982
otterc commented on code in PR #36165:
URL: https://github.com/apache/spark/pull/36165#discussion_r1044862699
##
core/src/test/resources/HistoryServerExpectations/one_stage_json_with_partitionId_expectation.json:
##
@@ -26,13 +26,23 @@
"outputBytes" : 0,
"outputRecords" :
SandishKumarHN commented on code in PR #38922:
URL: https://github.com/apache/spark/pull/38922#discussion_r1044857157
##
connector/protobuf/src/main/scala/org/apache/spark/sql/protobuf/utils/ProtobufOptions.scala:
##
@@ -38,6 +38,12 @@ private[sql] class ProtobufOptions(
mridulm commented on code in PR #36165:
URL: https://github.com/apache/spark/pull/36165#discussion_r1044823829
##
core/src/main/scala/org/apache/spark/status/api/v1/api.scala:
##
@@ -302,7 +312,9 @@ class StageData private[spark](
@JsonDeserialize(using =
anchovYu commented on code in PR #38776:
URL: https://github.com/apache/spark/pull/38776#discussion_r1044807443
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/namedExpressions.scala:
##
@@ -424,8 +424,51 @@ case class OuterReference(e: NamedExpression)
WolverineJiang commented on PR #39000:
URL: https://github.com/apache/spark/pull/39000#issuecomment-1344724455
The pom of the core module has active profiles, so activeByDefault does not
take effect; a property is used instead.
amaliujia commented on PR #39004:
URL: https://github.com/apache/spark/pull/39004#issuecomment-1344680268
late LGTM thanks!
anchovYu commented on code in PR #38776:
URL: https://github.com/apache/spark/pull/38776#discussion_r1044744657
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala:
##
@@ -638,6 +638,14 @@ trait CheckAnalysis extends PredicateHelper with
viirya commented on PR #38993:
URL: https://github.com/apache/spark/pull/38993#issuecomment-1344649595
BTW, seems you change the PR description template by mistake. Can you
restore the template?
viirya commented on code in PR #38993:
URL: https://github.com/apache/spark/pull/38993#discussion_r1044740898
##
sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/LogDivertAppender.java:
##
@@ -276,12 +276,19 @@ private static StringLayout
viirya commented on code in PR #38993:
URL: https://github.com/apache/spark/pull/38993#discussion_r1044738513
##
sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/LogDivertAppender.java:
##
@@ -276,12 +276,19 @@ private static StringLayout
viirya commented on code in PR #38993:
URL: https://github.com/apache/spark/pull/38993#discussion_r1044737028
##
sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/LogDivertAppender.java:
##
@@ -276,12 +276,19 @@ private static StringLayout
dongjoon-hyun commented on PR #38982:
URL: https://github.com/apache/spark/pull/38982#issuecomment-1344618354
Merged to branch-3.2 too. Thank you, @pan3793 and @Yikun .
vinodkc commented on PR #38608:
URL: https://github.com/apache/spark/pull/38608#issuecomment-1344618388
@gengliangwang, can you please review it?
amaliujia commented on code in PR #38984:
URL: https://github.com/apache/spark/pull/38984#discussion_r1044705096
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -305,7 +305,11 @@ class SparkConnectPlanner(session:
dongjoon-hyun commented on PR #38991:
URL: https://github.com/apache/spark/pull/38991#issuecomment-1344612390
Could you check the linter failure?
```
starting mypy data test...
annotations failed data checks:
= test session starts
xinrong-meng commented on code in PR #38946:
URL: https://github.com/apache/spark/pull/38946#discussion_r1044694589
##
python/pyspark/sql/tests/connect/test_connect_function.py:
##
@@ -645,6 +645,153 @@ def test_string_functions(self):
sdf.select(SF.encode("c",
gengliangwang commented on PR #38999:
URL: https://github.com/apache/spark/pull/38999#issuecomment-1344591174
@pan3793 Thanks for fixing it!
otterc commented on code in PR #36165:
URL: https://github.com/apache/spark/pull/36165#discussion_r1044679570
##
core/src/main/scala/org/apache/spark/storage/ShuffleBlockFetcherIterator.scala:
##
@@ -726,6 +736,61 @@ final class ShuffleBlockFetcherIterator(
}
}
+ //
steveloughran commented on PR #38974:
URL: https://github.com/apache/spark/pull/38974#issuecomment-1344581573
yeah, not going to happen for a while; 3.3.5 RC0 coming soon though; just
trying to wrap up an abfs prefetch bug
LuciferYang commented on PR #38974:
URL: https://github.com/apache/spark/pull/38974#issuecomment-1344577731
fine to me, close first ~
LuciferYang closed pull request #38974: [SPARK-41392][BUILD] Make maven build
Spark master with Hadoop 3.4.0-SNAPSHOT successful
URL: https://github.com/apache/spark/pull/38974
MaxGekk commented on PR #38997:
URL: https://github.com/apache/spark/pull/38997#issuecomment-1344574665
+1, LGTM. Merging to master.
Thank you, @gengliangwang and @cloud-fan for review.
dongjoon-hyun commented on code in PR #39012:
URL: https://github.com/apache/spark/pull/39012#discussion_r1044671065
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -322,8 +322,7 @@ class
dongjoon-hyun commented on code in PR #39012:
URL: https://github.com/apache/spark/pull/39012#discussion_r1044670574
##
dev/lint-scala:
##
@@ -29,14 +29,14 @@ ERRORS=$(./build/mvn \
-Dscalafmt.skip=false \
-Dscalafmt.validateOnly=true \
dongjoon-hyun commented on PR #39004:
URL: https://github.com/apache/spark/pull/39004#issuecomment-1344576249
Merged to master.
dongjoon-hyun closed pull request #39004: [SPARK-41466][BUILD] Change Scala
Style configuration to catch AnyFunSuite instead of FunSuite
URL: https://github.com/apache/spark/pull/39004
MaxGekk closed pull request #38997: [SPARK-41462][SQL] Date and timestamp type
can up cast to TimestampNTZ
URL: https://github.com/apache/spark/pull/38997
rednaxelafx commented on PR #38923:
URL: https://github.com/apache/spark/pull/38923#issuecomment-1344558867
Post-hoc review: LGTM, this is a good catch.
warrenzhu25 commented on PR #39011:
URL: https://github.com/apache/spark/pull/39011#issuecomment-1344553081
> cc @warrenzhu25 too
It's really the change I want. Great work.
wineternity commented on code in PR #38702:
URL: https://github.com/apache/spark/pull/38702#discussion_r1044596644
##
core/src/main/scala/org/apache/spark/status/AppStatusListener.scala:
##
@@ -674,22 +674,30 @@ private[spark] class AppStatusListener(
delta
}.orNull
Ngone51 commented on code in PR #38702:
URL: https://github.com/apache/spark/pull/38702#discussion_r1044581470
##
core/src/main/scala/org/apache/spark/status/AppStatusListener.scala:
##
@@ -674,22 +674,30 @@ private[spark] class AppStatusListener(
delta
}.orNull
-
dengziming opened a new pull request, #39012:
URL: https://github.com/apache/spark/pull/39012
### What changes were proposed in this pull request?
We separated connect into server and common, but failed to update the
`lint-scala` tools.
fix a typo: fase -> false
format the code.
dongjoon-hyun commented on PR #38999:
URL: https://github.com/apache/spark/pull/38999#issuecomment-1344459575
Merged to master. Thank you, @pan3793 and all!
dongjoon-hyun closed pull request #38999: [SPARK-41450][BUILD] Fix shading in
`core` module
URL: https://github.com/apache/spark/pull/38999
Ngone51 commented on PR #39011:
URL: https://github.com/apache/spark/pull/39011#issuecomment-1344456134
Marking as WIP first due to the compilation error and missing UT. Any
feedback is still welcome.
vinodkc commented on PR #38146:
URL: https://github.com/apache/spark/pull/38146#issuecomment-1344455185
@gengliangwang, Review comments are addressed.
Ngone51 commented on code in PR #39011:
URL: https://github.com/apache/spark/pull/39011#discussion_r1044572392
##
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala:
##
@@ -1046,17 +1046,46 @@ private[spark] class TaskSetManager(
/** Called by
Ngone51 commented on PR #39011:
URL: https://github.com/apache/spark/pull/39011#issuecomment-137148
cc @warrenzhu25 too
Ngone51 opened a new pull request, #39011:
URL: https://github.com/apache/spark/pull/39011
### What changes were proposed in this pull request?
This PR proposes to avoid rerunning the finished shuffle map task in
`TaskSetManager.executorLost()` if the executor lost is
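The idea this PR description sketches — skip re-running already-finished shuffle map tasks when the lost executor's output is still available — can be illustrated language-agnostically. This Python sketch is mine, not the actual Scala change in `TaskSetManager`:

```python
def tasks_to_resubmit(finished, lost_executor, output_preserved):
    """Decide which finished shuffle map tasks to rerun after an executor loss.

    finished: dict of task id -> executor id that ran it (an assumed shape).
    output_preserved: True when the map output outlives the executor,
    e.g. because it was written to an external shuffle service.
    """
    if output_preserved:
        return []  # the map output is still reachable; nothing to redo
    return sorted(tid for tid, ex in finished.items() if ex == lost_executor)
```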