yaooqinn commented on PR #42674:
URL: https://github.com/apache/spark/pull/42674#issuecomment-1695053749
thanks @srowen @gengliangwang @sarutak @HyukjinKwon @dongjoon-hyun, merged
to master
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
yaooqinn closed pull request #42674: [SPARK-44960][UI] Unescape and consist
error summary across UI pages
URL: https://github.com/apache/spark/pull/42674
maheshk114 commented on code in PR #41860:
URL: https://github.com/apache/spark/pull/41860#discussion_r1306937895
##
sql/core/src/test/scala/org/apache/spark/sql/InjectRuntimeFilterSuite.scala:
##
@@ -644,4 +644,76 @@ class InjectRuntimeFilterSuite extends QueryTest with
maheshk114 commented on code in PR #41860:
URL: https://github.com/apache/spark/pull/41860#discussion_r1306937806
##
sql/core/src/test/scala/org/apache/spark/sql/InjectRuntimeFilterSuite.scala:
##
@@ -644,4 +644,76 @@ class InjectRuntimeFilterSuite extends QueryTest with
Hisoka-X commented on code in PR #42661:
URL: https://github.com/apache/spark/pull/42661#discussion_r1306927209
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala:
##
@@ -133,8 +138,13 @@ case class
Hisoka-X commented on code in PR #42661:
URL: https://github.com/apache/spark/pull/42661#discussion_r1306925328
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/TryEval.scala:
##
@@ -236,3 +236,35 @@ case class TryToBinary(
override protected def
cloud-fan commented on code in PR #42661:
URL: https://github.com/apache/spark/pull/42661#discussion_r1306915491
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/TryEval.scala:
##
@@ -236,3 +236,35 @@ case class TryToBinary(
override protected def
cloud-fan commented on code in PR #42661:
URL: https://github.com/apache/spark/pull/42661#discussion_r1306914923
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala:
##
@@ -133,8 +138,13 @@ case class
LuciferYang commented on PR #42518:
URL: https://github.com/apache/spark/pull/42518#issuecomment-1695020180
The test failure is not related to the current PR, but the Scala formatting
needs to be fixed.
```
Run ./dev/lint-scala
[info] [launcher] getting org.scala-sbt sbt 1.9.3 (this
LuciferYang closed pull request #42696: Test Yarn module with
`-Dtest.exclude.tags=org.apache.spark.tags.ExtendedLevelDBTest`
URL: https://github.com/apache/spark/pull/42696
dongjoon-hyun commented on PR #41673:
URL: https://github.com/apache/spark/pull/41673#issuecomment-1695005013
Thank you, @LuciferYang !
LuciferYang commented on PR #41673:
URL: https://github.com/apache/spark/pull/41673#issuecomment-1695004548
Thanks @dongjoon-hyun ~
dongjoon-hyun commented on PR #41673:
URL: https://github.com/apache/spark/pull/41673#issuecomment-1694995407
BTW, I updated the JIRA title because
https://github.com/apache/spark/pull/42696 proves that this is not a macOS-
or Apple-Silicon-related issue, @LuciferYang.
dongjoon-hyun closed pull request #41673: [SPARK-44091][YARN][TESTS] Introduce
`withResourceTypes` to `ResourceRequestTestHelper` to restore `resourceTypes`
as default value after testing
URL: https://github.com/apache/spark/pull/41673
HyukjinKwon closed pull request #42692: [SPARK-42944][PYTHON][FOLLOW-UP][3.5]
Rename tests from foreachBatch to foreach_batch
URL: https://github.com/apache/spark/pull/42692
HyukjinKwon commented on PR #42692:
URL: https://github.com/apache/spark/pull/42692#issuecomment-1694990647
Merged to branch-3.5.
Hisoka-X commented on code in PR #42661:
URL: https://github.com/apache/spark/pull/42661#discussion_r1306888014
##
sql/core/src/test/resources/sql-tests/inputs/try_reflect.sql:
##
@@ -0,0 +1,19 @@
+-- positive
+SELECT try_reflect("java.util.UUID", "fromString",
HyukjinKwon closed pull request #42658: [SPARK-44945][DOCS][PYTHON] Automate
PySpark error class documentation
URL: https://github.com/apache/spark/pull/42658
HyukjinKwon commented on PR #42658:
URL: https://github.com/apache/spark/pull/42658#issuecomment-1694976628
Merged to master.
HyukjinKwon commented on PR #42694:
URL: https://github.com/apache/spark/pull/42694#issuecomment-1694971316
But they weren't passed to the remote session before.
HyukjinKwon commented on PR #42694:
URL: https://github.com/apache/spark/pull/42694#issuecomment-1694971159
Yeah, because the configurations are actually being passed above
zhengruifeng commented on PR #42694:
URL: https://github.com/apache/spark/pull/42694#issuecomment-1694957561
In local mode, did setting all configurations work before
https://github.com/apache/spark/pull/42548?
LuciferYang commented on PR #41673:
URL: https://github.com/apache/spark/pull/41673#issuecomment-1694951106
@dongjoon-hyun Do you have time to look at this PR? I found that if we test
the Yarn module with
`-Dtest.exclude.tags=org.apache.spark.tags.ExtendedLevelDBTest` (like adding
some
cloud-fan commented on code in PR #42661:
URL: https://github.com/apache/spark/pull/42661#discussion_r1306869761
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala:
##
@@ -55,64 +55,73 @@ import org.apache.spark.util.Utils
LuciferYang opened a new pull request, #42696:
URL: https://github.com/apache/spark/pull/42696
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
###
LuciferYang commented on code in PR #42518:
URL: https://github.com/apache/spark/pull/42518#discussion_r1306868324
##
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/CheckConnectJvmClientCompatibility.scala:
##
@@ -46,18 +47,30 @@ object
yaooqinn commented on PR #42674:
URL: https://github.com/apache/spark/pull/42674#issuecomment-1694942009
> The second example just shows `<>` which can't be the 'right' version of
whatever this is
It's the source code version, not the rendered HTML.
> What happened to the garbled text
Hisoka-X commented on code in PR #42661:
URL: https://github.com/apache/spark/pull/42661#discussion_r1306867550
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala:
##
@@ -55,64 +55,73 @@ import org.apache.spark.util.Utils
cloud-fan commented on code in PR #42667:
URL: https://github.com/apache/spark/pull/42667#discussion_r1306867310
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JacksonParser.scala:
##
@@ -420,17 +420,17 @@ class JacksonParser(
case VALUE_STRING if
HyukjinKwon commented on PR #42694:
URL: https://github.com/apache/spark/pull/42694#issuecomment-1694940489
cc @zhengruifeng @michaelzhan-db
HyukjinKwon commented on PR #42693:
URL: https://github.com/apache/spark/pull/42693#issuecomment-1694940370
cc @zhengruifeng @ueshin @grundprinzip
LuciferYang commented on PR #42236:
URL: https://github.com/apache/spark/pull/42236#issuecomment-1694939498
I will merge this PR today if there are no more comments @rangadi
@HyukjinKwon @hvanhovell @zhengruifeng @pan3793 @grundprinzip
@zhenlineo
HyukjinKwon opened a new pull request, #42695:
URL: https://github.com/apache/spark/pull/42695
### What changes were proposed in this pull request?
This PR proposes to mark all Spark Connect server configurations as static
configurations.
### Why are the changes needed?
cloud-fan commented on code in PR #42661:
URL: https://github.com/apache/spark/pull/42661#discussion_r1306864355
##
sql/core/src/test/resources/sql-tests/inputs/try_reflect.sql:
##
@@ -0,0 +1,19 @@
+-- positive
+SELECT try_reflect("java.util.UUID", "fromString",
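The truncated snippet above exercises the proposed `try_reflect` expression. As a rough Python analogy (not Spark code, and assuming `try_reflect` mirrors `reflect` except that failures become NULL, as other `try_` functions do), the semantics under review look like this:

```python
# Rough Python analogy (NOT Spark code) for try_reflect semantics:
# the plain call raises on bad input, while the try_ variant is assumed
# to return NULL (None here) instead of failing the query.
import uuid

def reflect_from_string(s: str) -> str:
    # stands in for reflect("java.util.UUID", "fromString", s)
    return str(uuid.UUID(s))

def try_reflect_from_string(s: str):
    # stands in for try_reflect(...): same call, but errors become NULL
    try:
        return reflect_from_string(s)
    except ValueError:
        return None
```

Here `reflect_from_string` and `try_reflect_from_string` are hypothetical names used only for this sketch.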
cloud-fan commented on code in PR #42661:
URL: https://github.com/apache/spark/pull/42661#discussion_r1306863750
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala:
##
@@ -55,64 +55,73 @@ import org.apache.spark.util.Utils
cloud-fan closed pull request #42587: [SPARK-44897][SQL] Propagating local
properties to subquery broadcast exec
URL: https://github.com/apache/spark/pull/42587
cloud-fan commented on PR #42587:
URL: https://github.com/apache/spark/pull/42587#issuecomment-1694929266
thanks, merging to master/3.5!
HyukjinKwon closed pull request #42678: [SPARK-44963][PYTHON][ML][TESTS] Make
PySpark (pyspark-ml module) tests passing without any optional dependency
URL: https://github.com/apache/spark/pull/42678
HyukjinKwon commented on PR #42678:
URL: https://github.com/apache/spark/pull/42678#issuecomment-1694928402
Merged to master.
HyukjinKwon opened a new pull request, #42694:
URL: https://github.com/apache/spark/pull/42694
### What changes were proposed in this pull request?
This PR is a kind of a followup of
https://github.com/apache/spark/pull/42548. This PR proposes to filter static
configurations out in
srowen commented on PR #42674:
URL: https://github.com/apache/spark/pull/42674#issuecomment-1694922638
I don't quite follow - the error message actually looks as if it's messed up
to begin with and is some bytes being interpreted as Latin-1. The second
example just shows `<>` which can't
HyukjinKwon opened a new pull request, #42693:
URL: https://github.com/apache/spark/pull/42693
### What changes were proposed in this pull request?
This PR fixes the bug in createDataFrame with Python Spark Connect client.
Now it respects inherited namedtuples as below:
beliefer commented on PR #42683:
URL: https://github.com/apache/spark/pull/42683#issuecomment-1694900622
ping @cloud-fan
zhengruifeng commented on PR #42680:
URL: https://github.com/apache/spark/pull/42680#issuecomment-1694875018
also cc @ueshin
srowen commented on PR #42674:
URL: https://github.com/apache/spark/pull/42674#issuecomment-1694870735
OK I believe this, but it's strange, the garbled text you show does not seem
to be the result of double-escaping HTML? You'd find output that reads like
`` or something
yaooqinn commented on PR #42674:
URL: https://github.com/apache/spark/pull/42674#issuecomment-1694869333
Hi @srowen, it actually removes double-escaping
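The double-escaping discussed in this thread can be illustrated with a small Python sketch (not the Spark UI code): escaping text that was already HTML-escaped re-escapes the `&` in each entity, which is how `<` ends up rendering as the garbage `&lt;`.

```python
# Minimal illustration (not Spark code) of the double-escaping problem:
# escaping an already-escaped string turns "&lt;" into "&amp;lt;".
import html

raw = "<AnalysisException: error>"
escaped_once = html.escape(raw)            # correct for rendering in HTML
escaped_twice = html.escape(escaped_once)  # garbled: entities re-escaped

print(escaped_once)   # &lt;AnalysisException: error&gt;
print(escaped_twice)  # &amp;lt;AnalysisException: error&amp;gt;
```

A browser renders `escaped_once` as the original text, but renders `escaped_twice` as the literal string `&lt;AnalysisException: error&gt;`.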
Hisoka-X commented on PR #42691:
URL: https://github.com/apache/spark/pull/42691#issuecomment-1694856745
Thanks @HyukjinKwon !
HyukjinKwon commented on PR #42691:
URL: https://github.com/apache/spark/pull/42691#issuecomment-1694855831
Merged to master.
HyukjinKwon closed pull request #42691: [SPARK-44978][SQL][TEST] Fix
SQLQueryTestSuite unable create table normally
URL: https://github.com/apache/spark/pull/42691
Hisoka-X commented on code in PR #42691:
URL: https://github.com/apache/spark/pull/42691#discussion_r1306796272
##
sql/core/src/test/scala/org/apache/spark/sql/SQLQueryTestSuite.scala:
##
@@ -689,12 +689,10 @@ class SQLQueryTestSuite extends QueryTest with
SharedSparkSession
Hisoka-X commented on PR #41808:
URL: https://github.com/apache/spark/pull/41808#issuecomment-1694846857
@mridulm Thanks for the advice! I moved the new metrics to the end.
HyukjinKwon commented on code in PR #42686:
URL: https://github.com/apache/spark/pull/42686#discussion_r1306790110
##
python/pyspark/sql/streaming/listener.py:
##
@@ -477,7 +477,7 @@ def fromJson(cls, j: Dict[str, Any]) ->
"StreamingQueryProgress":
name=j["name"],
HyukjinKwon commented on code in PR #42691:
URL: https://github.com/apache/spark/pull/42691#discussion_r1306781982
##
sql/core/src/test/scala/org/apache/spark/sql/SQLQueryTestSuite.scala:
##
@@ -689,12 +689,10 @@ class SQLQueryTestSuite extends QueryTest with
HyukjinKwon commented on PR #42675:
URL: https://github.com/apache/spark/pull/42675#issuecomment-1694828581
https://github.com/apache/spark/pull/42692
HyukjinKwon opened a new pull request, #42692:
URL: https://github.com/apache/spark/pull/42692
This PR cherry-picks https://github.com/apache/spark/pull/42675 to
branch-3.5.
---
### What changes were proposed in this pull request?
This PR proposes to rename tests from
github-actions[bot] closed pull request #39819: [SPARK-42252][CORE] Add
`spark.shuffle.localDisk.file.output.buffer` and deprecate
`spark.shuffle.unsafe.file.output.buffer`
URL: https://github.com/apache/spark/pull/39819
github-actions[bot] closed pull request #41139: [SPARK-40887][K8S] Make
`SPARK_DRIVER_LOG_URL_` and `SPARK_DRIVER_ATTRIBUTE_` work for Spark on K8S
URL: https://github.com/apache/spark/pull/41139
github-actions[bot] closed pull request #41189: [DO NOT MERGE] [POC] run
foreachBatch() on client
URL: https://github.com/apache/spark/pull/41189
github-actions[bot] closed pull request #41071: [SPARK-43391][CORE] Idle
connection should be kept when closeIdleConnection is disabled
URL: https://github.com/apache/spark/pull/41071
github-actions[bot] closed pull request #41196: [SPARK-43505][K8S] support env
variables substitution and executor library path
URL: https://github.com/apache/spark/pull/41196
sadikovi commented on code in PR #42667:
URL: https://github.com/apache/spark/pull/42667#discussion_r1306742597
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/BadRecordException.scala:
##
@@ -65,3 +93,25 @@ case class StringAsDataTypeException(
srowen commented on PR #42674:
URL: https://github.com/apache/spark/pull/42674#issuecomment-1694762651
Sorry for the perhaps silly question, but does this mean that some error
messages or traces are displayed without any escaping? Or is this only
removing double-escaping?
sarutak commented on code in PR #42674:
URL: https://github.com/apache/spark/pull/42674#discussion_r1306710065
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/ui/SparkConnectServerPage.scala:
##
@@ -330,24 +328,10 @@ private[ui] class SqlStatsPagedTable(
mridulm commented on PR #42093:
URL: https://github.com/apache/spark/pull/42093#issuecomment-1694726763
Thanks for merging it @yaooqinn !
mridulm commented on PR #42529:
URL: https://github.com/apache/spark/pull/42529#issuecomment-1694726401
(Sorry for the delay on responding to this)
Just a minor comment, this does look like the right fix.
+CC @tgravescs as well
mridulm commented on code in PR #42529:
URL: https://github.com/apache/spark/pull/42529#discussion_r1306701075
##
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala:
##
@@ -1618,9 +1618,9 @@ private[spark] object Client extends Logging {
mridulm commented on code in PR #42529:
URL: https://github.com/apache/spark/pull/42529#discussion_r1306701075
##
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala:
##
@@ -1618,9 +1618,9 @@ private[spark] object Client extends Logging {
mridulm commented on PR #42426:
URL: https://github.com/apache/spark/pull/42426#issuecomment-1694723882
Thanks for clarifying @hdaikoku, let me take a look at this PR this week.
ChenMichael commented on code in PR #42587:
URL: https://github.com/apache/spark/pull/42587#discussion_r1306699621
##
sql/core/src/test/scala/org/apache/spark/sql/internal/ExecutorSideSQLConfSuite.scala:
##
@@ -191,6 +191,52 @@ class ExecutorSideSQLConfSuite extends
mridulm commented on PR #41808:
URL: https://github.com/apache/spark/pull/41808#issuecomment-1694723367
Looking at this more, this should not be as bad an issue as I initially
expected - it will lead to older SHS not seeing the newer metric, but not
result in failures.
cloud-fan commented on PR #42660:
URL: https://github.com/apache/spark/pull/42660#issuecomment-1694595440
late LGTM
LuciferYang commented on code in PR #42673:
URL: https://github.com/apache/spark/pull/42673#discussion_r1306613468
##
.github/workflows/build_and_test.yml:
##
@@ -856,7 +864,7 @@ jobs:
- name: Build with SBT
run: |
./dev/change-scala-version.sh 2.13
-
LuciferYang commented on code in PR #42673:
URL: https://github.com/apache/spark/pull/42673#discussion_r1306613214
##
.github/workflows/build_and_test.yml:
##
@@ -856,7 +864,7 @@ jobs:
- name: Build with SBT
run: |
./dev/change-scala-version.sh 2.13
-
LuciferYang commented on PR #42673:
URL: https://github.com/apache/spark/pull/42673#issuecomment-1694590528
Can we try to delete `~/.cache/coursier/v1/https` as well?
LuciferYang commented on PR #38609:
URL: https://github.com/apache/spark/pull/38609#issuecomment-1694589092
What's your compile command? I think it's not related to this PR. I'm using
an M2 Max, and I can build successfully with the default profile. I think M2
doesn't need to use
yaooqinn commented on code in PR #42674:
URL: https://github.com/apache/spark/pull/42674#discussion_r1306604695
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/ui/SparkConnectServerPage.scala:
##
@@ -330,24 +328,10 @@ private[ui] class
Hisoka-X commented on PR #42691:
URL: https://github.com/apache/spark/pull/42691#issuecomment-1694584856
cc @HyukjinKwon Could you take a look? Thanks.
Hisoka-X opened a new pull request, #42691:
URL: https://github.com/apache/spark/pull/42691
### What changes were proposed in this pull request?
When we repeatedly execute `SQLQueryTestSuite` to generate the golden file,
the warehouse file executed last time is not cleaned up