pengzhon-db commented on code in PR #41318:
URL: https://github.com/apache/spark/pull/41318#discussion_r1220439104
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -2576,11 +2578,12 @@ class SparkConnectPlanner(val
xinrong-meng opened a new pull request, #41485:
URL: https://github.com/apache/spark/pull/41485
### What changes were proposed in this pull request?
Use pandas ExtensionDtype for integral Series with Nulls after Arrow to
Pandas conversion.
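For context on the conversion behavior this PR title describes (a sketch of the general pandas behavior, not the actual Spark code path): a default NumPy-backed Series cannot hold nulls in an integer dtype, so integral data with nulls is upcast to float64, whereas the nullable `Int64` ExtensionDtype keeps the values integral.

```python
import pandas as pd

# Default NumPy backing: an integer Series containing a null is upcast
# to float64, so 1 becomes 1.0 and the null becomes NaN.
plain = pd.Series([1, 2, None])
print(plain.dtype)  # float64

# pandas' nullable Int64 ExtensionDtype: values stay integral and the
# null is represented as pd.NA instead of NaN.
nullable = pd.Series([1, 2, None], dtype="Int64")
print(nullable.dtype)  # Int64
print(nullable[2] is pd.NA)  # True
```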
### Why are the changes needed?
dtenedor opened a new pull request, #41486:
URL: https://github.com/apache/spark/pull/41486
### What changes were proposed in this pull request?
This PR creates error classes for HyperLogLog function call failures.
### Why are the changes needed?
These replace previous
github-actions[bot] commented on PR #40040:
URL: https://github.com/apache/spark/pull/40040#issuecomment-1579642044
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
dtenedor commented on PR #41486:
URL: https://github.com/apache/spark/pull/41486#issuecomment-1579649261
Hi @MaxGekk @RyanBerti can you please help review this PR to improve error
messages and add more test coverage?
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
zhengruifeng closed pull request #41462: [SPARK-43970][PYTHON][CONNECT] Hide
unsupported dataframe methods from auto-completion
URL: https://github.com/apache/spark/pull/41462
cloud-fan closed pull request #40908: [SPARK-42750][SQL] Support Insert By Name
statement
URL: https://github.com/apache/spark/pull/40908
Hisoka-X commented on PR #40908:
URL: https://github.com/apache/spark/pull/40908#issuecomment-1579774504
Thanks @cloud-fan
zhengruifeng commented on PR #41469:
URL: https://github.com/apache/spark/pull/41469#issuecomment-1579788969
> Thank you. Yes, I agree with you. Since the feature freeze is July 16th,
maybe after July 10th?
>
> * https://spark.apache.org/versioning-policy.html
July 10th
rangadi commented on PR #41318:
URL: https://github.com/apache/spark/pull/41318#issuecomment-1579527917
> What kind of unit test are you referring to? We have an existing Spark
Connect `awaitTermination` unit test. Right now we don't have a way to simulate
a client disconnect from the Python side. Do
allisonwang-db commented on code in PR #41321:
URL: https://github.com/apache/spark/pull/41321#discussion_r1220538154
##
python/pyspark/sql/udf.py:
##
@@ -129,18 +127,12 @@ def _create_py_udf(
else useArrow
)
regular_udf = _create_udf(f, returnType,
holdenk commented on PR #41067:
URL: https://github.com/apache/spark/pull/41067#issuecomment-1579604504
I think we can already handle non-JVM usage through the memory overhead
parameters. Would that meet your needs, or how is this configuration
different from that?
dongjoon-hyun commented on PR #41136:
URL: https://github.com/apache/spark/pull/41136#issuecomment-1579756180
Merged to master.
Thank you, @pan3793 and @holdenk .
dongjoon-hyun closed pull request #41136: [SPARK-43356][K8S] Migrate deprecated
createOrReplace to serverSideApply
URL: https://github.com/apache/spark/pull/41136
cloud-fan commented on PR #40908:
URL: https://github.com/apache/spark/pull/40908#issuecomment-1579771865
The GitHub Actions build is known to be unstable due to OOMs. I'm merging this
PR, as the changed test passes locally and this new parser feature should not
break any existing tests.
dongjoon-hyun commented on PR #41469:
URL: https://github.com/apache/spark/pull/41469#issuecomment-1579771127
Thank you, @panbingkun . I'm fine with any date after July 1st. Feel free to
proceed after that.
cloud-fan commented on PR #40908:
URL: https://github.com/apache/spark/pull/40908#issuecomment-1579771976
thanks, merging to master!
HyukjinKwon commented on code in PR #41357:
URL: https://github.com/apache/spark/pull/41357#discussion_r1220735232
##
python/pyspark/sql/connect/session.py:
##
@@ -637,6 +638,34 @@ def addArtifacts(self, *path: str, pyfile: bool = False,
archive: bool = False)
dongjoon-hyun closed pull request #41409: [SPARK-43901][SQL] Avro to Support
custom decimal type backed by Long
URL: https://github.com/apache/spark/pull/41409
LuciferYang commented on code in PR #41477:
URL: https://github.com/apache/spark/pull/41477#discussion_r1220755529
##
python/pyspark/sql/connect/functions.py:
##
@@ -2373,6 +2374,109 @@ def hours(col: "ColumnOrName") -> Column:
hours.__doc__ = pysparkfuncs.hours.__doc__
+
panbingkun commented on PR #41477:
URL: https://github.com/apache/spark/pull/41477#issuecomment-1579658085
cc @HyukjinKwon @zhengruifeng
cloud-fan commented on code in PR #41475:
URL: https://github.com/apache/spark/pull/41475#discussion_r1220669363
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala:
##
@@ -1064,6 +1066,31 @@ trait CheckAnalysis extends PredicateHelper with
cloud-fan commented on code in PR #41475:
URL: https://github.com/apache/spark/pull/41475#discussion_r1220669782
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala:
##
@@ -1064,6 +1066,31 @@ trait CheckAnalysis extends PredicateHelper with
HyukjinKwon commented on code in PR #41415:
URL: https://github.com/apache/spark/pull/41415#discussion_r1220691844
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/artifact/SparkConnectArtifactManager.scala:
##
@@ -154,6 +154,8 @@ class
HyukjinKwon closed pull request #41415: [SPARK-43906][PYTHON][CONNECT]
Implement the file support in SparkSession.addArtifacts
URL: https://github.com/apache/spark/pull/41415
HyukjinKwon commented on PR #41415:
URL: https://github.com/apache/spark/pull/41415#issuecomment-1579740202
Merged to master.
HyukjinKwon commented on code in PR #41357:
URL: https://github.com/apache/spark/pull/41357#discussion_r1220721480
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/artifact/SparkConnectArtifactManager.scala:
##
@@ -157,10 +159,46 @@ class
zhengruifeng commented on PR #41469:
URL: https://github.com/apache/spark/pull/41469#issuecomment-1579769614
@dongjoon-hyun I am fine with holding off on it until July 1st; `buf` releases
are a bit frequent.
We may also need to upgrade it to the latest version before the 3.5 RC.
HyukjinKwon commented on code in PR #41357:
URL: https://github.com/apache/spark/pull/41357#discussion_r1220726346
##
python/pyspark/sql/connect/session.py:
##
@@ -637,6 +638,34 @@ def addArtifacts(self, *path: str, pyfile: bool = False,
archive: bool = False)
HyukjinKwon commented on PR #41481:
URL: https://github.com/apache/spark/pull/41481#issuecomment-1579801790
Merged to master.
HyukjinKwon closed pull request #41481: [SPARK-43985][PROTOBUF] spark protobuf:
fix enums as ints bug
URL: https://github.com/apache/spark/pull/41481
amaliujia commented on code in PR #41461:
URL: https://github.com/apache/spark/pull/41461#discussion_r1220758402
##
sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala:
##
@@ -403,6 +397,19 @@ class CatalogImpl(sparkSession: SparkSession) extends
Catalog {
dongjoon-hyun commented on PR #41472:
URL: https://github.com/apache/spark/pull/41472#issuecomment-1579355009
Thank you, @viirya .
amaliujia commented on PR #41475:
URL: https://github.com/apache/spark/pull/41475#issuecomment-1579470009
@cloud-fan tests have passed.
xinrong-meng commented on PR #41321:
URL: https://github.com/apache/spark/pull/41321#issuecomment-1579560755
Merged to master, thanks all!
holdenk commented on code in PR #41203:
URL: https://github.com/apache/spark/pull/41203#discussion_r1220539520
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ExpectsInputTypes.scala:
##
@@ -74,3 +74,41 @@ object ExpectsInputTypes extends
holdenk commented on PR #41461:
URL: https://github.com/apache/spark/pull/41461#issuecomment-1579609130
I think it makes sense to add this functionality to the Scala/Python APIs
since it's already there in the SQL API. I also appreciate the cleanup of the
duplicated code in this
HyukjinKwon commented on code in PR #41415:
URL: https://github.com/apache/spark/pull/41415#discussion_r1220646408
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/artifact/SparkConnectArtifactManager.scala:
##
@@ -154,6 +154,8 @@ class
dongjoon-hyun commented on PR #41469:
URL: https://github.com/apache/spark/pull/41469#issuecomment-1579770666
Thank you. Yes, I agree with you. Since the feature freeze is July 16th,
maybe after July 10th?
- https://spark.apache.org/versioning-policy.html
panbingkun commented on PR #41469:
URL: https://github.com/apache/spark/pull/41469#issuecomment-1579770758
@dongjoon-hyun Ok, let's hold off on it until July 1st.
amaliujia commented on PR #41474:
URL: https://github.com/apache/spark/pull/41474#issuecomment-1579779057
LGTM
HyukjinKwon commented on code in PR #41357:
URL: https://github.com/apache/spark/pull/41357#discussion_r1220733748
##
python/pyspark/sql/tests/connect/client/test_artifact.py:
##
@@ -271,6 +277,21 @@ def func(x):
amaliujia commented on code in PR #41461:
URL: https://github.com/apache/spark/pull/41461#discussion_r1220757472
##
sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala:
##
@@ -122,16 +122,26 @@ class CatalogImpl(sparkSession: SparkSession) extends
Catalog {
pengzhon-db commented on code in PR #41318:
URL: https://github.com/apache/spark/pull/41318#discussion_r1220432414
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -2597,6 +2600,50 @@ class SparkConnectPlanner(val
pengzhon-db commented on code in PR #41318:
URL: https://github.com/apache/spark/pull/41318#discussion_r1220432707
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -2597,6 +2600,50 @@ class SparkConnectPlanner(val
rangadi commented on PR #41481:
URL: https://github.com/apache/spark/pull/41481#issuecomment-1579555258
@gengliangwang, @LuciferYang could one of you merge this?
dongjoon-hyun commented on PR #40128:
URL: https://github.com/apache/spark/pull/40128#issuecomment-1579579865
Sorry but I'm still in the same position, @shrprasa .
panbingkun commented on code in PR #41458:
URL: https://github.com/apache/spark/pull/41458#discussion_r1220627181
##
core/src/main/resources/error/error-classes.json:
##
@@ -877,10 +877,24 @@
},
"INSERT_COLUMN_ARITY_MISMATCH" : {
"message" : [
- "Cannot write to
panbingkun commented on code in PR #41458:
URL: https://github.com/apache/spark/pull/41458#discussion_r1220627021
##
core/src/main/resources/error/error-classes.json:
##
@@ -877,10 +877,24 @@
},
"INSERT_COLUMN_ARITY_MISMATCH" : {
"message" : [
- "Cannot write to
HyukjinKwon commented on PR #41435:
URL: https://github.com/apache/spark/pull/41435#issuecomment-1579761629
oh also might need to put them in Python reference doc `.rst` file
itholic commented on PR #41456:
URL: https://github.com/apache/spark/pull/41456#issuecomment-1579762660
Yeah, it's only for internal usage, so it should be fine.
amaliujia commented on code in PR #41461:
URL: https://github.com/apache/spark/pull/41461#discussion_r1220752992
##
connector/connect/common/src/main/protobuf/spark/connect/catalog.proto:
##
@@ -77,6 +77,8 @@ message ListDatabases {
message ListTables {
// (Optional)
pengzhon-db commented on PR #41318:
URL: https://github.com/apache/spark/pull/41318#issuecomment-1579523627
> Left a few comments. Can we add unit test for this?
What kind of unit test are you referring to? We have an existing Spark Connect
`awaitTermination` unit test. Right now we don't
amaliujia commented on PR #41477:
URL: https://github.com/apache/spark/pull/41477#issuecomment-1579778345
LGTM
justaparth commented on PR #41481:
URL: https://github.com/apache/spark/pull/41481#issuecomment-1579554130
> LGTM. @justaparth do you need to update the descriptor file? Btw, with
#41377, we don't need to.
Ah, good catch; I updated and added it to the PR.
#41377 looks awesome,
xinrong-meng closed pull request #41321: [SPARK-43893][PYTHON][CONNECT]
Non-atomic data type support in Arrow-optimized Python UDF
URL: https://github.com/apache/spark/pull/41321
hiboyang commented on PR #10:
URL: https://github.com/apache/spark-connect-go/pull/10#issuecomment-1579578957
@HyukjinKwon @grundprinzip do you have time to take a look at this PR?
holdenk commented on PR #41136:
URL: https://github.com/apache/spark/pull/41136#issuecomment-1579603278
Looks reasonable to me pending @dongjoon-hyun's review
HyukjinKwon commented on PR #10:
URL: https://github.com/apache/spark-connect-go/pull/10#issuecomment-1579678013
@hiboyang mind creating a JIRA
(https://issues.apache.org/jira/projects/SPARK)?
amaliujia commented on code in PR #41475:
URL: https://github.com/apache/spark/pull/41475#discussion_r1220673560
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala:
##
@@ -1064,6 +1066,31 @@ trait CheckAnalysis extends PredicateHelper with
zhengruifeng commented on PR #41462:
URL: https://github.com/apache/spark/pull/41462#issuecomment-1579723164
@xinrong-meng thank you! merged to master
zhengruifeng commented on PR #41463:
URL: https://github.com/apache/spark/pull/41463#issuecomment-1579759942
merged to master
HyukjinKwon commented on PR #41471:
URL: https://github.com/apache/spark/pull/41471#issuecomment-1579759948
Merged to master.
HyukjinKwon closed pull request #41471: [SPARK-43615][TESTS][PS][CONNECT]
Enable unit test `test_eval`
URL: https://github.com/apache/spark/pull/41471
zhengruifeng closed pull request #41463: [SPARK-43930][SQL][PYTHON][CONNECT]
Add unix_* functions to Scala and Python
URL: https://github.com/apache/spark/pull/41463
zhengruifeng commented on code in PR #41435:
URL: https://github.com/apache/spark/pull/41435#discussion_r1220725034
##
python/docs/source/reference/pyspark.sql/functions.rst:
##
@@ -64,31 +64,39 @@ Math Functions
bin
cbrt
ceil
+ceiling
conv
cos
HyukjinKwon commented on code in PR #41357:
URL: https://github.com/apache/spark/pull/41357#discussion_r1220726011
##
python/pyspark/sql/connect/session.py:
##
@@ -625,6 +625,26 @@ def addArtifacts(self, *path: str, pyfile: bool = False,
archive: bool = False)
Hisoka-X commented on code in PR #40908:
URL: https://github.com/apache/spark/pull/40908#discussion_r1219823786
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/statements.scala:
##
@@ -165,19 +165,25 @@ case class QualifiedColType(
*
LuciferYang commented on code in PR #41253:
URL: https://github.com/apache/spark/pull/41253#discussion_r1200149314
##
.github/workflows/build_and_test.yml:
##
@@ -728,6 +729,68 @@ jobs:
./build/mvn $MAVEN_CLI_OPTS -DskipTests -Pyarn -Pmesos -Pkubernetes
-Pvolcano
dongjoon-hyun commented on PR #41449:
URL: https://github.com/apache/spark/pull/41449#issuecomment-1579157021
```
$ build/sbt "sql/testOnly org.apache.spark.sql.connector.*"
...
[info] Run completed in 5 minutes, 10 seconds.
[info] Total number of tests run: 809
[info] Suites:
dongjoon-hyun closed pull request #41449: [SPARK-43959][SQL][TESTS] Make
RowLevelOperationSuiteBase and AlignAssignmentsSuite abstract
URL: https://github.com/apache/spark/pull/41449
dongjoon-hyun commented on PR #41449:
URL: https://github.com/apache/spark/pull/41449#issuecomment-1579157126
Merged to master.
dongjoon-hyun commented on PR #41484:
URL: https://github.com/apache/spark/pull/41484#issuecomment-1579187914
cc @rednaxelafx , @gengliangwang , @kazuyukitanimura
jdesjean commented on code in PR #41443:
URL: https://github.com/apache/spark/pull/41443#discussion_r1220151181
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/ExecutePlanHolder.scala:
##
@@ -19,6 +19,18 @@ package
dongjoon-hyun commented on PR #41472:
URL: https://github.com/apache/spark/pull/41472#issuecomment-1579096582
Got it. I double-checked and found more usages
```
$ git grep '\.modifiedConfigs' | grep -v test | grep -v sessionState
dongjoon-hyun commented on PR #41449:
URL: https://github.com/apache/spark/pull/41449#issuecomment-1579158403
Thank you, @aokolnychyi , @cloud-fan , @HyukjinKwon !
rangadi commented on PR #41377:
URL: https://github.com/apache/spark/pull/41377#issuecomment-1579192292
@LuciferYang could you merge this?
rangadi commented on PR #41481:
URL: https://github.com/apache/spark/pull/41481#issuecomment-1579191412
LGTM.
@justaparth do you need to update the descriptor file? Btw, with #41377, we
don't need to.
jdesjean commented on code in PR #41443:
URL: https://github.com/apache/spark/pull/41443#discussion_r1220082118
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/Events.scala:
##
@@ -0,0 +1,201 @@
+/*
+ * Licensed to the Apache Software Foundation
dongjoon-hyun commented on PR #41484:
URL: https://github.com/apache/spark/pull/41484#issuecomment-1579260263
Thank you. The following is the manual test result. I'll merge this.
```
[info] StreamingQueryStatusListenerWithDiskStoreSuite:
18:07:13.888 WARN
aokolnychyi commented on PR #41449:
URL: https://github.com/apache/spark/pull/41449#issuecomment-1579308683
Thanks, @dongjoon-hyun @HyukjinKwon @cloud-fan!
justaparth commented on PR #41481:
URL: https://github.com/apache/spark/pull/41481#issuecomment-1578967268
cc @rangadi @SandishKumarHN
hvanhovell commented on code in PR #41482:
URL: https://github.com/apache/spark/pull/41482#discussion_r1219899037
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveUDFs.scala:
##
@@ -63,7 +63,7 @@ private[hive] case class HiveSimpleUDF(
// TODO: Finish input output
dongjoon-hyun closed pull request #41472: [SPARK-43976][CORE] Handle the case
where modifiedConfigs doesn't exist in event logs
URL: https://github.com/apache/spark/pull/41472
LuciferYang commented on code in PR #41483:
URL: https://github.com/apache/spark/pull/41483#discussion_r1220027667
##
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:
##
@@ -955,6 +955,7 @@ class ClientE2ETestSuite extends
shrprasa commented on PR #40128:
URL: https://github.com/apache/spark/pull/40128#issuecomment-1579198508
Gentle Ping @dongjoon-hyun
hvanhovell commented on PR #41425:
URL: https://github.com/apache/spark/pull/41425#issuecomment-1579043028
Merging to master
dongjoon-hyun commented on PR #41472:
URL: https://github.com/apache/spark/pull/41472#issuecomment-1579098191
Let me merge this. I believe this ultimately prevents NPEs in all the
generated cases.
dongjoon-hyun commented on PR #41468:
URL: https://github.com/apache/spark/pull/41468#issuecomment-1579164457
The difference is the following. I'll make a follow-up PR quickly.
- https://github.com/apache/spark/pull/41150
dongjoon-hyun commented on PR #41449:
URL: https://github.com/apache/spark/pull/41449#issuecomment-1579113608
It failed again 14 minutes ago. It seems to fail consistently for some
reason. Let me check the `master` branch status.
[screenshot: CI failure, 2023-06-06 9:45:54]
dongjoon-hyun commented on PR #41449:
URL: https://github.com/apache/spark/pull/41449#issuecomment-1579115501
The `master` branch is the same. Let me check this manually and merge. Thank
you for your patience.
jdesjean commented on code in PR #41443:
URL: https://github.com/apache/spark/pull/41443#discussion_r1220082118
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/Events.scala:
##
@@ -0,0 +1,201 @@
+/*
+ * Licensed to the Apache Software Foundation
justaparth opened a new pull request, #41481:
URL: https://github.com/apache/spark/pull/41481
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
###
LuciferYang opened a new pull request, #41482:
URL: https://github.com/apache/spark/pull/41482
### What changes were proposed in this pull request?
Similar to https://github.com/apache/spark/pull/36720, this PR changes
`map` to `foreach` in Spark code where the result is not used,
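The idiom behind the change can be sketched with a small analogue (Python here as a hypothetical illustration; the PR itself edits Scala, where a comprehension plays the role of `map` and a plain loop the role of `foreach`):

```python
sink = []

# `map`-style: the comprehension allocates a result list that is immediately
# discarded, since only the side effect (appending to sink) is wanted.
discarded = [sink.append(x * 2) for x in range(3)]

# `foreach`-style: a plain loop performs the same side effect without
# building a throwaway collection.
sink2 = []
for x in range(3):
    sink2.append(x * 2)

print(sink == sink2)   # True
print(discarded)       # [None, None, None] -- the wasted allocation
```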
LuciferYang commented on PR #41482:
URL: https://github.com/apache/spark/pull/41482#issuecomment-1579136907
A wrong push, let me revert it
LuciferYang opened a new pull request, #41483:
URL: https://github.com/apache/spark/pull/41483
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
###
jdesjean commented on code in PR #41443:
URL: https://github.com/apache/spark/pull/41443#discussion_r1220082118
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/Events.scala:
##
@@ -0,0 +1,201 @@
+/*
+ * Licensed to the Apache Software Foundation
juliuszsompolski commented on PR #41440:
URL: https://github.com/apache/spark/pull/41440#issuecomment-1579225428
cc @gengliangwang - I also added the job tags to AppStatusTracker.
jdesjean commented on code in PR #41443:
URL: https://github.com/apache/spark/pull/41443#discussion_r1220123393
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/Events.scala:
##
@@ -0,0 +1,201 @@
+/*
+ * Licensed to the Apache Software Foundation
jdesjean commented on code in PR #41443:
URL: https://github.com/apache/spark/pull/41443#discussion_r1220151181
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/ExecutePlanHolder.scala:
##
@@ -19,6 +19,18 @@ package