cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885232378
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -575,7 +707,31 @@ case class Divide(
override def symbol: String =
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885231906
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -373,11 +457,24 @@ case class Subtract(
override def
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885265232
##
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala:
##
@@ -109,6 +116,13 @@ private[spark] class TaskSetManager(
private val
sunchao commented on PR #36721:
URL: https://github.com/apache/spark/pull/36721#issuecomment-1141704120
LGTM too, thanks @LuciferYang !
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885252781
##
core/src/main/scala/org/apache/spark/internal/config/package.scala:
##
@@ -2073,6 +2073,41 @@ package object config {
.timeConf(TimeUnit.MILLISECONDS)
weixiuli commented on code in PR #36724:
URL: https://github.com/apache/spark/pull/36724#discussion_r885224233
##
sql/core/src/main/scala/org/apache/spark/sql/execution/dynamicpruning/CleanupDynamicPruningFilters.scala:
##
@@ -54,7 +54,8 @@ object CleanupDynamicPruningFilters
manuzhang opened a new pull request, #36733:
URL: https://github.com/apache/spark/pull/36733
### What changes were proposed in this pull request?
Currently, bucketed scan is disabled if the bucket columns are not in the scan
output. This PR proposes to move the check into
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885230648
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -521,6 +651,7 @@ trait DivModLike extends BinaryArithmetic {
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885264474
##
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala:
##
@@ -80,12 +82,17 @@ private[spark] class TaskSetManager(
val copiesRunning = new
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885315379
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,76 @@ case class Abs(child: Expression, failOnError:
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885412953
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,78 @@ case class Abs(child: Expression, failOnError:
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885231224
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -373,11 +457,24 @@ case class Subtract(
override def
LuciferYang opened a new pull request, #36732:
URL: https://github.com/apache/spark/pull/36732
### What changes were proposed in this pull request?
This PR replaces `filter(!condition)` with `filterNot(condition)`.
### Why are the changes needed?
Use the appropriate API.
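For context, `filterNot` is the standard Scala collections counterpart of filtering with a negated predicate; a minimal sketch of the refactor (hypothetical names, not code from the PR diff):

```scala
// Minimal sketch of the filter(!p) -> filterNot(p) refactor.
// `xs` and `isEven` are hypothetical names, not from the PR.
object FilterNotExample extends App {
  val xs = Seq(1, 2, 3, 4, 5)
  val isEven = (n: Int) => n % 2 == 0

  val before = xs.filter(n => !isEven(n)) // double negation, harder to read
  val after  = xs.filterNot(isEven)       // same result, clearer intent

  assert(before == after)                 // both keep the odd elements
}
```

Both forms produce the same collection; the change is purely about readability.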
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885249138
##
core/src/main/scala/org/apache/spark/internal/config/package.scala:
##
@@ -2073,6 +2073,41 @@ package object config {
.timeConf(TimeUnit.MILLISECONDS)
wangyum commented on PR #36410:
URL: https://github.com/apache/spark/pull/36410#issuecomment-1141743095
cc @cloud-fan
mridulm commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885279237
##
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala:
##
@@ -1217,6 +1289,61 @@ private[spark] class TaskSetManager(
def executorAdded(): Unit = {
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885314882
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,76 @@ case class Abs(child: Expression, failOnError:
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885314377
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,76 @@ case class Abs(child: Expression, failOnError:
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885314663
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -373,11 +457,24 @@ case class Subtract(
override def
sunchao commented on code in PR #36697:
URL: https://github.com/apache/spark/pull/36697#discussion_r885234555
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/V2ScanPartitioning.scala:
##
@@ -32,15 +32,15 @@ import
sadikovi commented on code in PR #36726:
URL: https://github.com/apache/spark/pull/36726#discussion_r885267600
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala:
##
@@ -472,6 +481,15 @@ object JdbcUtils extends Logging with SQLConfHelper
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885232931
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/decimalExpressions.scala:
##
@@ -232,3 +216,33 @@ case class CheckOverflowInSum(
override
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885247661
##
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala:
##
@@ -1217,6 +1289,61 @@ private[spark] class TaskSetManager(
def executorAdded(): Unit = {
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885262650
##
core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala:
##
@@ -103,6 +104,9 @@ private[spark] class TaskSchedulerImpl(
// of tasks that are very
sadikovi commented on PR #36726:
URL: https://github.com/apache/spark/pull/36726#issuecomment-1141738160
@beliefer Can you review this PR from a JDBC perspective? I think you have
contributed extensively to this part of the code. Also, cc @gengliangwang.
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885317774
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -521,6 +651,7 @@ trait DivModLike extends BinaryArithmetic {
Ngone51 commented on code in PR #36716:
URL: https://github.com/apache/spark/pull/36716#discussion_r885377381
##
core/src/main/scala/org/apache/spark/deploy/master/ui/ApplicationPage.scala:
##
@@ -43,8 +43,8 @@ private[ui] class ApplicationPage(parent: MasterWebUI)
extends
wangyum commented on code in PR #36724:
URL: https://github.com/apache/spark/pull/36724#discussion_r885400421
##
sql/core/src/main/scala/org/apache/spark/sql/execution/dynamicpruning/CleanupDynamicPruningFilters.scala:
##
@@ -54,7 +54,8 @@ object CleanupDynamicPruningFilters
LuciferYang commented on code in PR #36732:
URL: https://github.com/apache/spark/pull/36732#discussion_r885245947
##
sql/core/src/main/scala/org/apache/spark/sql/execution/ui/SQLAppStatusListener.scala:
##
@@ -413,7 +413,7 @@ class SQLAppStatusListener(
if (other !=
AngersZh commented on PR #36730:
URL: https://github.com/apache/spark/pull/36730#issuecomment-1141762594
ping @cloud-fan
AngersZh commented on PR #36731:
URL: https://github.com/apache/spark/pull/36731#issuecomment-1141762954
ping @cloud-fan
Ngone51 commented on code in PR #36716:
URL: https://github.com/apache/spark/pull/36716#discussion_r885297634
##
core/src/main/scala/org/apache/spark/deploy/ExecutorDescription.scala:
##
@@ -25,10 +25,13 @@ package org.apache.spark.deploy
private[deploy] class
Ngone51 commented on code in PR #36716:
URL: https://github.com/apache/spark/pull/36716#discussion_r885313120
##
core/src/main/scala/org/apache/spark/deploy/master/ApplicationInfo.scala:
##
@@ -65,7 +66,70 @@ private[spark] class ApplicationInfo(
appSource = new
Ngone51 commented on code in PR #36716:
URL: https://github.com/apache/spark/pull/36716#discussion_r885358948
##
core/src/main/scala/org/apache/spark/deploy/master/ResourceDescription.scala:
##
@@ -0,0 +1,32 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
HyukjinKwon commented on code in PR #36683:
URL: https://github.com/apache/spark/pull/36683#discussion_r885367253
##
python/pyspark/sql/pandas/conversion.py:
##
@@ -596,7 +596,7 @@ def _create_from_pandas_with_arrow(
]
# Slice the DataFrame to be batched
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885645567
##
core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala:
##
@@ -853,8 +857,11 @@ private[spark] class TaskSchedulerImpl(
// (taskId, stageId,
ivoson commented on code in PR #36716:
URL: https://github.com/apache/spark/pull/36716#discussion_r885758451
##
core/src/main/scala/org/apache/spark/deploy/master/ResourceDescription.scala:
##
@@ -0,0 +1,32 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885779230
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -490,12 +622,27 @@ trait DivModLike extends BinaryArithmetic {
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885784544
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -778,16 +1002,24 @@ case class Pmod(
val javaType =
cloud-fan closed pull request #36730: [SPARK-39342][SQL]
ShowTablePropertiesCommand/ShowTablePropertiesExec should redact properties.
URL: https://github.com/apache/spark/pull/36730
cloud-fan commented on PR #36730:
URL: https://github.com/apache/spark/pull/36730#issuecomment-1142300520
thanks, merging to master!
1104056452 commented on PR #36447:
URL: https://github.com/apache/spark/pull/36447#issuecomment-1142132713
cc @Ngone51 @jiangxb1987 @xuanyuanking, could you please help review this
PR? Thanks.
cloud-fan commented on code in PR #36704:
URL: https://github.com/apache/spark/pull/36704#discussion_r885671871
##
connector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaMicroBatchSourceSuite.scala:
##
@@ -1401,6 +1408,14 @@ class
cloud-fan commented on code in PR #36704:
URL: https://github.com/apache/spark/pull/36704#discussion_r885677872
##
sql/core/src/main/scala/org/apache/spark/sql/execution/QueryExecution.scala:
##
@@ -486,4 +489,22 @@ object QueryExecution {
val preparationRules =
cloud-fan commented on PR #36689:
URL: https://github.com/apache/spark/pull/36689#issuecomment-1142175473
thanks for the review, merging to master!
gongzh021 commented on PR #33457:
URL: https://github.com/apache/spark/pull/33457#issuecomment-1142177922
> @AngersZh Before this change, the status code `500` is returned and a
helpful error message is shown if we access `/jobs` before the UI is
prepared.
cloud-fan commented on PR #36727:
URL: https://github.com/apache/spark/pull/36727#issuecomment-1142194033
The GA job says `org.apache.spark.sql.TPCDSV1_4_PlanStabilitySuite` failed,
but I can't reproduce it locally and this PR definitely won't affect TPCDS
queries. The GA job also says
cloud-fan closed pull request #36727: [SPARK-39340][SQL][3.2] DS v2 agg
pushdown should allow dots in the name of top-level columns
URL: https://github.com/apache/spark/pull/36727
Ngone51 commented on PR #36447:
URL: https://github.com/apache/spark/pull/36447#issuecomment-1142218019
cc @pingsutw @HyukjinKwon
Ngone51 commented on code in PR #36665:
URL: https://github.com/apache/spark/pull/36665#discussion_r885746852
##
core/src/main/scala/org/apache/spark/scheduler/TaskResultGetter.scala:
##
@@ -102,6 +102,10 @@ private[spark] class TaskResultGetter(sparkEnv: SparkEnv,
scheduler:
cloud-fan commented on code in PR #36704:
URL: https://github.com/apache/spark/pull/36704#discussion_r885675714
##
connector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaMicroBatchSourceSuite.scala:
##
@@ -1376,6 +1376,13 @@ class
cloud-fan commented on code in PR #36704:
URL: https://github.com/apache/spark/pull/36704#discussion_r885674491
##
sql/core/src/main/scala/org/apache/spark/sql/execution/QueryExecution.scala:
##
@@ -486,4 +489,22 @@ object QueryExecution {
val preparationRules =
ivoson commented on code in PR #36716:
URL: https://github.com/apache/spark/pull/36716#discussion_r885759220
##
core/src/main/scala/org/apache/spark/deploy/master/ui/ApplicationPage.scala:
##
@@ -43,8 +43,8 @@ private[ui] class ApplicationPage(parent: MasterWebUI)
extends
cloud-fan closed pull request #36689: [SPARK-39306][SQL] Support scalar
subquery in time travel
URL: https://github.com/apache/spark/pull/36689
AngersZh commented on PR #33457:
URL: https://github.com/apache/spark/pull/33457#issuecomment-1142188571
@gongzh021 Maybe you can check this commit
https://github.com/apache/spark/pull/33457/commits/dba26cd5bd1aaacb01e08cfcfef9f02ffe96d018
LuciferYang commented on PR #36732:
URL: https://github.com/apache/spark/pull/36732#issuecomment-1142188003
cc @wangyum
ivoson commented on code in PR #36716:
URL: https://github.com/apache/spark/pull/36716#discussion_r885722489
##
core/src/main/scala/org/apache/spark/deploy/ExecutorDescription.scala:
##
@@ -25,10 +25,13 @@ package org.apache.spark.deploy
private[deploy] class
ivoson commented on code in PR #36716:
URL: https://github.com/apache/spark/pull/36716#discussion_r885749761
##
core/src/main/scala/org/apache/spark/deploy/master/ApplicationInfo.scala:
##
@@ -65,7 +66,70 @@ private[spark] class ApplicationInfo(
appSource = new
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885776567
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,79 @@ case class Abs(child: Expression, failOnError:
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885774715
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,79 @@ case class Abs(child: Expression, failOnError:
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885782952
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -778,16 +1002,24 @@ case class Pmod(
val javaType =
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885787249
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/decimalExpressions.scala:
##
@@ -232,3 +216,36 @@ case class CheckOverflowInSum(
override
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885787936
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -244,8 +314,7 @@ abstract class BinaryArithmetic extends
cloud-fan closed pull request #36731: [SPARK-39343][SQL] DescribeTableExec
should redact properties
URL: https://github.com/apache/spark/pull/36731
cloud-fan commented on PR #36731:
URL: https://github.com/apache/spark/pull/36731#issuecomment-1142298692
thanks, merging to master!
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885426207
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,76 @@ case class Abs(child: Expression, failOnError:
Ngone51 commented on code in PR #36716:
URL: https://github.com/apache/spark/pull/36716#discussion_r885557272
##
core/src/main/scala/org/apache/spark/deploy/master/ResourceDescription.scala:
##
@@ -0,0 +1,32 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885426684
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,78 @@ case class Abs(child: Expression, failOnError:
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885447945
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -490,10 +621,26 @@ trait DivModLike extends BinaryArithmetic {
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885448733
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,78 @@ case class Abs(child: Expression, failOnError:
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885462208
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -490,10 +621,26 @@ trait DivModLike extends BinaryArithmetic {
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885415814
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -490,10 +621,26 @@ trait DivModLike extends BinaryArithmetic {
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885417861
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -490,10 +621,26 @@ trait DivModLike extends BinaryArithmetic {
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885424345
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -778,16 +999,26 @@ case class Pmod(
val javaType =
MaxGekk commented on PR #36704:
URL: https://github.com/apache/spark/pull/36704#issuecomment-1141914634
cc @srielau
codecov-commenter commented on PR #36726:
URL: https://github.com/apache/spark/pull/36726#issuecomment-1141997915
# [Codecov](https://codecov.io/gh/apache/spark/pull/36726)
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885529640
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,78 @@ case class Abs(child: Expression, failOnError:
manuzhang closed pull request #36615: [SPARK-39238][SQL] Apply
WidenSetOperationTypes at last to fix decimal precision loss
URL: https://github.com/apache/spark/pull/36615
manuzhang commented on PR #36615:
URL: https://github.com/apache/spark/pull/36615#issuecomment-1142042879
Superseded by https://github.com/apache/spark/pull/36698
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885616172
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -490,10 +621,26 @@ trait DivModLike extends BinaryArithmetic {
MaxGekk commented on code in PR #36704:
URL: https://github.com/apache/spark/pull/36704#discussion_r885430682
##
connector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaMicroBatchSourceSuite.scala:
##
@@ -666,10 +669,7 @@ abstract class
wangyum closed pull request #36625: [SPARK-39203][SQL] Rewrite table location
to absolute URI based on database URI
URL: https://github.com/apache/spark/pull/36625
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885416511
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -490,10 +621,26 @@ trait DivModLike extends BinaryArithmetic {
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885450552
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,78 @@ case class Abs(child: Expression, failOnError:
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885457521
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -778,16 +999,26 @@ case class Pmod(
val javaType =
wangyum commented on PR #36625:
URL: https://github.com/apache/spark/pull/36625#issuecomment-1141940310
Merged to master.
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885533013
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,78 @@ case class Abs(child: Expression, failOnError:
akpatnam25 commented on PR #36601:
URL: https://github.com/apache/spark/pull/36601#issuecomment-1142403811
@HyukjinKwon do you know why this PR build is failing? The build is failing
in code that I did not touch and seems to be working for other contributors. I
have already merged in the
ueshin commented on code in PR #36640:
URL: https://github.com/apache/spark/pull/36640#discussion_r885963200
##
python/pyspark/sql/session.py:
##
@@ -611,8 +611,8 @@ def _inferSchema(
:class:`pyspark.sql.types.StructType`
"""
first = rdd.first()
-
dongjoon-hyun commented on PR #36689:
URL: https://github.com/apache/spark/pull/36689#issuecomment-1142334174
Thank you, @cloud-fan and all! +1, LGTM.
ueshin commented on code in PR #36640:
URL: https://github.com/apache/spark/pull/36640#discussion_r885944686
##
python/pyspark/sql/session.py:
##
@@ -611,8 +611,8 @@ def _inferSchema(
:class:`pyspark.sql.types.StructType`
"""
first = rdd.first()
-
MaxGekk commented on code in PR #36704:
URL: https://github.com/apache/spark/pull/36704#discussion_r885944596
##
connector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaMicroBatchSourceSuite.scala:
##
@@ -1376,6 +1376,13 @@ class KafkaMicroBatchV1SourceSuite
MaxGekk commented on code in PR #36704:
URL: https://github.com/apache/spark/pull/36704#discussion_r885936568
##
sql/core/src/main/scala/org/apache/spark/sql/execution/QueryExecution.scala:
##
@@ -486,4 +489,22 @@ object QueryExecution {
val preparationRules =
MaxGekk commented on code in PR #36704:
URL: https://github.com/apache/spark/pull/36704#discussion_r885953095
##
sql/core/src/main/scala/org/apache/spark/sql/execution/QueryExecution.scala:
##
@@ -486,4 +489,22 @@ object QueryExecution {
val preparationRules =
MaxGekk commented on code in PR #36704:
URL: https://github.com/apache/spark/pull/36704#discussion_r885955542
##
connector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaMicroBatchSourceSuite.scala:
##
@@ -666,10 +669,7 @@ abstract class
otterc commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r886137214
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -536,9 +619,11 @@ public MergeStatuses
ueshin commented on code in PR #36640:
URL: https://github.com/apache/spark/pull/36640#discussion_r886035398
##
python/pyspark/sql/session.py:
##
@@ -611,8 +611,8 @@ def _inferSchema(
:class:`pyspark.sql.types.StructType`
"""
first = rdd.first()
-
otterc commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r837698827
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -88,13 +103,28 @@
private static final ByteBuffer