comphead opened a new issue, #2551:
URL: https://github.com/apache/datafusion-comet/issues/2551

   ### What is the problem the feature request solves?
   
   - [ ] Comet cannot accelerate Round because: Comet does not support Spark's 
BigDecimal rounding
   - [ ] Comet cannot accelerate Concat because: concat is not supported
   - [ ] Comet cannot accelerate HashAggregateExec because: Unsupported result 
expressions found in: Vector(sum(sales_price#4068)#4217 AS sales#4078, 
sum(return_amt#4204)#4219 AS returns#4080, (sum(profit#4069)#4218 - 
sum(net_loss#4205)#4220) AS profit#4056, store channel AS channel#4232, 
concat(store, s_store_id#483) AS id#4233)
   - [ ] Comet cannot accelerate ScalarSubquery because: Unsupported data type: 
StructType(StructField(count(1),LongType,false),StructField(avg(ss_ext_discount_amt),DoubleType,true),StructField(avg(ss_net_profit),DoubleType,true))
   - [ ] Comet cannot accelerate ShuffledHashJoinExec because: Unsupported join 
type ExistenceJoin(exists#5215)
   - [ ] Comet cannot accelerate HashAggregateExec because: Unsupported 
aggregation mode PartialMerge
   - [ ] Comet cannot accelerate LocalTableScanExec because: LocalTableScan is 
not supported
   - [ ] Comet cannot accelerate Cast because: cast(cs_quantity#152 as 
decimal(12,2)) is not fully compatible with Spark (There can be rounding 
differences). To enable it anyway, set 
spark.comet.expression.Cast.allowIncompatible=true, or set 
spark.comet.expression.allowIncompatible=true to enable all incompatible 
expressions. For more information, refer to the Comet Compatibility Guide 
(https://datafusion.apache.org/comet/user-guide/compatibility.html).
   - [ ] Comet cannot accelerate Upper because: Comet is not compatible with 
Spark for case conversion in locale-specific cases. Set 
spark.comet.caseConversion.enabled=true to enable it anyway.
   - [ ] Comet cannot accelerate BroadcastNestedLoopJoinExec because: 
BroadcastNestedLoopJoin is not supported
   - [ ] Comet cannot accelerate CartesianProductExec because: CartesianProduct 
is not supported
   - [ ] Comet cannot accelerate WindowGroupLimitExec because: WindowGroupLimit 
is not supported
   - [ ] Comet cannot accelerate Rank because: rank is not supported
   - [ ] Comet cannot accelerate WindowExec because: Partitioning and sorting 
specifications must be the same.
   - [ ] Comet cannot accelerate WindowExpression because: aggregate 
avg(_w0#18760) is not supported for window function
   - [ ] Comet cannot accelerate AggregateExpression because: 
aggregateexpression is not supported
   - [ ] Comet cannot accelerate Cast because: cast(d_date#22791 as double) is 
not fully compatible with Spark (Does not support inputs ending with 'd' or 
'f'. Does not support 'inf'. Does not support ANSI mode.). To enable it anyway, 
set spark.comet.expression.Cast.allowIncompatible=true, or set 
spark.comet.expression.allowIncompatible=true to enable all incompatible 
expressions. For more information, refer to the Comet Compatibility Guide 
(https://datafusion.apache.org/comet/user-guide/compatibility.html).
   - [ ] Comet cannot accelerate Round because: Comet does not support Spark's 
BigDecimal rounding
   - [ ] Comet cannot accelerate HashAggregateExec because: Unsupported result 
expressions found in: Vector(sum(ss_ext_sales_price#595)#26347 AS sales#26199, 
sum(coalesce(sr_return_amt#551, 0.0))#26350 AS returns#26200, 
sum((ss_net_profit#602 - coalesce(sr_net_loss#559, 0.0)))#26351 AS 
profit#26201, store channel AS channel#26359, concat(store, s_store_id#483) AS 
id#26360)
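
   Several of the messages above name Comet configuration flags that opt in to 
the incompatible behavior (decimal-cast rounding differences, string-to-double 
edge cases, locale-sensitive case conversion). A minimal sketch of setting them 
at submit time, per the referenced Compatibility Guide; the application jar is a 
placeholder, not from this issue:
   
   ```shell
   # Enable Cast expressions Comet flags as not fully Spark-compatible,
   # and locale-sensitive case conversion for Upper/Lower.
   # Alternatively, spark.comet.expression.allowIncompatible=true enables
   # all incompatible expressions at once.
   spark-submit \
     --conf spark.plugins=org.apache.spark.CometPlugin \
     --conf spark.comet.expression.Cast.allowIncompatible=true \
     --conf spark.comet.caseConversion.enabled=true \
     your-app.jar   # placeholder application
   ```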
   
   ### Describe the potential solution
   
   _No response_
   
   ### Additional context
   
   _No response_


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
