baibaichen opened a new issue, #7437:
URL: https://github.com/apache/incubator-gluten/issues/7437
### Backend
CH (ClickHouse)
### Bug description
Creating the `call_center` MergeTree table over an existing Delta location fails with `DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY`. Comparing the two property lists in the error below, the specified properties contain `storage_db=tpcds` and `storage_table=call_center`, which are absent from the existing table's properties, so the property check rejects the CREATE.
```
10/08/2024 03:39:51 AM - INFO - CREATE EXTERNAL TABLE IF NOT EXISTS
call_center
(
cc_call_center_sk INT,
cc_call_center_id string,
cc_rec_start_date DATE,
cc_rec_end_date DATE,
cc_closed_date_sk INT,
cc_open_date_sk INT,
cc_name string,
cc_class string,
cc_employees INT,
cc_sq_ft INT,
cc_hours string,
cc_manager string,
cc_mkt_id INT,
cc_mkt_class string,
cc_mkt_desc string,
cc_market_manager string,
cc_division INT,
cc_division_name string,
cc_company INT,
cc_company_name string,
cc_street_number string,
cc_street_name string,
cc_street_type string,
cc_suite_number string,
cc_city string,
cc_county string,
cc_state string,
cc_zip string,
cc_country string,
cc_gmt_offset DECIMAL(5,2),
cc_tax_percentage DECIMAL(5,2)
)
USING clickhouse
TBLPROPERTIES (storage_policy='s3_main', delta.checkpointInterval=5)
LOCATION
's3a://gluten-cicd/dataset/tpcds-sf100-mergetree-partition/call_center'
10/08/2024 03:39:59 AM - ERROR -
TExecuteStatementResp(status=TStatus(statusCode=3,
infoMessages=['*org.apache.hive.service.cli.HiveSQLException:Error running
query: [DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY]
org.apache.spark.sql.delta.DeltaAnalysisException:
[DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY] The specified properties do not
match the existing properties at
s3a://gluten-cicd/dataset/tpcds-sf100-mergetree-partition/call_center.
== Specified ==
delta.checkpointInterval=5
engine=mergetree
external=true
is_distribute=true
location=s3a://gluten-cicd/dataset/tpcds-sf100-mergetree-partition/call_center
metadata_path=_delta_log
owner=root
provider=clickhouse
sampling_key=
storage_db=tpcds
storage_policy=s3_main
storage_table=call_center
== Existing ==
delta.checkpointInterval=5
engine=mergetree
external=true
is_distribute=true
location=s3a://gluten-cicd/dataset/tpcds-sf100-mergetree-partition/call_center
metadata_path=_delta_log
owner=root
provider=clickhouse
sampling_key=
storage_policy=s3_main
:37:36',
'org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$:runningQueryError:HiveThriftServerErrors.scala:43',
'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute:SparkExecuteStatementOperation.scala:262',
'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:runInternal:SparkExecuteStatementOperation.scala:152',
'org.apache.hive.service.cli.operation.Operation:run:Operation.java:277',
'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:org$apache$spark$sql$hive$thriftserver$SparkOperation$$super$run:SparkExecuteStatementOperation.scala:41',
'org.apache.spark.sql.hive.thriftserver.SparkOperation:$anonfun$run$1:SparkOperation.scala:45',
'scala.runtime.java8.JFunction0$mcV$sp:apply:JFunction0$mcV$sp.java:23',
'org.apache.spark.sql.hive.thriftserver.SparkOperation:withLocalProperties:SparkOperation.scala:79',
'org.apache.spark.sql.hive.thriftserver.SparkOperation:withLocalProperties$:SparkOperation.scala:63',
'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:withLocalProperties:SparkExecuteStatementOperation.scala:41',
'org.apache.spark.sql.hive.thriftserver.SparkOperation:run:SparkOperation.scala:45',
'org.apache.spark.sql.hive.thriftserver.SparkOperation:run$:SparkOperation.scala:43',
'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:run:SparkExecuteStatementOperation.scala:41',
'org.apache.hive.service.cli.session.HiveSessionImpl:executeStatementInternal:HiveSessionImpl.java:484',
'org.apache.hive.service.cli.session.HiveSessionImpl:executeStatement:HiveSessionImpl.java:460',
'sun.reflect.NativeMethodAccessorImpl:invoke0:NativeMethodAccessorImpl.java:-2',
'sun.reflect.NativeMethodAccessorImpl:invoke:NativeMethodAccessorImpl.java:62',
'sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43',
'java.lang.reflect.Method:invoke:Method.java:498',
'org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:71',
'org.apache.hive.service.cli.session.HiveSessionProxy:lambda$invoke$0:HiveSessionProxy.java:58',
'java.security.AccessController:doPrivileged:AccessController.java:-2',
'javax.security.auth.Subject:doAs:Subject.java:422',
'org.apache.hadoop.security.UserGroupInformation:doAs:UserGroupInformation.java:1878',
'org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:58',
'com.sun.proxy.$Proxy36:executeStatement::-1',
'org.apache.hive.service.cli.CLIService:executeStatement:CLIService.java:282',
'org.apache.hive.service.cli.thrift.ThriftCLIService:ExecuteStatement:ThriftCLIService.java:453',
'org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement:getResult:TCLIService.java:1557',
'org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement:getResult:TCLIService.java:1542',
'org.apache.thrift.ProcessFunction:process:ProcessFunction.java:38',
'org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39',
'org.apache.hive.service.auth.TSetIpAddressProcessor:process:TSetIpAddressProcessor.java:52',
'org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:310',
'java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1149',
'java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:624',
'java.lang.Thread:run:Thread.java:750',
'*org.apache.spark.sql.delta.DeltaAnalysisException:[DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY]
The specified properties do not match the existing properties at
s3a://gluten-cicd/dataset/tpcds-sf100-mergetree-partition/call_center.
== Specified ==
delta.checkpointInterval=5
engine=mergetree
external=true
is_distribute=true
location=s3a://gluten-cicd/dataset/tpcds-sf100-mergetree-partition/call_center
metadata_path=_delta_log
owner=root
provider=clickhouse
sampling_key=
storage_db=tpcds
storage_policy=s3_main
storage_table=call_center
== Existing ==
delta.checkpointInterval=5
engine=mergetree
external=true
is_distribute=true
location=s3a://gluten-cicd/dataset/tpcds-sf100-mergetree-partition/call_center
metadata_path=_delta_log
owner=root
provider=clickhouse
sampling_key=
storage_policy=s3_main
:93:57',
'org.apache.spark.sql.delta.DeltaErrorsBase:createTableWithDifferentPropertiesException:DeltaErrors.scala:1306',
'org.apache.spark.sql.delta.DeltaErrorsBase:createTableWithDifferentPropertiesException$:DeltaErrors.scala:1302',
'org.apache.spark.sql.delta.DeltaErrors$:createTableWithDifferentPropertiesException:DeltaErrors.scala:3382',
'org.apache.spark.sql.delta.commands.CreateDeltaTableCommand:verifyTableMetadata:CreateDeltaTableCommand.scala:494',
'org.apache.spark.sql.delta.commands.CreateDeltaTableCommand:createActionsForNewTableOrVerify$1:CreateDeltaTableCommand.scala:342',
'org.apache.spark.sql.delta.commands.CreateDeltaTableCommand:handleCreateTable:CreateDeltaTableCommand.scala:351',
'org.apache.spark.sql.delta.commands.CreateDeltaTableCommand:handleCommit:CreateDeltaTableCommand.scala:170',
'org.apache.spark.sql.delta.commands.CreateDeltaTableCommand:$anonfun$run$2:CreateDeltaTableCommand.scala:110',
'org.apache.spark.sql.delta.metering.DeltaLogging:recordFrameProfile:DeltaLogging.scala:168',
'org.apache.spark.sql.delta.metering.DeltaLogging:recordFrameProfile$:DeltaLogging.scala:166',
'org.apache.spark.sql.delta.commands.CreateDeltaTableCommand:recordFrameProfile:CreateDeltaTableCommand.scala:57',
'org.apache.spark.sql.delta.metering.DeltaLogging:$anonfun$recordDeltaOperationInternal$1:DeltaLogging.scala:136',
'com.databricks.spark.util.DatabricksLogging:recordOperation:DatabricksLogging.scala:128',
'com.databricks.spark.util.DatabricksLogging:recordOperation$:DatabricksLogging.scala:117',
'org.apache.spark.sql.delta.commands.CreateDeltaTableCommand:recordOperation:CreateDeltaTableCommand.scala:57',
'org.apache.spark.sql.delta.metering.DeltaLogging:recordDeltaOperationInternal:DeltaLogging.scala:135',
'org.apache.spark.sql.delta.metering.DeltaLogging:recordDeltaOperation:DeltaLogging.scala:125',
'org.apache.spark.sql.delta.metering.DeltaLogging:recordDeltaOperation$:DeltaLogging.scala:115',
'org.apache.spark.sql.delta.commands.CreateDeltaTableCommand:recordDeltaOperation:CreateDeltaTableCommand.scala:57',
'org.apache.spark.sql.delta.commands.CreateDeltaTableCommand:run:CreateDeltaTableCommand.scala:110',
'org.apache.spark.sql.execution.datasources.v2.clickhouse.ClickHouseSparkCatalog:org$apache$spark$sql$execution$datasources$v2$clickhouse$ClickHouseSparkCatalog$$createClickHouseTable:ClickHouseSparkCatalog.scala:203',
'org.apache.spark.sql.execution.datasources.v2.clickhouse.ClickHouseSparkCatalog:createTable:ClickHouseSparkCatalog.scala:95',
'org.apache.spark.sql.execution.datasources.v2.clickhouse.ClickHouseSparkCatalog:createTable:ClickHouseSparkCatalog.scala:79',
'org.apache.spark.sql.execution.datasources.v2.CreateTableExec:run:CreateTableExec.scala:44',
'org.apache.spark.sql.execution.datasources.v2.V2CommandExec:result$lzycompute:V2CommandExec.scala:43',
'org.apache.spark.sql.execution.datasources.v2.V2CommandExec:result:V2CommandExec.scala:43',
'org.apache.spark.sql.execution.datasources.v2.V2CommandExec:executeCollect:V2CommandExec.scala:49',
'org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1:$anonfun$applyOrElse$1:QueryExecution.scala:107',
'org.apache.spark.sql.execution.SQLExecution$:$anonfun$withNewExecutionId$6:SQLExecution.scala:125',
'org.apache.spark.sql.execution.SQLExecution$:withSQLConfPropagated:SQLExecution.scala:201',
'org.apache.spark.sql.execution.SQLExecution$:$anonfun$withNewExecutionId$1:SQLExecution.scala:108',
'org.apache.spark.sql.SparkSession:withActive:SparkSession.scala:900',
'org.apache.spark.sql.execution.SQLExecution$:withNewExecutionId:SQLExecution.scala:66',
'org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1:applyOrElse:QueryExecution.scala:107',
'org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1:applyOrElse:QueryExecution.scala:98',
'org.apache.spark.sql.catalyst.trees.TreeNode:$anonfun$transformDownWithPruning$1:TreeNode.scala:461',
'org.apache.spark.sql.catalyst.trees.CurrentOrigin$:withOrigin:origin.scala:76',
'org.apache.spark.sql.catalyst.trees.TreeNode:transformDownWithPruning:TreeNode.scala:461',
'org.apache.spark.sql.catalyst.plans.logical.LogicalPlan:org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning:LogicalPlan.scala:32',
'org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper:transformDownWithPruning:AnalysisHelper.scala:267',
'org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper:transformDownWithPruning$:AnalysisHelper.scala:263',
'org.apache.spark.sql.catalyst.plans.logical.LogicalPlan:transformDownWithPruning:LogicalPlan.scala:32',
'org.apache.spark.sql.catalyst.plans.logical.LogicalPlan:transformDownWithPruning:LogicalPlan.scala:32',
'org.apache.spark.sql.catalyst.trees.TreeNode:transformDown:TreeNode.scala:437',
'org.apache.spark.sql.execution.QueryExecution:eagerlyExecuteCommands:QueryExecution.scala:98',
'org.apache.spark.sql.execution.QueryExecution:commandExecuted$lzycompute:QueryExecution.scala:85',
'org.apache.spark.sql.execution.QueryExecution:commandExecuted:QueryExecution.scala:83',
'org.apache.spark.sql.Dataset:<init>:Dataset.scala:220',
'org.apache.spark.sql.Dataset$:$anonfun$ofRows$2:Dataset.scala:100',
'org.apache.spark.sql.SparkSession:withActive:SparkSession.scala:900',
'org.apache.spark.sql.Dataset$:ofRows:Dataset.scala:97',
'org.apache.spark.sql.SparkSession:$anonfun$sql$4:SparkSession.scala:691',
'org.apache.spark.sql.SparkSession:withActive:SparkSession.scala:900',
'org.apache.spark.sql.SparkSession:sql:SparkSession.scala:682',
'org.apache.spark.sql.SparkSession:sql:SparkSession.scala:713',
'org.apache.spark.sql.SparkSession:sql:SparkSession.scala:744',
'org.apache.spark.sql.SQLContext:sql:SQLContext.scala:651',
'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute:SparkExecuteStatementOperation.scala:227'],
sqlState='42KD7', errorCode=0,
errorMessage='Error running query: [DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY]
org.apache.spark.sql.delta.DeltaAnalysisException:
[DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY] The specified properties do not
match the existing properties at
s3a://gluten-cicd/dataset/tpcds-sf100-mergetree-partition/call_center.
== Specified ==
delta.checkpointInterval=5
engine=mergetree
external=true
is_distribute=true
location=s3a://gluten-cicd/dataset/tpcds-sf100-mergetree-partition/call_center
metadata_path=_delta_log
owner=root
provider=clickhouse
sampling_key=
storage_db=tpcds
storage_policy=s3_main
storage_table=call_center
== Existing ==
delta.checkpointInterval=5
engine=mergetree
external=true
is_distribute=true
location=s3a://gluten-cicd/dataset/tpcds-sf100-mergetree-partition/call_center
metadata_path=_delta_log
owner=root
provider=clickhouse
sampling_key=
storage_policy=s3_main
'), operationHandle=None)
[2024-10-08T03:39:59.915Z] 10/08/2024 03:39:59 AM - INFO - drop table if exists catalog_page
```
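The failure mode above boils down to an exact-match comparison of two property maps. A minimal sketch of that check (plain Python, not Delta's actual `verifyTableMetadata` implementation; the property values are taken from the log, assuming the unlisted ones match on both sides):

```python
# Sketch of the property comparison behind
# DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY: the maps must match key
# for key, and here the specified map carries two keys the existing
# table metadata lacks.

specified = {
    "delta.checkpointInterval": "5",
    "engine": "mergetree",
    "storage_policy": "s3_main",
    "storage_db": "tpcds",           # only in the specified properties
    "storage_table": "call_center",  # only in the specified properties
}

existing = {
    "delta.checkpointInterval": "5",
    "engine": "mergetree",
    "storage_policy": "s3_main",
}

def property_diff(specified, existing):
    """Return keys whose values differ or that exist on only one side."""
    keys = set(specified) | set(existing)
    return sorted(k for k in keys if specified.get(k) != existing.get(k))

diff = property_diff(specified, existing)
if diff:
    print("properties do not match:", diff)
```

Running this prints the two mismatched keys, which matches the `== Specified ==` vs `== Existing ==` blocks in the error: `storage_db` and `storage_table` appear only on the specified side.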
### Spark version
None
### Spark configurations
_No response_
### System information
_No response_
### Relevant logs
_No response_
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]