[ https://issues.apache.org/jira/browse/SPARK-39671?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17600541#comment-17600541 ]

Iqbal Singh commented on SPARK-39671:
-------------------------------------

Is there a way to reproduce it, or is this specific to the Cloudera 
distribution only?
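
For reference, a minimal reproduction attempt against a Spark Thrift Server might look like the sketch below. The host, port, and table names are placeholders, and it assumes PyHive (which appears in the reporter's traceback) plus an existing partitioned target table and a source table on the affected CDH 6.1.1 / Spark 3.0.1 setup:

    from pyhive import hive

    # Placeholder connection details for the Spark Thrift Server started from
    # the Spark 3.0.1 (hadoop3) build running on the CDH 6.1.1 cluster.
    conn = hive.connect(host="spark-thrift-host", port=10000)
    cursor = conn.cursor()

    # The kind of statement described in the report; on an affected setup this
    # is expected to fail with NoSuchMethodException for Hive.loadPartition.
    cursor.execute(
        "INSERT OVERWRITE TABLE target_db.target_table PARTITION (dt='2022-06-30') "
        "SELECT * FROM source_db.source_table"
    )

    cursor.close()
    conn.close()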

 

> insert overwrite table throws java.lang.NoSuchMethodException: 
> org.apache.hadoop.hive.ql.metadata.Hive.loadPartition. This problem occurred 
> when we installed Apache Spark 3.0.1-hadoop3.0 on CDH 6.1.1  
> ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-39671
>                 URL: https://issues.apache.org/jira/browse/SPARK-39671
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.1
>            Reporter: xin
>            Priority: Major
>
> Using the Spark Thrift Server, run this SQL: insert overwrite table xx.xx 
> partition(dt=2022-06-30) select * from xxx.xxx;  The SQL execution 
> environment is CDH 6.1.1 with Hive version 2.1.1.
>  
>  
>  raise OperationalError(response)
> pyhive.exc.OperationalError: TExecuteStatementResp(status=TStatus(statusCode=3, infoMessages=[
> '*org.apache.hive.service.cli.HiveSQLException:Error running query: java.lang.NoSuchMethodException: org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(org.apache.hadoop.fs.Path, java.lang.String, java.util.Map, boolean, boolean, boolean, boolean, boolean, boolean):25:24',
> 'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute:SparkExecuteStatementOperation.scala:321',
> 'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:runInternal:SparkExecuteStatementOperation.scala:202',
> 'org.apache.hive.service.cli.operation.Operation:run:Operation.java:278',
> 'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:org$apache$spark$sql$hive$thriftserver$SparkOperation$$super$run:SparkExecuteStatementOperation.scala:46',
> 'org.apache.spark.sql.hive.thriftserver.SparkOperation:$anonfun$run$1:SparkOperation.scala:44',
> 'scala.runtime.java8.JFunction0$mcV$sp:apply:JFunction0$mcV$sp.java:23',
> 'org.apache.spark.sql.hive.thriftserver.SparkOperation:withLocalProperties:SparkOperation.scala:78',
> 'org.apache.spark.sql.hive.thriftserver.SparkOperation:withLocalProperties$:SparkOperation.scala:62',
> 'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:withLocalProperties:SparkExecuteStatementOperation.scala:46',
> 'org.apache.spark.sql.hive.thriftserver.SparkOperation:run:SparkOperation.scala:44',
> 'org.apache.spark.sql.hive.thriftserver.SparkOperation:run$:SparkOperation.scala:42',
> 'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:run:SparkExecuteStatementOperation.scala:46',
> 'org.apache.hive.service.cli.session.HiveSessionImpl:executeStatementInternal:HiveSessionImpl.java:484',
> 'org.apache.hive.service.cli.session.HiveSessionImpl:executeStatement:HiveSessionImpl.java:460',
> 'org.apache.hive.service.cli.CLIService:executeStatement:CLIService.java:280',
> 'org.apache.hive.service.cli.thrift.ThriftCLIService:ExecuteStatement:ThriftCLIService.java:439',
> 'org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement:getResult:TCLIService.java:1437',
> 'org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement:getResult:TCLIService.java:1422',
> 'org.apache.thrift.ProcessFunction:process:ProcessFunction.java:38',
> 'org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39',
> 'org.apache.hive.service.auth.TSetIpAddressProcessor:process:TSetIpAddressProcessor.java:53',
> 'org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:310',
> 'java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1149',
> 'java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:624',
> 'java.lang.Thread:run:Thread.java:748',
> '*java.lang.NoSuchMethodException:org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(org.apache.hadoop.fs.Path, java.lang.String, java.util.Map, boolean, boolean, boolean, boolean, boolean, boolean):63:38',
> 'java.lang.Class:getMethod:Class.java:1786',
> 'org.apache.spark.sql.hive.client.Shim:findMethod:HiveShim.scala:177',
> 'org.apache.spark.sql.hive.client.Shim_v2_1:loadPartitionMethod$lzycompute:HiveShim.scala:1151',
> 'org.apache.spark.sql.hive.client.Shim_v2_1:loadPartitionMethod:HiveShim.scala:1139',
> 'org.apache.spark.sql.hive.client.Shim_v2_1:loadPartition:HiveShim.scala:1201',
> 'org.apache.spark.sql.hive.client.HiveClientImpl:$anonfun$loadPartition$1:HiveClientImpl.scala:872',
> 'scala.runtime.java8.JFunction0$mcV$sp:apply:JFunction0$mcV$sp.java:23',
> 'org.apache.spark.sql.hive.client.HiveClientImpl:$anonfun$withHiveState$1:HiveClientImpl.scala:294',
> 'org.apache.spark.sql.hive.client.HiveClientImpl:liftedTree1$1:HiveClientImpl.scala:227',
> 'org.apache.spark.sql.hive.client.HiveClientImpl:retryLocked:HiveClientImpl.scala:226',
> 'org.apache.spark.sql.hive.client.HiveClientImpl:withHiveState:HiveClientImpl.scala:276',
> 'org.apache.spark.sql.hive.client.HiveClientImpl:loadPartition:HiveClientImpl.scala:862',
> 'org.apache.spark.sql.hive.HiveExternalCatalog:$anonfun$loadPartition$1:HiveExternalCatalog.scala:915',
> 'scala.runtime.java8.JFunction0$mcV$sp:apply:JFunction0$mcV$sp.java:23',
> 'org.apache.spark.sql.hive.HiveExternalCatalog:withClient:HiveExternalCatalog.scala:103',
> 'org.apache.spark.sql.hive.HiveExternalCatalog:loadPartition:HiveExternalCatalog.scala:894',
> 'org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener:loadPartition:ExternalCatalogWithListener.scala:179',
> 'org.apache.spark.sql.hive.execution.InsertIntoHiveTable:processInsert:InsertIntoHiveTable.scala:316',
> 'org.apache.spark.sql.hive.execution.InsertIntoHiveTable:run:InsertIntoHiveTable.scala:102',
> 'org.apache.spark.sql.execution.command.DataWritingCommandExec:sideEffectResult$lzycompute:commands.scala:108',
> 'org.apache.spark.sql.execution.command.DataWritingCommandExec:sideEffectResult:commands.scala:106',
> 'org.apache.spark.sql.execution.command.DataWritingCommandExec:executeCollect:commands.scala:120',
> 'org.apache.spark.sql.Dataset:$anonfun$logicalPlan$1:Dataset.scala:229',
> 'org.apache.spark.sql.Dataset:$anonfun$withAction$1:Dataset.scala:3618',
> 'org.apache.spark.sql.execution.SQLExecution$:$anonfun$withNewExecutionId$5:SQLExecution.scala:100',
> 'org.apache.spark.sql.execution.SQLExecution$:withSQLConfPropagated:SQLExecution.scala:160',
> 'org.apache.spark.sql.execution.SQLExecution$:$anonfun$withNewExecutionId$1:SQLExecution.scala:87',
> 'org.apache.spark.sql.SparkSession:withActive:SparkSession.scala:764',
> 'org.apache.spark.sql.execution.SQLExecution$:withNewExecutionId:SQLExecution.scala:64',
> 'org.apache.spark.sql.Dataset:withAction:Dataset.scala:3616',
> 'org.apache.spark.sql.Dataset:<init>:Dataset.scala:229',
> 'org.apache.spark.sql.Dataset$:$anonfun$ofRows$2:Dataset.scala:100',
> 'org.apache.spark.sql.SparkSession:withActive:SparkSession.scala:764',
> 'org.apache.spark.sql.Dataset$:ofRows:Dataset.scala:97',
> 'org.apache.spark.sql.SparkSession:$anonfun$sql$1:SparkSession.scala:607',
> 'org.apache.spark.sql.SparkSession:withActive:SparkSession.scala:764',
> 'org.apache.spark.sql.SparkSession:sql:SparkSession.scala:602',
> 'org.apache.spark.sql.SQLContext:sql:SQLContext.scala:650',
> 'org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation:org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute:SparkExecuteStatementOperation.scala:280'],
> sqlState=None, errorCode=0, errorMessage='Error running query: java.lang.NoSuchMethodException: org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(org.apache.hadoop.fs.Path, java.lang.String, java.util.Map, boolean, boolean, boolean, boolean, boolean, boolean)'), operationHandle=None)
>   


