melin opened a new issue #3591:
URL: https://github.com/apache/hudi/issues/3591
Creating a Hudi table fails when Kerberos authentication is enabled (the metastore connection fails SASL negotiation with "Failed to find any Kerberos tgt"); creating a plain ORC table through the same session works fine.
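
For quick reference, these are the two statements from the log below, replayed here via `spark.sql` (a sketch: the originals were submitted through the Spark Thrift Server, and it is an assumption that an ordinary Spark session shows the same split):

```scala
// Plain ORC table: succeeds under Kerberos.
spark.sql("""
  create table bigdata.test_demo_test (
    name string comment '',
    age int comment ''
  )
  STORED AS orc
  TBLPROPERTIES ('orc.compress'='ZSTD', 'fileFormat'='orc')
""")

// Hudi table: fails with "GSS initiate failed ... Failed to find any Kerberos tgt".
spark.sql("""
  create table bigdata.test_hudi_dt (
    id int comment '',
    name string comment '',
    price double comment '',
    ds string comment ''
  ) USING hudi
  OPTIONS (primaryKey = 'id', type = 'cow', 'compression'='ZSTD')
  PARTITIONED BY (ds)
""")
```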
```
21/09/03 12:28:34 INFO SparkExecuteStatementOperation: Submitting query 'create table bigdata.test_demo_test (
name string comment '',
age int comment ''
)
STORED AS orc
TBLPROPERTIES ('orc.compress'='ZSTD', 'fileFormat'='orc')
' with 0a331fa3-cb28-4c60-8ba8-aace33f0e301
21/09/03 12:28:34 INFO SparkExecuteStatementOperation: Running query with 0a331fa3-cb28-4c60-8ba8-aace33f0e301
21/09/03 12:28:35 INFO DAGScheduler: Asked to cancel job group 0a331fa3-cb28-4c60-8ba8-aace33f0e301
21/09/03 12:28:35 INFO SparkExecuteStatementOperation: Close statement with 0a331fa3-cb28-4c60-8ba8-aace33f0e301
21/09/03 12:28:35 INFO ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V7
21/09/03 12:28:35 INFO HiveMetaStore: 1: Shutting down the object store...
21/09/03 12:28:35 INFO audit: ugi=hive/[email protected] ip=unknown-ip-addr cmd=Shutting down the object store...
21/09/03 12:28:35 INFO HiveMetaStore: 1: Metastore shutdown complete.
21/09/03 12:28:35 INFO audit: ugi=hive/[email protected] ip=unknown-ip-addr cmd=Metastore shutdown complete.
21/09/03 12:28:35 INFO SessionState: Created local directory: /tmp/a4461f0d-84a0-4aa5-a967-24443027b3e9_resources
21/09/03 12:28:35 INFO SessionState: Created HDFS directory: /tmp/hive/admin/a4461f0d-84a0-4aa5-a967-24443027b3e9
21/09/03 12:28:35 INFO SessionState: Created local directory: /tmp/admin/a4461f0d-84a0-4aa5-a967-24443027b3e9
21/09/03 12:28:35 INFO SessionState: Created HDFS directory: /tmp/hive/admin/a4461f0d-84a0-4aa5-a967-24443027b3e9/_tmp_space.db
21/09/03 12:28:35 INFO SparkExecuteStatementOperation: Submitting query 'show table extended like 'test_demo_test'' with c71ac5f9-e58e-4474-87f5-262452ba18bf
21/09/03 12:28:35 INFO SparkExecuteStatementOperation: Running query with c71ac5f9-e58e-4474-87f5-262452ba18bf
21/09/03 12:28:35 INFO CodeGenerator: Code generated in 228.510447 ms
21/09/03 12:28:35 INFO SparkExecuteStatementOperation: Result Schema: StructType(StructField(database,StringType,false), StructField(tableName,StringType,false), StructField(isTemporary,BooleanType,false), StructField(information,StringType,false))
21/09/03 12:28:35 INFO SparkExecuteStatementOperation: Received getNextRowSet request order=FETCH_NEXT and maxRowsL=50 with c71ac5f9-e58e-4474-87f5-262452ba18bf
21/09/03 12:28:35 INFO SparkExecuteStatementOperation: Returning result set with 1 rows from offsets [0, 1) with c71ac5f9-e58e-4474-87f5-262452ba18bf
21/09/03 12:28:35 INFO SparkExecuteStatementOperation: Received getNextRowSet request order=FETCH_NEXT and maxRowsL=50 with c71ac5f9-e58e-4474-87f5-262452ba18bf
21/09/03 12:28:35 INFO DAGScheduler: Asked to cancel job group c71ac5f9-e58e-4474-87f5-262452ba18bf
21/09/03 12:28:35 INFO SparkExecuteStatementOperation: Close statement with c71ac5f9-e58e-4474-87f5-262452ba18bf
21/09/03 12:28:48 INFO ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V7
21/09/03 12:28:48 INFO HiveMetaStore: 1: Shutting down the object store...
21/09/03 12:28:48 INFO audit: ugi=hive/[email protected] ip=unknown-ip-addr cmd=Shutting down the object store...
21/09/03 12:28:48 INFO HiveMetaStore: 1: Metastore shutdown complete.
21/09/03 12:28:48 INFO audit: ugi=hive/[email protected] ip=unknown-ip-addr cmd=Metastore shutdown complete.
21/09/03 12:28:49 INFO SessionState: Created local directory: /tmp/a58180d1-7865-493a-8be4-503266f0d773_resources
21/09/03 12:28:49 INFO SessionState: Created HDFS directory: /tmp/hive/admin/a58180d1-7865-493a-8be4-503266f0d773
21/09/03 12:28:49 INFO SessionState: Created local directory: /tmp/admin/a58180d1-7865-493a-8be4-503266f0d773
21/09/03 12:28:49 INFO SessionState: Created HDFS directory: /tmp/hive/admin/a58180d1-7865-493a-8be4-503266f0d773/_tmp_space.db
21/09/03 12:28:49 INFO SparkExecuteStatementOperation: Submitting query 'create table bigdata.test_hudi_dt (
id int comment '',
name string comment '',
price double comment '',
ds string comment ''
) USING hudi
OPTIONS (primaryKey = 'id', type = 'cow', 'compression'='ZSTD')
PARTITIONED BY (ds)
' with b10c84b2-d141-48f5-994d-4ed3b930e074
21/09/03 12:28:49 INFO SparkExecuteStatementOperation: Running query with b10c84b2-d141-48f5-994d-4ed3b930e074
21/09/03 12:28:49 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
21/09/03 12:28:49 INFO deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
21/09/03 12:28:49 INFO metastore: Trying to connect to metastore with URI thrift://hadoop-test-nn1-9-11:9083
21/09/03 12:28:49 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
    at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
    at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
    at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:176)
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:129)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:301)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:431)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:324)
    at org.apache.spark.sql.hive.HiveClientUtils$.newClientForMetadata(HiveClientUtils.scala:27)
    at org.apache.spark.sql.hudi.command.CreateHoodieTableCommand.createHiveDataSourceTable(CreateHoodieTableCommand.scala:212)
    at org.apache.spark.sql.hudi.command.CreateHoodieTableCommand.createTableInCatalog(CreateHoodieTableCommand.scala:177)
    at org.apache.spark.sql.hudi.command.CreateHoodieTableCommand.run(CreateHoodieTableCommand.scala:73)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
    at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:229)
    at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3630)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:100)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:87)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:767)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
    at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3628)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:229)
    at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:100)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:767)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
    at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:610)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:767)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:605)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:650)
    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:280)
    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.$anonfun$run$1(SparkExecuteStatementOperation.scala:216)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.spark.sql.hive.thriftserver.SparkOperation.withLocalProperties(SparkOperation.scala:78)
    at org.apache.spark.sql.hive.thriftserver.SparkOperation.withLocalProperties$(SparkOperation.scala:62)
    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:46)
    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:216)
    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:211)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:227)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
    ... 75 more
21/09/03 12:28:49 WARN metastore: Failed to connect to the MetaStore Server...
21/09/03 12:28:49 INFO metastore: Waiting 1 seconds before next connection attempt.
21/09/03 12:28:50 INFO metastore: Trying to connect to metastore with URI thrift://hadoop-test-nn1-9-11:9083
21/09/03 12:28:50 ERROR TSaslTransport: SASL negotiation failure
```
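
The stack trace shows the failure comes from a second, freshly created metastore client: `CreateHoodieTableCommand.createHiveDataSourceTable` calls `HiveClientUtils.newClientForMetadata`, and that new client's SASL/GSSAPI handshake finds no Kerberos TGT, even though the session's original client (used for the ORC create) authenticates fine. A minimal sketch of one way to test that theory, by giving the JVM an explicit keytab login before the DDL runs; the principal and keytab path below are placeholders, not values from this issue:

```scala
import org.apache.hadoop.security.UserGroupInformation

// Placeholders -- substitute the real service principal and keytab path.
val principal = "hive/some-host@EXAMPLE.COM"
val keytab    = "/etc/security/keytabs/hive.service.keytab"

// Log the whole JVM in from the keytab so any metastore client created later
// (including the fresh one CreateHoodieTableCommand spins up) can find a TGT.
if (UserGroupInformation.isSecurityEnabled) {
  UserGroupInformation.loginUserFromKeytab(principal, keytab)
  println(s"login user: ${UserGroupInformation.getLoginUser}")
}
```

Running `kinit` with the service keytab in the Thrift Server's environment before submitting the Hudi DDL should test the same thing.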