Hi,

Spark: 2.1.0
Hive: 2.1.1

When starting the Thrift server, it fails to connect to our Kerberized Hive metastore with "GSS initiate failed (Mechanism level: Failed to find any Kerberos tgt)"; the full log is below.

How can I fix it?
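
For context, the server is launched with the stock start-thriftserver.sh script. A minimal sketch of what I run (the keytab path, principal, and realm below are placeholders, and part of my question is whether these Kerberos options are needed at all):

    # Current launch, with no explicit Kerberos options:
    $SPARK_HOME/sbin/start-thriftserver.sh --master yarn

    # Should I instead obtain a TGT in the launching shell first, or pass
    # the standard spark-submit Kerberos options? (placeholder principal/keytab)
    kinit -kt /etc/security/keytabs/spark.service.keytab spark/n2.server@EXAMPLE.COM
    $SPARK_HOME/sbin/start-thriftserver.sh --master yarn \
      --principal spark/n2.server@EXAMPLE.COM \
      --keytab /etc/security/keytabs/spark.service.keytab

(--principal/--keytab are the standard spark-submit options for long-running apps on a Kerberized YARN cluster; I am not sure whether they apply to the Thrift server.)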

Regards



(error log)
17/08/13 09:28:17 DEBUG client.IsolatedClientLoader: shared class: java.lang.NoSuchFieldError
17/08/13 09:28:17 DEBUG client.IsolatedClientLoader: shared class: org.apache.hadoop.security.SecurityUtil
17/08/13 09:28:17 DEBUG client.IsolatedClientLoader: shared class: org.apache.hadoop.security.SaslRpcServer
17/08/13 09:28:17 DEBUG client.IsolatedClientLoader: hive class: org.apache.thrift.transport.TSaslTransportException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/libthrift-0.9.2.jar!/org/apache/thrift/transport/TSaslTransportException.class
17/08/13 09:28:17 DEBUG client.IsolatedClientLoader: hive class: javax.security.sasl.Sasl - jar:file:/opt/jdk1.8.0_102/jre/lib/rt.jar!/javax/security/sasl/Sasl.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.thrift.transport.TMemoryInputTransport - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/libthrift-0.9.2.jar!/org/apache/thrift/transport/TMemoryInputTransport.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.thrift.TByteArrayOutputStream - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/libthrift-0.9.2.jar!/org/apache/thrift/TByteArrayOutputStream.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.thrift.transport.TSaslTransport$SaslParticipant - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/libthrift-0.9.2.jar!/org/apache/thrift/transport/TSaslTransport$SaslParticipant.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.thrift.protocol.TProtocolException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/libthrift-0.9.2.jar!/org/apache/thrift/protocol/TProtocolException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.thrift.protocol.TStruct - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/libthrift-0.9.2.jar!/org/apache/thrift/protocol/TStruct.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/ThriftHiveMetastore$Client.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: com.facebook.fb303.FacebookService$Client - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/libfb303-0.9.2.jar!/com/facebook/fb303/FacebookService$Client.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.thrift.TServiceClient - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/libthrift-0.9.2.jar!/org/apache/thrift/TServiceClient.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.InvalidObjectException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/InvalidObjectException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.UnknownTableException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/UnknownTableException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.UnknownDBException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/UnknownDBException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.ConfigValSecurityException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/ConfigValSecurityException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.UnknownPartitionException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/UnknownPartitionException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.InvalidPartitionException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/InvalidPartitionException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.InvalidInputException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/InvalidInputException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.NoSuchTxnException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/NoSuchTxnException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.TxnAbortedException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/TxnAbortedException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.NoSuchLockException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/NoSuchLockException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.metastore.api.TxnOpenException - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-metastore-1.2.1.spark2.jar!/org/apache/hadoop/hive/metastore/api/TxnOpenException.class
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1 - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/hive-exec-1.2.1.spark2.jar!/org/apache/hadoop/hive/thrift/client/TUGIAssumingTransport$1.class
17/08/13 09:28:18 DEBUG security.UserGroupInformation: PrivilegedAction as:spark (auth:KERBEROS) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: shared class: org.slf4j.Logger
17/08/13 09:28:18 DEBUG transport.TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@7f08caf
17/08/13 09:28:18 DEBUG client.IsolatedClientLoader: hive class: javax.security.sasl.SaslClient - jar:file:/opt/jdk1.8.0_102/jre/lib/rt.jar!/javax/security/sasl/SaslClient.class
17/08/13 09:28:18 ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
    at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
    at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
    at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
    at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
    at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
    at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:47)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:81)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
    ... 75 more
17/08/13 09:28:19 DEBUG client.IsolatedClientLoader: hive class: org.apache.thrift.transport.TSaslTransport$NegotiationStatus - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/libthrift-0.9.2.jar!/org/apache/thrift/transport/TSaslTransport$NegotiationStatus.class
17/08/13 09:28:19 DEBUG client.IsolatedClientLoader: shared class: java.lang.Byte
17/08/13 09:28:19 DEBUG client.IsolatedClientLoader: hive class: org.apache.thrift.EncodingUtils - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/libthrift-0.9.2.jar!/org/apache/thrift/EncodingUtils.class
17/08/13 09:28:19 DEBUG client.IsolatedClientLoader: hive class: org.apache.thrift.transport.TSaslTransport$SaslRole - jar:file:/home/spark-2.1.0/assembly/target/scala-2.11/jars/libthrift-0.9.2.jar!/org/apache/thrift/transport/TSaslTransport$SaslRole.class
17/08/13 09:28:19 DEBUG transport.TSaslTransport: CLIENT: Writing message with status BAD and payload length 19
17/08/13 09:28:19 WARN hive.metastore: Failed to connect to the MetaStore Server...
org.apache.thrift.transport.TTransportException: GSS initiate failed
    at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
    ... (remaining frames identical to the SaslException stack trace above)
17/08/13 09:28:19 INFO hive.metastore: Waiting 1 seconds before next connection attempt.
17/08/13 09:28:20 INFO hive.metastore: Trying to connect to metastore with URI thrift://n2.server:9083
17/08/13 09:28:20 DEBUG security.UserGroupInformation: PrivilegedAction as:spark (auth:KERBEROS)
