Hello everybody,
we're having problems with the Ambari alert for the Spark Thriftserver; it complains that the Beeline connection to the Thriftserver fails (see the alert below):
Connection failed on host master01:10016 (Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/alerts/alert_spark2_thrift_port.py", line 147, in execute
    Execute(cmd, user=sparkuser, path=[beeline_cmd], timeout=CHECK_COMMAND_TIMEOUT_DEFAULT)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '! /usr/hdp/current/spark2-client/bin/beeline -u 'jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary' -e '' 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL' ' returned 1. Error: Could not open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Error: Could not open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
)
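For reference, the check that the alert script runs boils down to the following command (reconstructed from the command string in the traceback above; the awk/grep pipeline only scans the Beeline output for 'Connection refused' and 'Invalid URL'):

! /usr/hdp/current/spark2-client/bin/beeline -u 'jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary' -e '' 2>&1 | awk '{print}' | grep -i -e 'Connection refused' -e 'Invalid URL'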
Running the same command manually as a regular user, the same problem comes up:
dr_who@master01:/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/alerts$ ! /usr/hdp/current/spark2-client/bin/beeline -u 'jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary' -e ''
Connecting to jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/09/30 10:40:26 INFO Utils: Supplied authorities: master01:10016
19/09/30 10:40:26 INFO Utils: Resolved authority: master01:10016
19/09/30 10:40:26 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/09/30 10:40:26 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/09/30 10:40:27 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/09/30 10:40:27 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary: java.net.SocketException: Connection reset (state=08S01,code=0)
Beeline version 1.21.2.3.0.1.0-187 by Apache Hive
19/09/30 10:40:27 INFO Utils: Supplied authorities: master01:10016
19/09/30 10:40:27 INFO Utils: Resolved authority: master01:10016
19/09/30 10:40:27 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/09/30 10:40:27 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/09/30 10:40:27 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary: java.net.SocketException: Broken pipe (Write failed) (state=08S01,code=0)
When trying to run the same with the Spark user from the command line, a similar issue arises:
spark@master01:~$ /usr/hdp/current/spark2-client/bin/beeline -u 'jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary' -e 'show databases'
Connecting to jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO Utils: Supplied authorities: master01:10016
19/11/13 14:42:27 INFO Utils: Resolved authority: master01:10016
19/11/13 14:42:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/11/13 14:42:27 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary: Invalid status 21 (state=08S01,code=0)
19/11/13 14:42:27 INFO Utils: Supplied authorities: master01:10016
19/11/13 14:42:27 INFO Utils: Resolved authority: master01:10016
19/11/13 14:42:27 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
19/11/13 14:42:27 INFO Utils: Supplied authorities: master01:10016
19/11/13 14:42:27 INFO Utils: Resolved authority: master01:10016
19/11/13 14:42:27 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary: Invalid status 21 (state=08S01,code=0)
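So unlike in the original alert, the connection is no longer refused; the Thriftserver accepts connections on the port, which can also be double-checked on the host, for example with ss (assuming it is installed, netstat would show the same):

ss -tlnp | grep 10016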
Looking at the Thriftserver logs reveals the following:
19/11/13 14:43:48 ERROR TThreadPoolServer: Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:360)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1710)
    at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178)
    at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
    at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
    at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
    ... 10 more
Caused by: javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
    at sun.security.ssl.InputRecord.handleUnknownRecord(InputRecord.java:710)
    at sun.security.ssl.InputRecord.read(InputRecord.java:527)
    at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:973)
    at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1375)
    at sun.security.ssl.SSLSocketImpl.readDataRecord(SSLSocketImpl.java:928)
    at sun.security.ssl.AppInputStream.read(AppInputStream.java:105)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
    ... 16 more
The same error and stack trace are logged a second time right after this one.
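If I read these errors correctly, both sides point to a TLS/plaintext mismatch: the server's SSL socket complains about receiving a plaintext Thrift/SASL handshake ("Unrecognized SSL message, plaintext connection?"), and the "Invalid status 21" on the Beeline side looks like the client reading the first byte of a TLS record (0x15 = 21, the TLS alert record type) where it expects a plain SASL status byte. A quick way to confirm that port 10016 really expects TLS is a handshake test from the client host (assuming openssl is installed there):

openssl s_client -connect master01:10016 </dev/null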
TLS is configured on the Hive side, but it still seems that Beeline or Spark tries to establish a plaintext connection instead of a TLS connection. Does anyone know how to resolve this problem? Apart from the issue described above, the Thriftserver itself seems to work, for example when connecting to it via ODBC.
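In case it matters: as far as I understand the Hive JDBC driver, a TLS-enabled Beeline connection would need the SSL parameters in the URL, roughly like this (truststore path and password are just placeholders):

/usr/hdp/current/spark2-client/bin/beeline -u 'jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary;ssl=true;sslTrustStore=/path/to/truststore.jks;trustStorePassword=changeit' -e 'show databases'

The command issued by the Ambari alert script (see the traceback above) does not contain any of these SSL parameters.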