Hi,
I am trying to use beeline to access Hive/Hadoop. The Hive server is running on the Linux node where Hadoop is installed, started as follows:

hduser@rhes564::/home/hduser/jobs> hive --service hiveserver 10001 -v &
[1] 30025
hduser@rhes564::/home/hduser/jobs> Starting Hive Thrift Server
This usage has been deprecated, consider using the new command line syntax (run with -h to see usage information)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hduser/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/apache-hive-0.14.0-bin/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/apache-hive-0.14.0-bin/lib/slf4j-log4j12-1.5.8.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Starting hive server on port 10001 with 100 min worker threads and 2147483647 max worker threads

Now locally I use beeline to access it, but it just does not go any further:

Beeline version 0.14.0 by Apache Hive
beeline> !connect jdbc:hive2://localhost:10001/default org.apache.hive.jdbc.HiveDriver
scan complete in 13ms
Connecting to jdbc:hive2://localhost:10001/default
Enter password for jdbc:hive2://localhost:10001/default:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hduser/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/apache-hive-0.14.0-bin/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

Now if I go to another Linux host that has Hive and Hadoop installed (a single node on another Linux host, IP address 50.140.197.216), I get the same issue:

beeline
Beeline version 0.14.0 by Apache Hive
beeline> !connect jdbc:hive2://rhes564:10001 "" "" org.apache.hive.jdbc.HiveDriver
Connecting to jdbc:hive2://rhes564:10001
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hduser/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/apache-hive-0.14.0-bin/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Error: Could not open client transport with JDBC Uri: jdbc:hive2://rhes564:10001: java.net.SocketException: Connection reset (state=08S01,code=0)
0: jdbc:hive2://rhes564:10001 (closed)> !connect jdbc:hive2://rhes564:10001 "" "" org.apache.hive.jdbc.HiveDriver
Connecting to jdbc:hive2://rhes564:10001
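For reference, beeline is only a thin client over the Hive JDBC driver named above, so the same connection can also be attempted directly from a short Java program. This is just a minimal sketch, assuming hive-jdbc-0.14.0-standalone.jar and the Hadoop common jars are on the classpath; the class name and the SHOW TABLES test query are only illustrative:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcTest {
    public static void main(String[] args) throws Exception {
        // Load the same Hive JDBC driver class that beeline is given.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Connect to the server listening on rhes564:10001 with empty
        // credentials, mirroring the !connect command above.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://rhes564:10001/default", "", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}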
Now I can see that the connections are established when I run:

hduser@rhes564::/home/hduser/jobs> netstat -anlp | grep 10001
(Not all processes could be identified, non-owned process info will not be shown, you would have to be root to see it all.)
tcp        0      0 0.0.0.0:10001           0.0.0.0:*               LISTEN       30025/java
tcp        0      0 127.0.0.1:43507         127.0.0.1:10001         ESTABLISHED  32621/java
tcp        0      0 127.0.0.1:10001         127.0.0.1:43507         ESTABLISHED  30025/java
tcp        0      0 50.140.197.217:10001    50.140.197.216:50921    ESTABLISHED  30025/java

rhes564 has IP address 50.140.197.217 and is the host where hiveserver is running. There is one connection from the local beeline session (127.0.0.1) and one from the remote host (IP address 50.140.197.216, see above), so the connections are established. Beeline just never gets to the prompt!

Any ideas appreciated.

Thanks,

Mich