Hi Ronan,

I've seen this problem before, and I remember there were discussions about
it on this mailing list.

There are two Thrift-related jars under Hadoop: $HADOOP_HOME/lib/libfb303.jar and
$HADOOP_HOME/lib/libthrift.jar.
Replace (overwrite) those files with Hive's libfb303.jar and libthrift.jar,
and then restart the Hive server.
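In shell terms, the swap looks roughly like this. This is only a sketch: the function name is made up for illustration, and the paths assume the jars sit under a lib/ directory in each install, so adjust for your layout.

```shell
# Sketch: overwrite Hadoop's Thrift jars with Hive's copies.
# The directory layout ($HADOOP_HOME/lib, $HIVE_HOME/lib) is an assumption;
# point the arguments at your actual installs.
replace_thrift_jars() {
  local hadoop_home="$1" hive_home="$2"
  cp "$hive_home/lib/libfb303.jar" "$hadoop_home/lib/libfb303.jar"
  cp "$hive_home/lib/libthrift.jar" "$hadoop_home/lib/libthrift.jar"
}

# e.g.: replace_thrift_jars "$HADOOP_HOME" "$HIVE_HOME"
# ...then restart the Hive server so it picks up the replaced jars.
```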

Hope this helps.

Regards,
Youngwoo
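For reference, the server start that the quoted thread revolves around is typically done with hive's service launcher. A small sketch, with the caveat that the HIVE_PORT variable and default port 10000 are taken from the HiveServer wiki page and should be verified against your version (the helper function here is just illustrative, building the command line as a string):

```shell
# Sketch: compose the command used to start the standalone Hive Thrift
# server; HIVE_PORT selects the listen port, 10000 being the default.
hive_server_cmd() {
  local port="${1:-10000}"
  printf 'HIVE_PORT=%s hive --service hiveserver\n' "$port"
}

hive_server_cmd 10000
```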

2009/12/14 Ronan Tobin <rto...@specificmedia.com>

>  Thank you – nearly there now....
>
> Server running!
>
>
>
> But now....
>
>
>
> I am getting following error:
>
> Exception in thread "pool-1-thread-2" java.lang.NoSuchMethodError:
> com.facebook.fb303.FacebookService$Processor$ProcessFunction.process(ILorg/apache/thrift/protocol/TProtocol;Lorg/apache/thrift/protocol/TProtocol;)V
>
>         at
> org.apache.hadoop.hive.service.ThriftHive$Processor.process(ThriftHive.java:328)
>
>         at
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:252)
>
>         at
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:885)
>
>         at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:907)
>
>         at java.lang.Thread.run(Thread.java:619)
>
>
>
> I do have the Hive libfb303.jar as part of the Java classpath, and even
> swapped it with the Hadoop libfb303.jar to see if that would work.
>
>
>
> Again – any ideas???
>
>
>
> Thank you
>
>
>
> Ronan
>
>
>
>
>
> *From:* Carl Steinbach [mailto:c...@cloudera.com]
> *Sent:* 14 December 2009 13:18
> *To:* hive-user@hadoop.apache.org
> *Subject:* Re: connecting remotely to hive via jdbc
>
>
>
> Hi Ronan,
>
> Your JDBC connection is failing because there is nothing listening on the
> other end. You have to start the Hive server and tell it which port to
> listen on before you can connect to it. Directions describing how to do this
> are located here: http://wiki.apache.org/hadoop/Hive/HiveServer
>
> Thanks.
>
> Carl
>
> On Mon, Dec 14, 2009 at 4:28 AM, Ronan Tobin <rto...@specificmedia.com>
> wrote:
>
> Hi all
>
> I am totally new to hive.....
>
> I am trying to get a remote connection to hive using jdbc.
>
> I am using the example provided in the documentation.
>
> I have copied all the jars required to my class path.
>
> When I try to make a connection I get the following error:
>
>
>
> ERROR exec.HiveHistory: Unable to create log directory /tmp/<my user name>
>
> Exception in thread "main" *java.sql.SQLException*: *
> org.apache.thrift.transport.TTransportException*: *
> java.net.ConnectException*: Connection refused: connect
>
>       at org.apache.hadoop.hive.jdbc.HiveDriver.connect(*
> HiveDriver.java:111*)
>
>       at java.sql.DriverManager.getConnection(Unknown Source)
>
>       at java.sql.DriverManager.getConnection(Unknown Source)
>
>       at com.org.smedia.HiveJdbcClient.main(*HiveJdbcClient.java:24*)
>
>
>
> I created a hdfs directory for /tmp/<my user name>
>
> And chmod g+w for the directory.
>
> But I still get the same error.
>
>
>
> Class as follows:
>
>
>
> import java.sql.SQLException;
>
> import java.sql.Connection;
>
> import java.sql.ResultSet;
>
> import java.sql.Statement;
>
> import java.sql.DriverManager;
>
> public class HiveJdbcClient {
>
>   private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";
>
>   /**
>    * @param args
>    * @throws SQLException
>    */
>   public static void main(String[] args) throws SQLException {
>     try {
>       Class.forName(driverName);
>     } catch (ClassNotFoundException e) {
>       e.printStackTrace();
>       System.exit(1);
>     }
>     Connection con = DriverManager.getConnection(
>         "jdbc:hive://<my server ip address>:10000/default", "", "");
>     Statement stmt = con.createStatement();
>     //String tableName = "testHiveDriverTable";
>     //stmt.executeQuery("drop table " + tableName);
>     //ResultSet res = stmt.executeQuery("create table " + tableName
>     //    + " (key int, value string)");
>
>     // show tables
>     //String sql = "show tables '" + tableName + "'";
>     //System.out.println("Running: " + sql);
>     //res = stmt.executeQuery(sql);
>     //if (res.next()) {
>     //  System.out.println(res.getString(1));
>     //}
>
>     // describe table
>     //sql = "describe " + tableName;
>     //System.out.println("Running: " + sql);
>     //res = stmt.executeQuery(sql);
>     //while (res.next()) {
>     //  System.out.println(res.getString(1) + "\t" + res.getString(2));
>     //}
>
>     // load data into table
>     // NOTE: filepath has to be local to the hive server
>     // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields per line
>     //String filepath = "/tmp/a.txt";
>     //sql = "load data local inpath '" + filepath + "' into table " + tableName;
>     //System.out.println("Running: " + sql);
>     //res = stmt.executeQuery(sql);
>
>     // select * query
>     String sql = "select * from <my table name>";
>     System.out.println("Running: " + sql);
>     ResultSet res = stmt.executeQuery(sql);
>     while (res.next()) {
>       System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
>     }
>
>     // regular hive query
>     sql = "select count(1) from <my table name>";
>     System.out.println("Running: " + sql);
>     res = stmt.executeQuery(sql);
>     while (res.next()) {
>       System.out.println(res.getString(1));
>     }
>   }
> }
>
>
>
> Any ideas??
>
>
>
> Thanks
>
>
>
> R
>
>
>
>
>
