Hi Abhishek
       To transfer data between an RDBMS and Hadoop, Sqoop is the preferred and
recommended option. Once the processing is done in Hive, the output data can
be exported to PG with the sqoop export command.
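
For example, a minimal sqoop export could look like the sketch below. The
connection string, credentials, table name and warehouse path are placeholders
here; swap in your own values:

  sqoop export \
    --connect jdbc:postgresql://localhost:5432/test \
    --username pguser \
    --password pgpass \
    --table test_tbl \
    --export-dir /user/hive/warehouse/some_hive_table \
    --input-fields-terminated-by '\001'

--export-dir points at the Hive table's directory in the warehouse, and '\001'
is Hive's default field delimiter; adjust it if your table uses a different
one. The PostgreSQL JDBC jar would also need to be on Sqoop's classpath (its
lib directory) for the connect string to resolve.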

Regards
Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: Abhishek Parolkar <abhis...@viki.com>
Date: Thu, 29 Mar 2012 16:25:08 
To: <user@hive.apache.org>
Reply-To: user@hive.apache.org
Subject: Postgres JDBC + dboutput UDF to export from Hive to remote Postgres

Hi There,
  I am trying to get the dboutput() UDF to work so that it can write results to
a PG DB table.

==This is what I did in hive shell==

  add jar /location/hive_contrib.jar;
  add jar /location/postgresql9jdbc3.jar;
  set jdbc.drivers = org.postgresql.Driver;

  CREATE TEMPORARY FUNCTION dboutput
AS  'org.apache.hadoop.hive.contrib.genericudf.example.GenericUDFDBOutput';

  select dboutput('jdbc:postgresql//localhost:5432/test','','','insert into
test_tbl(cnt) values(?)',hex(count(*)))
  from some_hive_table

===========end of snip=======

1.) I am on a single node cluster
2.) I am using Hive 0.8.1
3.) I am on Hadoop 1.0.0
4.) The query runs fine but doesn't write to the DB; it returns the number 2 (
http://screencast.com/t/eavnbBHR1x)

I get a "no suitable driver" error (http://screencast.com/t/OipV14n9FgF). Can
someone tell me how I can load the Postgres JDBC driver so that dboutput
recognizes my Postgres database?

Any help?

-v_abhi_v
