I run a cluster on EC2; at the moment I just dump tables to ctrl-A delimited
flat files with INSERT OVERWRITE DIRECTORY and bulk load those into MySQL
with LOAD DATA LOCAL INFILE.

Something like this:

LOAD DATA LOCAL INFILE '/mnt/pages.txt'
INTO TABLE new_pages
FIELDS TERMINATED BY 0x01
LINES TERMINATED BY '\n'
(id, url, title, page_latest, total_pageviews, monthly_trend, daily_trend);
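
For reference, the Hive side of the export is just a query like the one
below (the table name and HDFS path are illustrative; the columns match the
LOAD DATA above):

-- dump to a directory of ctrl-A delimited text files in HDFS
INSERT OVERWRITE DIRECTORY '/tmp/pages_export'
SELECT id, url, title, page_latest, total_pageviews,
       monthly_trend, daily_trend
FROM pages;

followed by something like hadoop fs -getmerge /tmp/pages_export
/mnt/pages.txt to merge the part files down to local disk before running
LOAD DATA.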


There seems to be a bug where ctrl-A is the only delimiter INSERT OVERWRITE
DIRECTORY uses, regardless of the Hive table's declared format (tab
delimited, etc.).
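
That's why the LOAD DATA above matches on 0x01 rather than '\t'. If you ever
need a tab-delimited copy of the dump for something else, an (untested)
one-liner like

tr '\001' '\t' < /mnt/pages.txt > /mnt/pages.tsv

should do it, assuming none of the fields contain tabs.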


On Tue, Jul 14, 2009 at 10:45 AM, Edward Capriolo <[email protected]> wrote:

> Min,
>
> Funny you should ask. I was talking to Aaron about Sqoop and
> un-Sqoop (Hadoop -> SQL).  I have also started building a dboutput UDF,
> which I am about to open another thread about. It looks like we may
> open up a Hadoop JIRA, and possibly a Hive JIRA, to chat about this.
> Stay tuned!
>
> Edward
>
> On Tue, Jul 14, 2009 at 6:18 AM, Min Zhou <[email protected]> wrote:
> > Hi all,
> >
> > How do you export Hive tables into Oracle/MySQL?  Through OCI (Oracle Call
> > Interface), JDBC (oci/thin/MysqlDriver), or ODBC?
> > Have you written such tools in C/C++ using the HDFS native library, or in Java (JDBC)?
> > How do you translate Hive tables' schemas into Oracle/MySQL schemas?
> >
> > Thanks,
> > Min
> > --
> > My research interests are distributed systems, parallel computing and
> > bytecode based virtual machine.
> >
> > My profile:
> > http://www.linkedin.com/in/coderplay
> > My blog:
> > http://coderplay.javaeye.com
> >
>



-- 
Peter N. Skomoroch
617.285.8348
http://www.datawrangling.com
http://delicious.com/pskomoroch
http://twitter.com/peteskomoroch
