When you want to move data from an external system into Hive, this means moving
the data to HDFS first and then pointing the Hive table at the file in HDFS
where you have exported the data.
So you have a couple of commands, like -copyFromLocal and -put, which move the
file to HDFS. If you intend to move data in a real-time fashion, try Flume. But
at the end of the day the data movement happens in HDFS first, and then the
Hive table can be loaded using the LOAD DATA command.
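For example, assuming a local file /tmp/events.csv and a Hive table named
events (both names are just for illustration), the whole flow looks roughly
like this:

  # copy the file from the local filesystem into HDFS
  hadoop fs -put /tmp/events.csv /user/hive/staging/events.csv

  # point Hive at it; LOAD DATA INPATH moves the file into the table's location
  hive -e "LOAD DATA INPATH '/user/hive/staging/events.csv' INTO TABLE events;"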

Regards,
Manish Bhoge

----- Reply message -----
From: "Cyrille Djoko" <c...@agnik.com>
To: <user@hive.apache.org>
Subject: Hive Queries
Date: Sat, Feb 16, 2013 1:50 AM


Hi Jarcec,
I did try Sqoop. I am running Sqoop 1.4.2 (the hadoop-1.0.0 build) along with
Hadoop 1.0.4, but I keep running into the following exception:

Exception in thread "main" java.lang.IncompatibleClassChangeError: Found
class org.apache.hadoop.mapreduce.JobContext, but interface was expected
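For context, an import of this kind would typically be invoked along these
lines (the connection string, username, and table name here are hypothetical):

  sqoop import \
    --connect jdbc:mysql://remote-host/mydb \
    --username myuser \
    --table events \
    --hive-import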

So I wrote a small program, but all I can do is send queries to the server.
> Hi Cyrille,
> I'm not exactly sure what you mean, so I'm more or less blindly shooting,
> but maybe Apache Sqoop [1] might help you?
>
> Jarcec
>
> Links:
> 1: http://sqoop.apache.org/
>
> On Fri, Feb 15, 2013 at 01:44:45PM -0500, Cyrille Djoko wrote:
>> I am looking for a relatively efficient way of transferring data between a
>> remote server and Hive without going through the hassle of storing the
>> data in memory first before loading it into Hive.
>> From what I have read so far there is no such command, but it would not
>> hurt to ask.
>> Is it possible to insert data through an INSERT query in Hive? (The
>> equivalent of INSERT INTO table_name VALUES (...) in standard SQL.)
>>
>> Thank you in advance for an answer.
>>
>>
>> Cyrille Djoko
>> Data Mining Developer Intern
>>
>


Cyrille Djoko

Agnik LLC
Data Mining Developer Intern
