Abe,

Thanks for the info (I guess it wasn't obvious from the sqoop client
doc here: http://sqoop.apache.org/docs/1.99.3/ClientAPI.html, although
looking back at it now, it doesn't actually mention hive anywhere :)).

It does look like it can export regular hdfs file data to mysql (is there
any info on what format the data in hdfs should be in for the export?).
I didn't see any docs on delimiting or formatting (just an inputDirectory
setting, etc.).
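
For reference, here's roughly what I've been piecing together from that
ClientAPI page for an export job. The connector id, the form input names
(e.g. "input.inputDirectory"), and the connection/table values are my
guesses from the doc and my install, so they may well be off:

import org.apache.sqoop.client.SqoopClient;
import org.apache.sqoop.model.MConnection;
import org.apache.sqoop.model.MConnectionForms;
import org.apache.sqoop.model.MJob;
import org.apache.sqoop.model.MSubmission;

public class ExportToMysql {
  public static void main(String[] args) {
    SqoopClient client = new SqoopClient("http://localhost:12000/sqoop/");

    // connection against the generic jdbc connector (id 1 on my install)
    MConnection conn = client.newConnection(1);
    conn.setName("mysql-conn");
    MConnectionForms conForms = conn.getConnectorPart();
    conForms.getStringInput("connection.connectionString")
        .setValue("jdbc:mysql://dbhost/mydb");
    conForms.getStringInput("connection.jdbcDriver")
        .setValue("com.mysql.jdbc.Driver");
    client.createConnection(conn);

    // export job: read files from an hdfs directory into a mysql table
    MJob job = client.newJob(conn.getPersistenceId(), MJob.Type.EXPORT);
    job.setName("hdfs-to-mysql");
    job.getConnectorPart().getStringInput("table.tableName")
        .setValue("my_table");
    job.getFrameworkPart().getStringInput("input.inputDirectory")
        .setValue("/user/jason/export_input");
    client.createJob(job);

    // submit the job and print its initial status
    MSubmission submission = client.startSubmission(job.getPersistenceId());
    System.out.println(submission.getStatus());
  }
}

The job does submit, but then the loader blows up parsing my files (the
NumberFormatException quoted below), which is what makes me think the
delimiting/format of the input files is the problem.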

It looks like for now, if we want to export data from hive, we should use
sqoop1 (is 'sqoop1' what you call it?).
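
If so, something like this is what I have in mind on the sqoop1 side
(assuming the table sits under the default hive warehouse path and uses
hive's default ^A field delimiter; the host/db/table names here are just
placeholders):

sqoop export \
  --connect jdbc:mysql://dbhost/mydb \
  --username myuser -P \
  --table my_table \
  --export-dir /user/hive/warehouse/my_table \
  --input-fields-terminated-by '\001' \
  --input-lines-terminated-by '\n'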

Do you have a rough estimate for the time frame for sqoop2 having feature
parity with sqoop1?

Thanks,

Jason

On Sun, Feb 16, 2014 at 5:53 PM, Abraham Elmahrek <[email protected]> wrote:

> Jason,
>
> Sqoop2 unfortunately doesn't have support for Hive yet. There are a few
> options that the Sqoop2 client has, but not that many. There is the notion
> of "batch processing" commands, which was added in
> SQOOP-773 <https://issues.apache.org/jira/browse/SQOOP-773>. This
> just provides "argument formatted" versions of the interactive menu and
> executes them in batch.
>
> With that being said... the Sqoop community is working hard on bringing
> Sqoop2 up to feature parity with Sqoop1. It is an open source project and
> is always seeking new contributors. Feel free to assist in the efforts.
>
> -Abe
>
>
> On Sat, Feb 15, 2014 at 5:05 PM, Jason Rosenberg <[email protected]> wrote:
>
>> Hi,
>>
>> I'm trying to start using sqoop2. So far I'm not finding complete
>> documentation describing the arguments for exporting a table from hive,
>> etc.
>>
>> E.g. in sqoop1, you have things like:
>>
>> --map-column-java
>> --fields-terminated-by
>>
>> etc.
>>
>> Is there a simple mapping for providing these via the sqoop2 thin-client
>> api (as compared to the args we would use directly with sqoop1)? I haven't
>> poked too far into the source code (I was hoping there's some early
>> documentation available on this?).
>>
>> Currently, I'm getting errors like this (and I assume I need to provide
>> info about types, delimiting, etc.?).
>>
>> org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0018:Error occurs during 
>> loader run
>>      at 
>> org.apache.sqoop.job.mr.SqoopOutputFormatLoadExecutor$ConsumerThread.run(SqoopOutputFormatLoadExecutor.java:229)
>>      at 
>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>      at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
>>      at java.util.concurrent.FutureTask.run(FutureTask.java:166)
>>      at 
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>      at 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>      at java.lang.Thread.run(Thread.java:724)
>> Caused by: java.lang.NumberFormatException: For input string: 
>> "RilcQmvw9ielrQUw0GB"
>>      at 
>> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>>      at java.lang.Long.parseLong(Long.java:441)
>>      at java.lang.Long.parseLong(Long.java:483)
>>      at org.apache.sqoop.job.io.Data.parseField(Data.java:449)
>>      at org.apache.sqoop.job.io.Data.parse(Data.java:374)
>>      at org.apache.sqoop.job.io.Data.g
>> java.lang.Throwable: Child Error
>>      at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:250)
>> Caused by: java.io.IOException: Task process exit with nonzero status of 65.
>>      at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:237)
>>
>>
>> Thanks,
>>
>>
>> Jason
>>
>>
>>
>
