Hi Roberto,

How does the tool work? Does it load data directly from the RDBMS into Phoenix using Sqoop, or does it load after a bulk file has been dumped into HDFS?

Thanks & Regards
Job M Thomas
________________________________
From: Ravi Kiran [mailto:[email protected]]
Sent: Fri 5/30/2014 3:39 AM
To: [email protected]
Subject: Re: Loading data with Sqoop

Hi Roberto,

How are you constructing the composite row key and returning the Put from the transformer? Also, can you please shed some light on how you look up the data types of the columns within the transformer?

Regards
Ravi

On Thu, May 29, 2014 at 3:02 PM, Roberto Gastaldelli <[email protected]> wrote:

Hi James,

I've extended the PutTransformer I implemented, and it now loads data into tables with a composite primary key. Another scenario I'm still working on is identifying whether the table is salted, and loading the data accordingly. Can you think of any other scenarios?

Roberto.

On 28/05/2014 6:01 PM, "Roberto Gastaldelli" <[email protected]> wrote:

I haven't tested loading into tables with a composite key, but I'll run some scenarios and check what can be done.

On 28/05/2014 5:51 PM, "James Taylor" <[email protected]> wrote:

Hi Roberto,
Yes, thank you very much for asking - there's definitely interest. Does it handle the case of a table with a composite primary key definition?
Thanks,
James

On Wed, May 28, 2014 at 12:45 AM, Roberto Gastaldelli <[email protected]> wrote:

Hi there,

I came across the challenge of loading data from an RDBMS into a Phoenix table using Sqoop, but that did not work well because Sqoop by default converts all data types to strings. I came up with a solution: a PutTransformer that maps the JDBC data types to the Phoenix native data types. Is there any interest in including this feature in the project? If so, I can contribute.

Roberto.
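For readers following the thread: a minimal sketch of what such a type-aware PutTransformer might look like, built on Sqoop's org.apache.sqoop.hbase.PutTransformer extension point and Phoenix's PDataType enum. The jdbcTypes map (column name to java.sql.Types id) is a hypothetical placeholder that would need to be populated up front, e.g. from DatabaseMetaData; this is an illustration of the approach being discussed, not Roberto's actual code.

    import java.io.IOException;
    import java.util.Collections;
    import java.util.List;
    import java.util.Map;

    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.phoenix.schema.PDataType;
    import org.apache.sqoop.hbase.PutTransformer;

    // Sketch: serialize values with Phoenix's native type codecs instead of
    // Sqoop's default to-string conversion (ToStringPutTransformer).
    public class PhoenixPutTransformer extends PutTransformer {

      // Hypothetical lookup: column name -> java.sql.Types id, populated
      // beforehand (e.g. from DatabaseMetaData.getColumns()).
      private final Map<String, Integer> jdbcTypes;

      public PhoenixPutTransformer(Map<String, Integer> jdbcTypes) {
        this.jdbcTypes = jdbcTypes;
      }

      @Override
      public List<Put> getPutCommand(Map<String, Object> fields) throws IOException {
        Object rowKeyValue = fields.get(getRowKeyColumn());
        if (rowKeyValue == null) {
          return Collections.emptyList(); // no row key, nothing to put
        }
        Put put = new Put(toPhoenixBytes(getRowKeyColumn(), rowKeyValue));
        byte[] family = Bytes.toBytes(getColumnFamily());
        for (Map.Entry<String, Object> field : fields.entrySet()) {
          if (!field.getKey().equals(getRowKeyColumn()) && field.getValue() != null) {
            put.add(family, Bytes.toBytes(field.getKey()),
                toPhoenixBytes(field.getKey(), field.getValue()));
          }
        }
        return Collections.singletonList(put);
      }

      // Map the column's JDBC type id to a Phoenix PDataType and use its codec.
      private byte[] toPhoenixBytes(String column, Object value) {
        PDataType type = PDataType.fromTypeId(jdbcTypes.get(column));
        return type.toBytes(value);
      }
    }

A composite primary key would additionally require concatenating each PK column's serialized bytes in key order, with the zero-byte separator Phoenix appends after variable-length types; that detail is left out of the sketch above.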
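On the salted-table scenario Roberto mentions: Phoenix prepends a single hashed salt byte to each row key. A sketch of that one step, assuming org.apache.phoenix.schema.SaltingUtil and that the table's SALT_BUCKETS value is already known (how to detect it from the table metadata is left out here):

    import org.apache.phoenix.schema.SaltingUtil;

    public final class SaltHelper {

      private SaltHelper() { }

      // Prepend the salt byte Phoenix expects for a table declared with
      // SALT_BUCKETS = saltBuckets; getSaltingByte hashes the unsalted key
      // and maps it into [0, saltBuckets).
      public static byte[] saltRowKey(byte[] key, int saltBuckets) {
        byte[] salted = new byte[key.length + 1];
        salted[0] = SaltingUtil.getSaltingByte(key, 0, key.length, saltBuckets);
        System.arraycopy(key, 0, salted, 1, key.length);
        return salted;
      }
    }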
