Apologies for the much longer response. Thank you, Shivram - this is helpful information. One question, albeit rooted in MVP design/laziness: for my specific reasons for implementing the plugin, there won't be any WHERE clauses - knowing that, can I safely ignore the file you referenced?
The goal is to use PXF to read HAWQ 1.x files directly from HDFS for the purposes of data migration (i.e. all-or-nothing ingestion). I've begun this work and am currently developing outside the tree while going through the many necessary iterations. The work can be tracked here:
https://github.com/kdunn926/incubator-hawq/pull/1/files

I'd very much appreciate any extra input and support from this list - Java isn't my native language, which is compounded by the fact that both MapReduce and PXF are largely uncharted territory for me. Basically, all three are important components to the success of this work.

Thanks,
Kyle

On Mon, Oct 3, 2016 at 5:17 PM Shivram Mani <[email protected]> wrote:

> Apologies for the delayed response.
> Apart from updating the DataType class, there are additional updates to
> the HAWQ/PXF bridge code in C to support more data types (pxffilters.c).
> You can refer to https://github.com/apache/incubator-hawq/pull/913/files
> where, among other functionality, we've also added support to handle the
> timestamp type.
>
> On Fri, Sep 23, 2016 at 12:07 PM, Kyle Dunn <[email protected]> wrote:
>
> > Hello -
> >
> > I'm looking at extending PXF for a new data source and noticed only a
> > subset of the HAWQ-supported primitive datatypes are implemented in PXF.
> > Is this as trivial as mapping a type to the corresponding OID in
> > "api/io/DataType.java", or is there something more I'm missing?
> >
> > Thanks,
> >
> > -Kyle
> > --
> > *Kyle Dunn | Data Engineering | Pivotal*
> > Direct: 303.905.3171 | Email: [email protected]
>
> --
> shivram mani

--
*Kyle Dunn | Data Engineering | Pivotal*
Direct: 303.905.3171 | Email: [email protected]
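For readers following along, the "map a type to its OID in api/io/DataType.java" idea discussed above can be sketched roughly as follows. This is an illustrative example, not the actual PXF source: the enum shape and method names here are assumptions, though the OID values themselves mirror PostgreSQL's pg_type catalog (HAWQ is Postgres-derived), with TIMESTAMP (1114) shown as the kind of addition the thread describes.

```java
// Hypothetical sketch of a PXF-style DataType enum mapping HAWQ/Postgres
// type OIDs to named constants. Adding support for a new primitive type
// would, at the Java level, amount to adding one enum entry like TIMESTAMP.
enum DataType {
    BOOLEAN(16),
    BYTEA(17),
    BIGINT(20),
    SMALLINT(21),
    INTEGER(23),
    TEXT(25),
    REAL(700),
    FLOAT8(701),
    TIMESTAMP(1114),      // newly mapped type, as discussed in the thread
    UNSUPPORTED_TYPE(-1); // fallback for anything not yet mapped

    private final int oid;

    DataType(int oid) {
        this.oid = oid;
    }

    public int getOID() {
        return oid;
    }

    // Resolve an enum constant from a catalog OID, falling back to
    // UNSUPPORTED_TYPE when the OID has no mapping yet.
    public static DataType get(int oid) {
        for (DataType dt : values()) {
            if (dt.oid == oid) {
                return dt;
            }
        }
        return UNSUPPORTED_TYPE;
    }
}
```

As Shivram notes, though, this Java-side mapping is only half the story: WHERE-clause pushdown also requires matching support in the C bridge code (pxffilters.c), which is why skipping it is only safe when no filters are pushed down.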
