Hey Fero,

Thank you so much for the response. 
I used array_to_string in Postgres, which imported the data into Hive in text
format.
However, I couldn't query that column as an array to get range-based data, so I
had to write a UDF.
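
Roughly what the Hive side looked like after the text import (table and column
names below are placeholders, and the actual range extraction lives in the UDF):

-- the column arrives as a comma-separated string; split() returns array<string>,
-- so a cast is needed to get a typed double back out
SELECT cell_id, cast(split(yr2010, ',')[0] AS double) AS day_1
FROM table_in_hive
WHERE cell_id = 502070;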

Thanks
Sowjanya

> On Mar 22, 2018, at 9:22 AM, Fero Szabo <f...@cloudera.com> wrote:
> 
> Hi Sowjanya,
> 
> I'm only guessing here, but I don't think this type (double precision
> array) is supported by Sqoop at all. A workaround would be to convert the
> data to something else in the database (depending on your use case) and then
> import it with Sqoop. Text types are easily imported, though I guess double
> could work too.
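> 
> Something like this on the Postgres side might be enough (just a sketch, and
> the view name is made up):
> 
> -- expose the array as a comma-separated text column for Sqoop to pick up
> CREATE VIEW table_yr2010_text AS
> SELECT cell_id, array_to_string(yr2010, ',') AS yr2010
> FROM public.table;
> 
> Then point the Sqoop import at the view instead of the table.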
> 
> Anyway, the error is thrown by the ClassWriter class, which probably means
> that this issue happens before Hive is even touched, at the class generation
> phase. You could try using --map-column-java to get past this, then
> possibly --map-column-hive as well... A long shot, but worth a try.
> 
> (Though, then again, you might just end up at a different error message, 
> because this is an array datatype.)
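> 
> Something along these lines, for example (untested; same connection string as
> in your command, and the String mapping is just a guess):
> 
> sqoop import --connect '...' --table 'table' --columns 'cell_id,yr2010' \
>   --username 'uname' -P \
>   --map-column-java yr2010=String --map-column-hive yr2010=string \
>   --hcatalog-database 'db_in_hive' --hcatalog-table 'table_in_hive' \
>   --create-hcatalog-table -m 1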
> 
> Kind Regards,
> Fero
> 
> 
> 
> On Tue, Mar 20, 2018 at 10:56 PM, Sowjanya Kakarala <sowja...@agrible.com> wrote:
> Hi Guys,
> 
> I am trying to migrate one of the tables from Postgres database to Hive.
> 
> Schema of Postgres:
> db=> \d table
>                 Table "public.table"
>  Column  |        Type        | Collation | Nullable | Default 
> ---------+--------------------+-----------+----------+---------
>  cell_id | integer            |           |          | 
>  yr2010  | double precision[] |           |          | 
> 
> Data in yr2010 has 365 days data in it like:
> select yr2010[1:365] from table where cell_id=502070;
> 
> {0,0,0,0,0.07,0.05,0,0,0.04,0,0,0.02,0.09,0.06,0,0,0,0,0,0,0,0,0,0,0.06,0.01,0.4,0.03,0.01,0,0,0,0.01,0,0,0,0,0,0,0.09,1.83,1.76,0,0,0,0,0,0.02,0.02,0,0.01,0.08,0,0,0,0,0.89,0,0,0,0,0,0,0,0,0,0.47,0.07,0.43,0,0,0,0,0,0,0,0.45,0,0,0,0,0,0.08,0,0,0,0.58,0,0,0.4,0,0.78,0,0,1.69,0.09,0,0,0.46,0,0,0.38,0.6,0,0,0,0,0.18,0.21,0.1,0.14,0,0,0,0,0,0.78,0.11,0.57,0.75,0.14,0,0,0,0,0.26,0.77,0.04,0,0,0,0.1,2.05,0,1.26,0,0,0,0,0,0,0,0,0,0.52,0.01,0.65,0.03,0.56,0,0,0,0.94,0.59,0,0,0,0.01,0.08,0.58,0.48,1.37,0,0.26,0,0,0.31,0,0.47,0.72,0,1.09,0.03,0,0,0.02,0,0,0,0,0,0,0,0,1.33,0,0,0.19,0.05,0,0,0.74,0,0,0.14,0.11,0.01,0,0,0.13,0,0,0.02,0,0.76,0,0,0,0.51,0,0,0,0.08,0,0,0.83,0,0,0.07,0,0,0,0.19,0,0,0,0,0.21,0,0,0.69,0,0.14,0,0,0,0,0,0,1.29,0,0,0,0,0,0,0,0,0,0,0.04,0,0,0,0,0,0,0,0,0,0,0,0.01,0,0.02,0.02,0.4,0.04,0.54,0.05,0,0,0,0.45,0,0,0,0,0.48,0,2.21,0.23,0,0,0,0.2,0.23,0.01,0.05,0,0,0,0,0,1.38,0.09,0.01,0,0,0.53,0,0.27,0.67,0,0,0,0.02,0,0,0,0,0,0,0,0,0,0,0,0.66,0.84,0.44,0,0.1,0,0,0,0,0.26,0.08,0,0,0,0.05,0,0,0.97,0,0,0,0.02,0,0.96,0.07,0,0,0.84,0.02,0,0,0,0,0,0.04,0.01,0.02,0.09,0.12,0.28,0.25,0.08,0.16,0,0,0.09,0}
> 
> 
> My Sqoop query is:
> 
> sqoop import --connect 'jdbc:postgresql://path-dev:5432/db?ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory' \
>   --table 'table' --columns 'cell_id,yr2010' --username 'uname' -P \
>   --hcatalog-database 'db_in_hive' --hcatalog-table 'table_in_hive' \
>   --map-column-hive yr2010=double --create-hcatalog-table -m 1
> 
> I tried with:
>   --map-column-hive yr2010=double
>   --map-column-hive yr2010=array<double>
>   --map-column-hive yr2010=double[]
> 
> Getting the same error for all queries as follows:
> 
> 18/03/19 13:12:13 INFO manager.SqlManager: Executing SQL statement: SELECT 
> t.* FROM "table" AS t LIMIT 1
> 18/03/19 13:12:13 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> 18/03/19 13:12:13 ERROR orm.ClassWriter: No Java type for SQL type 2003 for 
> column yr2010
> 18/03/19 13:12:13 ERROR sqoop.Sqoop: Got exception running Sqoop: 
> java.lang.NullPointerException
> java.lang.NullPointerException
>       at org.apache.sqoop.orm.ClassWriter.parseNullVal(ClassWriter.java:1385)
> 
> 
> 
> Can anyone help with a mapping that works for the Sqoop import?
> I appreciate your help.
> 
> Thanks
> Sowjanya
> 
> 
> 
> 
> -- 
> Ferenc Szabo
> Software Engineer
> 
