Hi,

Describe Hive table – I have attached the describe output of the Hive table below.

Schema – I’m using the SelectHiveQL processor with "CSV Header" set to true. Then I’m using a CSVReader with the Schema Access Strategy set to "Use String Fields from Header".
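For reference, a minimal sketch (illustrative Python, not NiFi code) of what "Use String Fields from Header" effectively gives you: every header column becomes a string field, so the int/bigint types from the Hive table are not preserved. The column names come from the describe output below; the sample values are made up.

```python
import csv
import io

# Hypothetical two-column slice of the CSV that SelectHiveQL emits with
# "CSV Header" = true; the data values are made up for illustration.
sample = "cal_dt,lgl_entity_nbr\n2018-05-24,101\n"

reader = csv.DictReader(io.StringIO(sample))

# "Use String Fields from Header" derives a schema with one *string* field
# per header column -- numeric Hive types such as int are lost here.
schema = {name: "string" for name in reader.fieldnames}
print(schema)  # {'cal_dt': 'string', 'lgl_entity_nbr': 'string'}
```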

 

PS – I tried writing to MySQL and it works fine with both CSVReader and AvroReader. For Netezza it fails to write.
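For what it's worth, the "None of the fields in the record map to the columns" error usually indicates a name mismatch between the record's field names (here, the CSV header) and the column names the JDBC driver reports for the table; unlike MySQL, Netezza folds unquoted identifiers to upper case. A rough, hypothetical sketch of that kind of case-insensitive matching (the normalize rule is an assumption for illustration, not NiFi's actual code):

```python
def normalize(name: str) -> str:
    # Hypothetical normalization: ignore case and underscores -- roughly the
    # gap that PutDatabaseRecord's "Translate Field Names" option bridges.
    return name.replace("_", "").lower()

record_fields = ["cal_dt", "lgl_entity_nbr"]   # from the CSV header
table_columns = ["CAL_DT", "LGL_ENTITY_NBR"]   # as a Netezza driver might report them

matched = [f for f in record_fields
           if normalize(f) in {normalize(c) for c in table_columns}]
print(matched)  # ['cal_dt', 'lgl_entity_nbr']

# With exact matching and no normalization, nothing lines up, leaving the
# processor with zero mappable fields -- the same symptom as the error above.
exact = [f for f in record_fields if f in table_columns]
print(exact)  # []
```

If that is the cause here, checking that "Translate Field Names" is enabled on PutDatabaseRecord (or that the table's column case matches the header) would be the first thing to try.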

 

Thanks, 

Mohit

 

From: Pierre Villard <[email protected]> 
Sent: 24 May 2018 13:49
To: [email protected]
Subject: Re: PutDatabaseRecord Error while ingesting to Netezza

 

Hi,

 

Can you share more details: the describe of the Hive table, and the schema you're using in the processor?

 

Pierre

 

2018-05-24 8:26 GMT+02:00 Mohit <[email protected]>:

Hi all,

I’m facing an error while writing data to Netezza using the PutDatabaseRecord processor. I get the following error –

 

PutDatabaseRecord[id=90aad845-0163-1000-0000-000028414a59] Failed to process 
StandardFlowFileRecord[uuid=f06b1510-9f66-42c1-a0f8-29ffc7a4d30f,claim=StandardContentClaim
 [resourceClaim=StandardResourceClaim[id=1527141341281-340, container=default, 
section=340], offset=572216, 
length=52],offset=0,name=8068481294062298.0.csv,size=35] due to None of the 
fields in the record map to the columns defined by the test_data table:

 

I’m reading the data from Hive and writing it to the flowfile in CSV format.

 

Please let me know if I’m doing anything wrong.

 

Thanks,

Mohit

 

cal_dt              string
cal_tm              string
event_typ_cd        string
tndr_typ_cd         string
lgl_entity_nbr       int
lgl_entity_sub_id    int
ntwk_id      int
site_id      int
lane_nbr           int
chanl_cd           string
id_nbr            string
id_sys_nbr        int
id_sys_issuer_nbr int
rd_amt             string
tt_store_sys_ord_amt   string
per_id                 int
tran_nbr            bigint
term_id             int
lat_coord         string
long_coord        string
device_present_ind        string
cnsmr_id_cnt        int
loy_cnsmr_id_cnt    int
ndc_ord_qty         int
ndc_ord_amt         string
que_trade_item_qty   int
trade_item_purch_amt        string
pos_trade_item_purch_amt    string
trade_item_purch_qty        int
gtz_trade_item_qty  int
gtz_trade_item_amt  string
vent_tms               string
data_src            string
data_qual_cd    string
mngng_cd          string
batch_nbr            int

Detailed Table Information
Database:               test
Owner:                  hdp-df
CreateTime:             Fri May 25 09:16:35 EDT 2018
LastAccessTime:         UNKNOWN
Protect Mode:           None
Retention:              0
Location:               hdfs://<hdfs path>
Table Type:             EXTERNAL_TABLE
Table Parameters:
       EXTERNAL                TRUE
       numFiles                1
       totalSize               266050048
       transient_lastDdlTime   1527254195

# Storage Information
SerDe Library:          org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe
InputFormat:            org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat
OutputFormat:           org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat
Compressed:             No
Num Buckets:            -1
Bucket Columns:         []
Sort Columns:           []
Storage Desc Params:
       serialization.format    1
