I'm re-posting this question in the hope that it gets noticed, since we
really need to get ESB proxy service message performance data from WSO2
BAM's Cassandra into our PostgreSQL database so that we can display it in a
web application that uses PostgreSQL as its back end.

I'm attempting to pull data from Cassandra into PostgreSQL through BAM
Analytics Hive scripts. The Hive table "CassToHive_InterfaceMsgAuditTable"
is correctly created and populated from its Cassandra table, and the Hive
table "HiveToPg_InterfaceMsgAuditTable2" successfully created the PostgreSQL
table "cass_interfacemsgaudittable2" using the Hive table creation statement
included below. However, when I execute the insert/overwrite statement (also
shown below), the log shows that
org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBOperation.isRowExisting
sends a query to PostgreSQL with a table named "null" (see log excerpts
below). Where does the problem lie?

Thank you, 

Pamela 

-------------------------------------------------------------------------------
Hive table creation statement which produced the PostgreSQL table:
-------------------------------------------------------------------------------
CREATE EXTERNAL TABLE IF NOT EXISTS HiveToPg_InterfaceMsgAuditTable2 ( 
        BAM_messageId STRING, 
        ch_messageId STRING, 
        addedToBamTimestamp BIGINT, 
        status_timestamp BIGINT, 
        delta BIGINT, 
        streamId STRING, 
        interfaceid STRING, 
        messageBody STRING, 
        batch_identifier STRING, 
        batch_element_identifier STRING, 
        proxy_service STRING, 
        sourceIP STRING 
) 
STORED BY 'org.wso2.carbon.hadoop.hive.jdbc.storage.JDBCStorageHandler' 
TBLPROPERTIES ( 
'wso2.carbon.datasource.name'='WSO2_[ourProduct]_DB', 
'hive.jdbc.update.on.duplicate' = 'true', 
'hive.jdbc.primary.key.fields' = 'BAM_messageId', 
'hive.jdbc.table.create.query' = 'CREATE  TABLE 
cass_interfacemsgaudittable2 
( 
  bam_messageid text NOT NULL, 
  ch_messageid text, 
  addedtobamtimestamp bigint, 
  status_timestamp bigint, 
  delta bigint, 
  streamid text, 
  interfaceid text, 
  messagebody text, 
  batch_identifier text, 
  batch_element_identifier text, 
  proxy_service text, 
  sourceip text);' 
); 
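One thing I noticed while comparing against the BAM analytics samples: they
appear to name the target table explicitly in TBLPROPERTIES. I'm not sure
whether that is required when 'hive.jdbc.table.create.query' is supplied, so
the property shown here ('mapred.jdbc.output.table.name') is only my guess
based on those samples:

```sql
-- Assumption: property name taken from the shipped BAM samples; I have not
-- confirmed it is mandatory when a create query is given.
TBLPROPERTIES (
'wso2.carbon.datasource.name'='WSO2_[ourProduct]_DB',
'mapred.jdbc.output.table.name'='cass_interfacemsgaudittable2',
'hive.jdbc.update.on.duplicate' = 'true',
'hive.jdbc.primary.key.fields' = 'BAM_messageId',
...
);
```

If someone can confirm whether the handler falls back to parsing the table
name out of the create query, that would explain (or rule out) the "null"
table name.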

-------------------------------------------------------------------------------
Insert/overwrite statement (the SELECT statement produces correct rows):
-------------------------------------------------------------------------------
INSERT OVERWRITE TABLE HiveToPg_InterfaceMsgAuditTable2 
SELECT 
        BAM_messageId, 
        ch_messageId, 
        addedToBamTimestamp, 
        status_timestamp, 
        (addedToBamTimestamp-status_timestamp) as delta, 
        streamid, 
        interfaceid, 
        messageBody, 
        batch_identifier, 
        batch_element_identifier, 
        proxy_service, 
        sourceIP 
FROM CassToHive_InterfaceMsgAuditTable 
ORDER BY BAM_messageId; 

-------------------------------------------------------------------------------
From postgresql-[dayOfWeek].log, parse error (repeated for each row to be inserted):
-------------------------------------------------------------------------------
PARSE ERROR:  syntax error at or near "null" at character 15 
PARSE STATEMENT:  SELECT * FROM null WHERE bam_messageid=$1 

-------------------------------------------------------------------------------
From bam.log:
-------------------------------------------------------------------------------
*** YYYY-MM-DD HH:MI:SS,MS PDT|[Thread-80] ERROR DBOperation | Failed to write data to database
org.postgresql.util.PSQLException: ERROR: syntax error at or near "null"
  Position: 15
        at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2102)
        at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1835)
        at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:257)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:500)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:388)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.executeQuery(AbstractJdbc2Statement.java:273)
        at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBOperation.isRowExisting(DBOperation.java:149)
        at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBOperation.writeToDB(DBOperation.java:59)
        at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBRecordWriter.write(DBRecordWriter.java:35)
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:589)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:758)
        at org.apache.hadoop.hive.ql.exec.ExtractOperator.processOp(ExtractOperator.java:45)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
        at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:248)
        at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:518)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:419)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:257)
2015-08-12 10:20:01,178 null map = 100%,  reduce = 100% 
Ended Job = job_local_0001 
Execution completed successfully 
Mapred Local Task Succeeded . Convert the Join into MapJoin 
OK

--
View this message in context: 
http://wso2-oxygen-tank.10903.n7.nabble.com/Hive-insert-overwrite-to-a-PostGreSQL-table-failing-due-to-null-table-name-used-by-isRowExisting-metd-tp124033.html
Sent from the WSO2 Development mailing list archive at Nabble.com.