Do you mean putting the below text in <command></command> instead?
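For reference, a `<command>` element in the uri:oozie:sqoop-action:0.2 schema takes the whole Sqoop invocation as one string, which Oozie then splits on whitespace. A minimal sketch of what that form could look like (using the variable names from the action quoted below; whether it behaves correctly here is exactly the open question, since the free-form query itself contains spaces):

```xml
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <!-- Oozie tokenizes <command> on whitespace, so an argument that itself
         contains spaces (e.g. a free-form SELECT statement) gets broken into
         separate tokens. Individual <arg> elements preserve such arguments
         verbatim, which is why they are usually safer for query strings. -->
    <command>import --connect ${SWF_SYNC_TRANS_SOURCE_JDBC_CONNECTION_URL}
        --username ${SWF_SYNC_TRANS_SOURCE_HOST_USERNAME}
        --password ${SWF_SYNC_TRANS_SOURCE_HOST_PASSWORD}
        --table ${SWF_SYNC_TRANS_SOURCE_SYNC_OBJECT_NAME}
        --hive-import --create-hive-table</command>
</sqoop>
```

This is only a sketch of the element's shape, not a tested fix for the failure below.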
On Wed, Jan 30, 2013 at 9:11 PM, Kathleen Ting <[email protected]> wrote:
> Hi Nitin,
>
> Separating spaces is not needed - can you try this instead?
>
> <arg>import --connect ${SWF_SYNC_TRANS_SOURCE_JDBC_CONNECTION_URL}
> --username ${SWF_SYNC_TRANS_SOURCE_HOST_USERNAME}
> --password ${SWF_SYNC_TRANS_SOURCE_HOST_PASSWORD}
> --table ${SWF_SYNC_TRANS_SOURCE_SYNC_OBJECT_NAME}
> --where instance_id=${SWF_SYNC_TRANS_SOURCE_SYNC_DATASET_INSTANCE_ID}
> --hive-table ${SWF_SYNC_TRANS_DESTINATION_SYNC_OBJECT_NAME}_tmp_${WF_WFI_ID}
> --columns ${SWF_SYNC_TRANS_SOURCE_DATA_COL_LIST}
> --hive-import --create-hive-table
> select facility_no,max(asof_yyyymm),max(int_type_cd) as int_typ_cd,max(app_sys_no),substr(avg(base_index_cd_rollup),1,5),avg(base_rt_plus_minus_factor),avg(nominal_bank_int_rt)
> from test.loans_history_pt
> where asof_yyyymm = (select max(asof_yyyymm) from test.loans_history_pt) AND \$CONDITIONS
> Group by facility_no order by facility_no</arg>
>
> Regards,
> Kathleen
>
> On Wed, Jan 30, 2013 at 5:21 PM, Nitin kak <[email protected]> wrote:
> > I am getting a weird exception on executing this Oozie sqoop action. Any
> > clues?
> >
> > <action name="SWF_SYNC_DTRA-SQOOP_IMPORT">
> >     <sqoop xmlns="uri:oozie:sqoop-action:0.2">
> >         <job-tracker>${jobTracker}</job-tracker>
> >         <name-node>${nameNode}</name-node>
> >         <configuration>
> >             <property>
> >                 <name>sqoop.connection.factories</name>
> >                 <value>com.cloudera.sqoop.manager.NetezzaManagerFactory</value>
> >             </property>
> >             <property>
> >                 <name>oozie.hive.defaults</name>
> >                 <value>${WF_HIVESITE_PATH}</value>
> >             </property>
> >         </configuration>
> >         <arg>import</arg>
> >         <arg>--connect</arg>
> >         <arg>${SWF_SYNC_TRANS_SOURCE_JDBC_CONNECTION_URL}</arg>
> >         <arg>--username</arg>
> >         <arg>${SWF_SYNC_TRANS_SOURCE_HOST_USERNAME}</arg>
> >         <arg>--password</arg>
> >         <arg>${SWF_SYNC_TRANS_SOURCE_HOST_PASSWORD}</arg>
> >         <arg>--table</arg>
> >         <arg>${SWF_SYNC_TRANS_SOURCE_SYNC_OBJECT_NAME}</arg>
> >         <arg>--where</arg>
> >         <arg>instance_id=${SWF_SYNC_TRANS_SOURCE_SYNC_DATASET_INSTANCE_ID}</arg>
> >         <arg>--hive-table</arg>
> >         <arg>${SWF_SYNC_TRANS_DESTINATION_SYNC_OBJECT_NAME}_tmp_${WF_WFI_ID}</arg>
> >         <arg>--columns</arg>
> >         <arg>${SWF_SYNC_TRANS_SOURCE_DATA_COL_LIST}</arg>
> >         <arg>--hive-import</arg>
> >         <arg>--create-hive-table</arg>
> >     </sqoop>
> >     <ok to="SWF_SYNC_DTRA-LOAD_SYNC_TABLE"/>
> >     <error to="SWF_SYNC_DTRA-LOGEVENT_ERROR"/>
> > </action>
> >
> > Here is the stack trace:
> >
> > 148363 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalDataStoreException: Cannot get a connection, pool error Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> > 148363 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - NestedThrowables:
> > 148363 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> > 148363 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:991)
> > 148363 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:976)
> > 148363 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:7852)
> > 148363 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:7251)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:243)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:430)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.Driver.run(Driver.java:889)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:255)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:212)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:338)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:436)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:446)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:642)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:554)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > 148364 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at java.lang.reflect.Method.invoke(Method.java:597)
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - Caused by: javax.jdo.JDOFatalDataStoreException: Cannot get a connection, pool error Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - NestedThrowables:
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:298)
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:601)
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > 148365 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at java.lang.reflect.Method.invoke(Method.java:597)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at java.security.AccessController.doPrivileged(Native Method)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:246)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:275)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:208)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:183)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
> > 148366 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:407)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:359)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:504)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:266)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:228)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:114)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2111)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2121)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:989)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - ... 20 more
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:114)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
> > 148367 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:588)
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - ... 47 more
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - Caused by: java.util.NoSuchElementException: Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1191)
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
> > 148368 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - ... 57 more
> > 150657 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - FAILED: Error in metadata: javax.jdo.JDOFatalDataStoreException: Cannot get a connection, pool error Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> > 150657 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - NestedThrowables:
> > 150657 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> > 150658 [Thread-35] INFO org.apache.sqoop.hive.HiveImport - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
> > 150726 [main] ERROR org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: Hive exited with status 9
> > at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:364)
> > at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:314)
> > at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:226)
> > at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:415)
> > at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
> > at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> > at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
> > at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
> > at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
> > at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> > at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:205)
> > at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:174)
> > at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:37)
> > at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:47)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > at java.lang.reflect.Method.invoke(Method.java:597)
> > at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:472)
> > at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
> > at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:393)
> > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:327)
> > at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> > at java.security.AccessController.doPrivileged(Native Method)
> > at javax.security.auth.Subject.doAs(Subject.java:396)
> > at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> > at org.apache.hadoop.mapred.Child.main(Child.java:262)
