[
https://issues.apache.org/jira/browse/SQOOP-583?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Ruslan Al-Fakikh updated SQOOP-583:
-----------------------------------
Description:
I am getting a zero exit code even when a real exception is thrown while running a Sqoop import. A correct exit code (whether it indicates an error or not) is important because our scheduling system relies on it to notify us of any errors. Should I file a JIRA issue for this bug?
Here is what I get:
For a regular sqoop command:
{code}
[cloudera@localhost workhive]$ sqoop
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Try 'sqoop help' for usage.
[cloudera@localhost workhive]$ echo $?
1
{code}
So the exit code is correct here.
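(For context, this is roughly how the exit status gets consumed on our side. It is a simplified, hypothetical wrapper, not our actual scheduling system; the point is only that the process exit code is the one signal we act on.)
{code}
// Hypothetical wrapper, for illustration only -- not our real scheduler.
// It launches the sqoop command and treats any non-zero exit status as a failure.
import java.util.Arrays;

public class SqoopJobWrapper {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(
                Arrays.asList("sqoop", "import", "--table", "ExternalPublisher"));
        pb.inheritIO(); // pass Sqoop's console output straight through

        Process sqoop = pb.start();
        int exitCode = sqoop.waitFor();

        if (exitCode != 0) {
            // The exit status is the only signal the scheduler can alert on.
            System.err.println("Sqoop import failed with exit code " + exitCode);
        }
        System.exit(exitCode);
    }
}
{code}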
But for the import:
{code}
[cloudera@localhost workhive]$ sqoop import --username username --password password --hive-import --table ExternalPublisher --connect jdbc:sqlserver://url:port;databaseName=DBName;
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
12/08/17 20:52:39 WARN tool.BaseSqoopTool: Setting your password on
the command-line is insecure. Consider using -P instead.
12/08/17 20:52:39 INFO tool.BaseSqoopTool: Using Hive-specific
delimiters for output. You can override
12/08/17 20:52:39 INFO tool.BaseSqoopTool: delimiters with
--fields-terminated-by, etc.
12/08/17 20:52:39 INFO SqlServer.MSSQLServerManagerFactory: Using
Microsoft's SQL Server - Hadoop Connector
12/08/17 20:52:39 INFO manager.SqlManager: Using default fetchSize of 1000
12/08/17 20:52:39 INFO tool.CodeGenTool: Beginning code generation
12/08/17 20:52:42 INFO manager.SqlManager: Executing SQL statement:
SELECT TOP 1 * FROM [ExternalPublisher]
12/08/17 20:52:42 INFO manager.SqlManager: Executing SQL statement:
SELECT TOP 1 * FROM [ExternalPublisher]
12/08/17 20:52:43 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop
12/08/17 20:52:43 INFO orm.CompilationManager: Found hadoop core jar
at: /usr/lib/hadoop/hadoop-0.20.2-cdh3u4-core.jar
12/08/17 20:52:45 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-cloudera/compile/2c4caabe09a86fbb2055893836660076/ExternalPublisher.java to /home/cloudera/workhive/./ExternalPublisher.java
java.io.IOException: Destination '/home/cloudera/workhive/./ExternalPublisher.java' already exists
at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:1811)
at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:370)
at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:456)
at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
12/08/17 20:52:45 INFO orm.CompilationManager: Writing jar file:
/tmp/sqoop-cloudera/compile/2c4caabe09a86fbb2055893836660076/ExternalPublisher.jar
12/08/17 20:52:45 INFO mapreduce.ImportJobBase: Beginning import of
ExternalPublisher
12/08/17 20:52:46 INFO manager.SqlManager: Executing SQL statement:
SELECT TOP 1 * FROM [ExternalPublisher]
12/08/17 20:52:48 INFO mapred.JobClient: Cleaning up the staging area
hdfs://localhost/var/lib/hadoop-0.20/cache/mapred/mapred/staging/cloudera/.staging/job_201208072011_0004
12/08/17 20:52:48 ERROR security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory ExternalPublisher already exists
12/08/17 20:52:48 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory ExternalPublisher already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:132)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:872)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1177)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:476)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:506)
at com.cloudera.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:143)
at com.cloudera.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:203)
at com.cloudera.sqoop.manager.SqlManager.importTable(SqlManager.java:464)
at com.microsoft.sqoop.SqlServer.MSSQLServerManager.importTable(MSSQLServerManager.java:145)
at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:383)
at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:456)
at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
[cloudera@localhost workhive]$ echo $?
0
{code}
The exit code indicates success here, which is undesirable. I am not interested in why I get the FileAlreadyExistsException; I know how to handle that. What matters for maintenance is that the exit code correctly reflects the failure.
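What I would expect is that whatever error status the import tool produces reaches System.exit, so that $? is non-zero after a failed import. A minimal sketch of the pattern I mean (illustrative only, not the actual Sqoop source):
{code}
// Illustrative sketch only -- not the actual Sqoop source code.
public class ExitStatusSketch {

    // The problematic pattern: the exception is caught and logged,
    // but the failure is not reported back to the caller.
    static int runImportJob() {
        try {
            throw new java.io.IOException("Output directory ExternalPublisher already exists");
        } catch (java.io.IOException ioe) {
            System.err.println("ERROR Encountered IOException running import job: " + ioe);
            return 0; // looks like success to the shell; should be a non-zero status instead
        }
    }

    public static void main(String[] args) {
        int status = runImportJob();
        // Whatever the tool returns has to reach System.exit,
        // otherwise $? will always be 0.
        System.exit(status);
    }
}
{code}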
> Zero exit code on Exception in sqoop import
> -------------------------------------------
>
> Key: SQOOP-583
> URL: https://issues.apache.org/jira/browse/SQOOP-583
> Project: Sqoop
> Issue Type: Bug
> Affects Versions: 1.3.0
> Reporter: Ruslan Al-Fakikh
>
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators:
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira