[Moving conversation to sqoop-user@incubator.apache.org. Please subscribe to
this list.]

Please check that the disk partition holding temporary files and log files
on the task node is not full. Also, please tell us which exact versions of
Sqoop and Hadoop you are using, and include the output of the command run
with the --verbose flag.
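
For example, something along these lines should show whether the partition
backing Hadoop's local directories has space left (the paths below are only
illustrative; check hadoop.tmp.dir in core-site.xml and mapred.local.dir in
mapred-site.xml for the actual locations on your node):

    # Free space on the partitions Hadoop writes temp/staging files to
    # (substitute your configured hadoop.tmp.dir / mapred.local.dir paths)
    df -h /tmp /var/lib/hadoop

    # Re-run the same import and capture the verbose output to a file
    sqoop import --connect jdbc:mysql://localhost/sqoop_test \
        --table sample_data --username root --password <psswd> \
        --m 1 --verbose 2>&1 | tee sqoop-import.log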

Thanks,
Arvind

On Fri, Oct 14, 2011 at 7:45 AM, Patrick <pto...@gmail.com> wrote:

> I'm running a very simple command:
> sqoop import --connect jdbc:mysql://localhost/sqoop_test --table
> sample_data --username root --password <psswd> --m 1 --verbose
>
> And getting this error:
>
> 11/10/14 10:42:11 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
> 11/10/14 10:42:11 INFO util.NativeCodeLoader: Loaded the native-hadoop library
> 11/10/14 10:42:11 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@1a45a877
> 11/10/14 10:42:11 ERROR tool.ImportTool: Encountered IOException running import job: ENOENT: No such file or directory
>         at org.apache.hadoop.io.nativeio.NativeIO.chmod(Native Method)
>         at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:496)
>         at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:319)
>         at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
>         at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:126)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:839)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:476)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:506)
>         at com.cloudera.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:124)
>         at com.cloudera.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:185)
>         at com.cloudera.sqoop.manager.SqlManager.importTable(SqlManager.java:413)
>         at com.cloudera.sqoop.manager.MySQLManager.importTable(MySQLManager.java:98)
>         at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:383)
>         at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:456)
>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
>
>
> Anyone seen this before or know how to solve it?
>
> I can't tell if it's having permission issues with HDFS or if it can't
> write temporary files to my local FS. This is on RHEL6 on a CDH3 Express
> Edition install, and the executing user is in the hadoop group (root).
> Sqoop appears to be successfully generating and compiling the *.java
> file in the /temp directory too... just no data.
>
