You can specify the -conf <dir> option directly to Sqoop and it
should be passed down. This is considered a generic argument and must
precede any tool-specific arguments. For example:

$ sqoop import -conf /path/to/conf/dir --connect ...

Alternatively, you could set the HADOOP_CONF_DIR environment variable
to point at the directory from which the configuration files should be
picked up.
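For example (a minimal sketch; /path/to/conf/dir stands in for wherever
your site configuration files actually live):

$ export HADOOP_CONF_DIR=/path/to/conf/dir
$ sqoop import --connect ...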

Thanks,
Arvind

On Fri, Oct 14, 2011 at 10:14 AM, Patrick Toole <pto...@gmail.com> wrote:
> Everything is local - just getting sqoop up and running.
> It appears I've solved it, however. There was a problem with my default
> config directory (I was passing in the config dir when using hadoop):
> hadoop --config . fs -ls
> 11/10/14 13:06:20 WARN conf.Configuration: bad conf file: element not
> <property>
> 11/10/14 13:06:20 WARN conf.Configuration: bad conf file: element not
> <property>
> Found 1 items
> drwxr-xr-x   - ptoole hadoop          0 2011-10-14 12:10 /user/ptoole/files
>
> Correcting my hdfs-site.xml cleared the warnings and fixed Sqoop as well
> (a sketch of a well-formed file is below).
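>
> Hadoop logs the "bad conf file: element not <property>" warning when a
> direct child of <configuration> is anything other than a <property>
> element. A minimal well-formed hdfs-site.xml looks like the following
> sketch (the dfs.replication value is just an illustrative single-node
> setting, not necessarily what I had in mine):
>
> <?xml version="1.0"?>
> <configuration>
>   <property>
>     <name>dfs.replication</name>
>     <value>1</value>
>   </property>
> </configuration>
>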
> However, this leads me to a different question; is there a way for me to
> specify the "--config <dir>" on the sqoop command line (similar to the
> hadoop --config switch)?
>
>
> On Fri, Oct 14, 2011 at 12:55 PM, Arvind Prabhakar <arv...@apache.org>
> wrote:
>>
>> I see you checked the disk space on the machine from which you are
>> running Sqoop. Please confirm whether that is the same machine where
>> you are running Hadoop in pseudo-distributed mode. If not, can you
>> check the disk space on the various partitions of the Hadoop machines
>> (task nodes)?
>>
>> Thanks,
>> Arvind
>>
>> On Fri, Oct 14, 2011 at 9:11 AM, Patrick Toole <pto...@gmail.com> wrote:
>> > Thanks in advance!
>> >
>> > The disk is not full:
>> > [ptoole@ptoole hadoop-sqoop]$ df -h
>> > Filesystem            Size  Used Avail Use% Mounted on
>> > /dev/sda1             455G   75G  358G  18% /
>> > tmpfs                 3.9G  744K  3.9G   1% /dev/shm
>> > Here are my versions:
>> > [ptoole@ptoole hadoop-sqoop]$ sqoop version
>> > Sqoop 1.3.0-cdh3u1
>> > git commit id 3a60cc809b14d538dd1eb0e90ffa9767e8d06a43
>> > Compiled by jenkins@ubuntu-slave01 on Mon Jul 18 08:38:49 PDT 2011
>> > [ptoole@ptoole hadoop-sqoop]$ hadoop version
>> > Hadoop 0.20.2-cdh3u1
>> > Subversion file:///tmp/topdir/BUILD/hadoop-0.20.2-cdh3u1 -r
>> > bdafb1dbffd0d5f2fbc6ee022e1c8df6500fd638
>> > Compiled by root on Mon Jul 18 09:40:26 PDT 2011
>> > From source with checksum 3127e3d410455d2bacbff7673bf3284c
>> > Additional info:
>> > [ptoole@ptoole hadoop-sqoop]$ uname -a
>> > Linux ptoole 2.6.32-131.0.15.el6.x86_64 #1 SMP Tue May 10 15:42:40 EDT
>> > 2011
>> > x86_64 x86_64 x86_64 GNU/Linux
>> > [ptoole@ptoole hadoop-sqoop]$ id ptoole
>> > uid=502(ptoole) gid=503(ptoole) groups=503(ptoole),486(hadoop)
>> > [ptoole@ptoole hadoop-sqoop]$ hadoop --config
>> > /home/ptoole/Desktop/work/opensource/hadoop-sqoop-vertica/hadoop-conf fs
>> > -ls
>> > /user/ptoole
>> > Found 1 items
>> > drwxr-xr-x   - ptoole hadoop          0 2011-10-14 12:10
>> > /user/ptoole/files
>> >
>> > Here's the verbose command:
>> > [ptoole@ptoole hadoop-sqoop]$ sqoop import --connect
>> > jdbc:mysql://localhost/sqoop_test --table sample_data --username root
>> > --password <passwd>  --m 1 --verbose
>> > 11/10/14 12:05:37 DEBUG tool.BaseSqoopTool: Enabled debug logging.
>> > 11/10/14 12:05:37 WARN tool.BaseSqoopTool: Setting your password on the
>> > command-line is insecure. Consider using -P instead.
>> > 11/10/14 12:05:37 DEBUG sqoop.ConnFactory: Loaded manager factory:
>> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> > 11/10/14 12:05:37 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
>> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> > 11/10/14 12:05:37 DEBUG manager.DefaultManagerFactory: Trying with
>> > scheme:
>> > jdbc:mysql:
>> > 11/10/14 12:05:37 INFO manager.MySQLManager: Preparing to use a MySQL
>> > streaming resultset.
>> > 11/10/14 12:05:37 DEBUG sqoop.ConnFactory: Instantiated ConnManager
>> > com.cloudera.sqoop.manager.MySQLManager@3c1d332b
>> > 11/10/14 12:05:37 INFO tool.CodeGenTool: Beginning code generation
>> > 11/10/14 12:05:37 DEBUG manager.SqlManager: No connection paramenters
>> > specified. Using regular API for making connection.
>> > 11/10/14 12:05:37 DEBUG manager.SqlManager: Using fetchSize for next
>> > query:
>> > -2147483648
>> > 11/10/14 12:05:37 INFO manager.SqlManager: Executing SQL statement:
>> > SELECT
>> > t.* FROM `sample_data` AS t LIMIT 1
>> > 11/10/14 12:05:37 DEBUG manager.SqlManager: Using fetchSize for next
>> > query:
>> > -2147483648
>> > 11/10/14 12:05:37 INFO manager.SqlManager: Executing SQL statement:
>> > SELECT
>> > t.* FROM `sample_data` AS t LIMIT 1
>> > 11/10/14 12:05:37 DEBUG orm.ClassWriter: selected columns:
>> > 11/10/14 12:05:37 DEBUG orm.ClassWriter:   example_col
>> > 11/10/14 12:05:37 DEBUG orm.ClassWriter: Writing source file:
>> >
>> > /tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07/sample_data.java
>> > 11/10/14 12:05:37 DEBUG orm.ClassWriter: Table name: sample_data
>> > 11/10/14 12:05:37 DEBUG orm.ClassWriter: Columns: example_col:12,
>> > 11/10/14 12:05:37 DEBUG orm.ClassWriter: sourceFilename is
>> > sample_data.java
>> > 11/10/14 12:05:37 DEBUG orm.CompilationManager: Found existing
>> > /tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07/
>> > 11/10/14 12:05:37 INFO orm.CompilationManager: HADOOP_HOME is
>> > /usr/lib/hadoop
>> > 11/10/14 12:05:37 INFO orm.CompilationManager: Found hadoop core jar at:
>> > /usr/lib/hadoop/hadoop-core.jar
>> > 11/10/14 12:05:37 DEBUG orm.CompilationManager: Adding source file:
>> >
>> > /tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07/sample_data.java
>> > 11/10/14 12:05:37 DEBUG orm.CompilationManager: Invoking javac with
>> > args:
>> > 11/10/14 12:05:37 DEBUG orm.CompilationManager:   -sourcepath
>> > 11/10/14 12:05:37 DEBUG orm.CompilationManager:
>> > /tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07/
>> > 11/10/14 12:05:37 DEBUG orm.CompilationManager:   -d
>> > 11/10/14 12:05:37 DEBUG orm.CompilationManager:
>> > /tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07/
>> > 11/10/14 12:05:37 DEBUG orm.CompilationManager:   -classpath
>> > 11/10/14 12:05:37 DEBUG orm.CompilationManager:
>> >
>> > /usr/lib/hadoop/conf:/usr/java/jdk1.6.0_21/lib/tools.jar:/usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u1.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u1.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/hue-plugins-1.2.0-cdh3u1.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/etc/zookeeper::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/avro-1.5.1.jar:/usr/lib/sqoop/lib/avro-ipc-1.5.1.jar:/usr/lib/sqoop/lib/avro-mapred-1.5.1.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/lib/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/lib/sqoop/lib/jopt-simple-3.2.jar:/usr/lib/sqoop/lib/mysql-connector-java-5.1.18-bin.jar:/usr/lib/sqoop/lib/paranamer-2.3.jar:/usr/lib/sqoop/lib/snappy-java-1.0.3-rc2.jar:/usr/lib/hbase/bin/../conf:/usr/java/jdk1.6.0_21/lib/tools.jar:/usr/lib/hbase/bin/..:/usr/lib/hbase/bin/../hbase-0.90.3-cdh3u1.jar:/usr/lib/hbase/bin/../hbase-0.90.3-cdh3u1-tests.jar:/usr/lib/hbase/bin/../lib/activation-1.1.jar:/usr/lib/hbase/bin/../lib/asm-3.1.jar:/usr/lib/hbase/bin/../lib/avro-1.3.3.jar:/usr/lib/hbase/bin/../lib/commons-cli-1.2.jar:/usr/lib/hbase/bin/../lib/commons-codec-1.4.jar:/usr/lib/hbase/bin/../lib/commons-el-1.0.jar:/usr/lib/hbase/bin/../lib/commons-httpclient-3.1.jar:/usr/lib/hbase/bin/../lib/commons-lang-2.5.jar:/usr/lib/hbase/bin/../lib/commons-logging-1.1.1.jar:/usr/lib/hbase/bin/../lib/commons-net-1.4.1.jar:/usr/lib/hbase/bin/../lib/core-3.1.1.jar:/usr/lib/hbase/bin/../lib/guava-r06.jar:/usr/lib/hbase/bin/../lib/hadoop-core.jar:/usr/lib/hbase/bin/../lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/bin/../lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/bin/../lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/bin/../lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/bin/../lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/bin/../lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/bin/../lib/jaxb-api-2.1.jar:/usr/lib/hbase/bin/../lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/bin/../lib/jersey-core-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-json-1.4.jar:/usr/lib/hbase/bin/../lib/jersey-server-1.4.jar:/usr/l
ib/hbase/bin/../lib/jettison-1.1.jar:/usr/lib/hbase/bin/../lib/jetty-6.1.26.jar:/usr/lib/hbase/bin/../lib/jetty-util-6.1.26.jar:/usr/lib/hbase/bin/../lib/jruby-complete-1.6.0.jar:/usr/lib/hbase/bin/../lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1.jar:/usr/lib/hbase/bin/../lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/bin/../lib/log4j-1.2.16.jar:/usr/lib/hbase/bin/../lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5.jar:/usr/lib/hbase/bin/../lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/bin/../lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/bin/../lib/stax-api-1.0.1.jar:/usr/lib/hbase/bin/../lib/thrift-0.2.0.jar:/usr/lib/hbase/bin/../lib/xmlenc-0.52.jar:/usr/lib/hbase/bin/../lib/zookeeper.jar:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar:/usr/lib/sqoop/sqoop-test-1.3.0-cdh3u1.jar::/usr/lib/hadoop/hadoop-core.jar:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> > 11/10/14 12:05:38 INFO orm.CompilationManager: Writing jar file:
>> >
>> > /tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07/sample_data.jar
>> > 11/10/14 12:05:38 DEBUG orm.CompilationManager: Scanning for .class
>> > files in
>> > directory: /tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07
>> > 11/10/14 12:05:38 DEBUG orm.CompilationManager: Got classfile:
>> >
>> > /tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07/sample_data.class
>> > -> sample_data.class
>> > 11/10/14 12:05:38 DEBUG orm.CompilationManager: Finished writing jar
>> > file
>> >
>> > /tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07/sample_data.jar
>> > 11/10/14 12:05:38 WARN manager.MySQLManager: It looks like you are
>> > importing
>> > from mysql.
>> > 11/10/14 12:05:38 WARN manager.MySQLManager: This transfer can be
>> > faster!
>> > Use the --direct
>> > 11/10/14 12:05:38 WARN manager.MySQLManager: option to exercise a
>> > MySQL-specific fast path.
>> > 11/10/14 12:05:38 INFO manager.MySQLManager: Setting zero DATETIME
>> > behavior
>> > to convertToNull (mysql)
>> > 11/10/14 12:05:38 DEBUG manager.MySQLManager: Rewriting connect string
>> > to
>> > jdbc:mysql://localhost/sqoop_test?zeroDateTimeBehavior=convertToNull
>> > 11/10/14 12:05:38 INFO mapreduce.ImportJobBase: Beginning import of
>> > sample_data
>> > 11/10/14 12:05:38 DEBUG util.ClassLoaderStack: Checking for existing
>> > class:
>> > sample_data
>> > 11/10/14 12:05:38 DEBUG util.ClassLoaderStack: Attempting to load jar
>> > through URL:
>> >
>> > jar:file:///tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07/sample_data.jar!/
>> > 11/10/14 12:05:38 DEBUG util.ClassLoaderStack: Previous classloader is
>> > sun.misc.Launcher$AppClassLoader@6d6f0472
>> > 11/10/14 12:05:38 DEBUG util.ClassLoaderStack: Testing class in jar:
>> > sample_data
>> > 11/10/14 12:05:38 DEBUG util.ClassLoaderStack: Loaded jar into current
>> > JVM:
>> >
>> > jar:file:///tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07/sample_data.jar!/
>> > 11/10/14 12:05:38 DEBUG util.ClassLoaderStack: Added classloader for jar
>> >
>> > /tmp/sqoop-ptoole/compile/cb8a4d7e585bc91be77605c4032afe07/sample_data.jar:
>> > java.net.FactoryURLClassLoader@3caa4b
>> > 11/10/14 12:05:38 DEBUG manager.SqlManager: Using fetchSize for next
>> > query:
>> > -2147483648
>> > 11/10/14 12:05:38 INFO manager.SqlManager: Executing SQL statement:
>> > SELECT
>> > t.* FROM `sample_data` AS t LIMIT 1
>> > 11/10/14 12:05:38 DEBUG mapreduce.DataDrivenImportJob: Using table
>> > class:
>> > sample_data
>> > 11/10/14 12:05:38 DEBUG mapreduce.DataDrivenImportJob: Using
>> > InputFormat:
>> > class com.cloudera.sqoop.mapreduce.db.DataDrivenDBInputFormat
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/mysql-connector-java-5.1.18-bin.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/paranamer-2.3.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/commons-io-1.4.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/jackson-core-asl-1.7.3.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/avro-mapred-1.5.1.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/jopt-simple-3.2.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/jackson-mapper-asl-1.7.3.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/snappy-java-1.0.3-rc2.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/avro-ipc-1.5.1.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/mysql-connector-java-5.1.18-bin.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
>> > 11/10/14 12:05:38 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/avro-1.5.1.jar
>> > 11/10/14 12:05:38 INFO jvm.JvmMetrics: Initializing JVM Metrics with
>> > processName=JobTracker, sessionId=
>> > 11/10/14 12:05:38 INFO util.NativeCodeLoader: Loaded the native-hadoop
>> > library
>> > 11/10/14 12:05:38 DEBUG util.ClassLoaderStack: Restoring classloader:
>> > sun.misc.Launcher$AppClassLoader@6d6f0472
>> > 11/10/14 12:05:38 ERROR tool.ImportTool: Encountered IOException running
>> > import job: ENOENT: No such file or directory
>> > at org.apache.hadoop.io.nativeio.NativeIO.chmod(Native Method)
>> > at
>> >
>> > org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:496)
>> > at
>> >
>> > org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:319)
>> > at
>> > org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
>> > at
>> >
>> > org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:126)
>> > at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:839)
>> > at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
>> > at java.security.AccessController.doPrivileged(Native Method)
>> > at javax.security.auth.Subject.doAs(Subject.java:396)
>> > at
>> >
>> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>> > at
>> > org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
>> > at org.apache.hadoop.mapreduce.Job.submit(Job.java:476)
>> > at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:506)
>> > at
>> > com.cloudera.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:121)
>> > at
>> >
>> > com.cloudera.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:181)
>> > at
>> > com.cloudera.sqoop.manager.SqlManager.importTable(SqlManager.java:405)
>> > at
>> >
>> > com.cloudera.sqoop.manager.MySQLManager.importTable(MySQLManager.java:132)
>> > at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:350)
>> > at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423)
>> > at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> > at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>> > at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>> > at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>> > at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>> >
>> >
>> >
>> > On Fri, Oct 14, 2011 at 11:36 AM, arv...@cloudera.com
>> > <arv...@cloudera.com>
>> > wrote:
>> >>
>> >> [Moving conversation to sqoop-user@incubator.apache.org. Please
>> >> subscribe
>> >> to this list.]
>> >> Please check that the disk partition holding temporary files or log
>> >> files is not full on the task node. Also, please tell us which exact
>> >> versions of Sqoop and Hadoop you are using, and the output of the
>> >> command with the --verbose flag.
>> >> Thanks,
>> >> Arvind
>> >>
>> >> On Fri, Oct 14, 2011 at 7:45 AM, Patrick <pto...@gmail.com> wrote:
>> >>>
>> >>> I'm running a very simple command:
>> >>> sqoop import --connect jdbc:mysql://localhost/sqoop_test --table
>> >>> sample_data --username root --password <psswd> --m 1 --verbose
>> >>>
>> >>> And getting this error:
>> >>>
>> >>> 11/10/14 10:42:11 INFO jvm.JvmMetrics: Initializing JVM Metrics with
>> >>> processName=JobTracker, sessionId=
>> >>> 11/10/14 10:42:11 INFO util.NativeCodeLoader: Loaded the native-hadoop
>> >>> library
>> >>> 11/10/14 10:42:11 DEBUG util.ClassLoaderStack: Restoring classloader:
>> >>> sun.misc.Launcher$AppClassLoader@1a45a877
>> >>> 11/10/14 10:42:11 ERROR tool.ImportTool: Encountered IOException
>> >>> running import job: ENOENT: No such file or directory
>> >>>        at org.apache.hadoop.io.nativeio.NativeIO.chmod(Native Method)
>> >>>        at
>> >>>
>> >>>
>> >>> org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:
>> >>> 496)
>> >>>        at
>> >>>
>> >>> org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:
>> >>> 319)
>> >>>        at
>> >>> org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:
>> >>> 189)
>> >>>        at
>> >>>
>> >>>
>> >>> org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:
>> >>> 126)
>> >>>        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:839)
>> >>>        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
>> >>>        at java.security.AccessController.doPrivileged(Native Method)
>> >>>        at javax.security.auth.Subject.doAs(Subject.java:396)
>> >>>        at
>> >>>
>> >>>
>> >>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:
>> >>> 1127)
>> >>>        at
>> >>> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:
>> >>> 833)
>> >>>        at org.apache.hadoop.mapreduce.Job.submit(Job.java:476)
>> >>>        at
>> >>> org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:506)
>> >>>        at
>> >>> com.cloudera.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:
>> >>> 124)
>> >>>        at
>> >>>
>> >>> com.cloudera.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:
>> >>> 185)
>> >>>        at
>> >>> com.cloudera.sqoop.manager.SqlManager.importTable(SqlManager.java:
>> >>> 413)
>> >>>        at
>> >>> com.cloudera.sqoop.manager.MySQLManager.importTable(MySQLManager.java:
>> >>> 98)
>> >>>        at
>> >>> com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:
>> >>> 383)
>> >>>        at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:456)
>> >>>        at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
>> >>>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> >>>        at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
>> >>>        at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
>> >>>        at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
>> >>>        at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
>> >>>
>> >>>
>> >>> Anyone seen this before or know how to solve it?
>> >>>
>> >>> I can't tell if it's having permission issues with HDFS or if it
>> >>> can't write temporary files to my local FS. This is on RHEL6 with a
>> >>> CDH3 Express install, and the executing user is in the hadoop group
>> >>> (root). Sqoop appears to be successfully generating and compiling the
>> >>> *.java file in the /tmp directory too... just no data.
>> >>>
>> >>
>> >
>> >
>
>
