Hi Chun-fan,
thank you very much for sharing the log with us. You are using the Microsoft SQL Server connector, which you downloaded manually from Microsoft's web pages; you can also confirm that from the following log lines:
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
...
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd

I'm not sure what is going wrong, as it seems that the data were parsed correctly but the query submitted to SQL Server fails. As a next step I would recommend turning the Microsoft connector off and using the built-in one instead, to see whether the issue is specific to Sqoop or to the connector. You can do that by temporarily moving the file /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver somewhere else; a sketch of the commands is at the end of this message.

Jarcec

On Wed, Dec 05, 2012 at 12:25:24PM +0800, Chun-fan Ivan Liao wrote:
> Thank you, Jarcec. I'm not sure which connector we use. I've downloaded
> "Microsoft SQL Server Connector for Apache Hadoop" from
> http://www.microsoft.com/en-us/download/details.aspx?id=27584, but I don't
> remember if we really used that. How can I make sure?
>
> And here is the verbose log:
>
> ===========
> 12/12/05 12:08:57 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> com.cloudera.sqoop.manager.DefaultManagerFactory
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> 12/12/05 12:08:57 INFO SqlServer.MSSQLServerManagerFactory: Using
> Microsoft's SQL Server - Hadoop Connector
> 12/12/05 12:08:57 INFO manager.SqlManager: Using default fetchSize of 1000
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> 12/12/05 12:08:57 INFO tool.CodeGenTool: Beginning code generation
> 12/12/05 12:08:57 DEBUG manager.SqlManager: No connection paramenters
> specified. Using regular API for making connection.
> 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next query: > 1000 > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement: SELECT > TOP 1 * FROM [member_main] > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next query: > 1000 > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement: SELECT > TOP 1 * FROM [member_main] > 12/12/05 12:08:57 DEBUG orm.ClassWriter: selected columns: > 12/12/05 12:08:57 DEBUG orm.ClassWriter: MemberId > 12/12/05 12:08:57 DEBUG orm.ClassWriter: USERNAME > 12/12/05 12:08:57 DEBUG orm.ClassWriter: FirstName > 12/12/05 12:08:57 DEBUG orm.ClassWriter: LastName > 12/12/05 12:08:57 DEBUG orm.ClassWriter: EmailAddress > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Password > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Password_E5 > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Birthday > 12/12/05 12:08:57 DEBUG orm.ClassWriter: CompanyName > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Gender > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Age > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Education > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Country > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Title > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Phone1 > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Phone2 > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Fax > 12/12/05 12:08:57 DEBUG orm.ClassWriter: State > 12/12/05 12:08:57 DEBUG orm.ClassWriter: City > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Address1 > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Address2 > 12/12/05 12:08:57 DEBUG orm.ClassWriter: ZipCode > 12/12/05 12:08:57 DEBUG orm.ClassWriter: VATID > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Language > 12/12/05 12:08:57 DEBUG orm.ClassWriter: rec_letter > 12/12/05 12:08:57 DEBUG orm.ClassWriter: rec_promotion > 12/12/05 12:08:57 DEBUG orm.ClassWriter: rec_type > 12/12/05 12:08:57 DEBUG orm.ClassWriter: JointSource > 12/12/05 12:08:57 DEBUG orm.ClassWriter: CustomerLevel > 12/12/05 12:08:57 DEBUG orm.ClassWriter: UpdateDate > 12/12/05 12:08:57 DEBUG orm.ClassWriter: CreateDate > 12/12/05 12:08:57 DEBUG orm.ClassWriter: FirstLoginDate > 12/12/05 12:08:57 DEBUG orm.ClassWriter: LastLoginDate > 12/12/05 12:08:57 DEBUG orm.ClassWriter: LastVisit > 12/12/05 12:08:57 DEBUG orm.ClassWriter: isValid > 12/12/05 12:08:57 DEBUG orm.ClassWriter: nJoint > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Upd_SubDate > 12/12/05 12:08:57 DEBUG orm.ClassWriter: UnSub_Type > 12/12/05 12:08:57 DEBUG orm.ClassWriter: CreateDateFloat > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Writing source file: > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Table name: member_main > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Columns: MemberId:4, USERNAME:12, > FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12, > Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5, > Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9, > Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7, > rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4, > UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93, > LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4, > CreateDateFloat:8, > 12/12/05 12:08:57 DEBUG orm.ClassWriter: sourceFilename is member_main.java > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Found existing > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/ > 12/12/05 12:08:57 INFO orm.CompilationManager: HADOOP_HOME 
is > /usr/local/hadoop/libexec/.. > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Adding source file: > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Invoking javac with args: > 12/12/05 12:08:57 DEBUG orm.CompilationManager: -sourcepath > 12/12/05 12:08:57 DEBUG orm.CompilationManager: > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/ > 12/12/05 12:08:57 DEBUG orm.CompilationManager: -d > 12/12/05 12:08:57 DEBUG orm.CompilationManager: > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/ > 12/12/05 12:08:57 DEBUG orm.CompilationManager: -classpath > 12/12/05 12:08:57 DEBUG orm.CompilationManager: > /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1
.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/commons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib
/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar > 12/12/05 12:08:58 ERROR orm.CompilationManager: Could not rename > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java > to /home/hadoop/_scripts/1-hadoop/member/./member_main.java > org.apache.commons.io.FileExistsException: Destination > '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists > at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378) > at > com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229) > at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85) > at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66) > at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99) > at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146) > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65) > at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182) > at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221) > at 
com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230) > at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239) > 12/12/05 12:08:58 INFO orm.CompilationManager: Writing jar file: > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Scanning for .class files > in directory: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971 > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Got classfile: > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.class > -> member_main.class > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Finished writing jar file > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar > 12/12/05 12:08:58 INFO mapreduce.ExportJobBase: Beginning export of > member_main > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Using InputFormat: class > com.cloudera.sqoop.mapreduce.ExportInputFormat > 12/12/05 12:08:58 DEBUG manager.SqlManager: Using fetchSize for next query: > 1000 > 12/12/05 12:08:58 INFO manager.SqlManager: Executing SQL statement: SELECT > TOP 1 * FROM [member_main] > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/sqljdbc4.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/avro-1.5.4.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/sqljdbc4.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/paranamer-2.3.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/commons-io-1.4.jar > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: > file:/usr/local/sqoop/lib/jopt-simple-3.2.jar > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process > : 1 > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1 > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Total input bytes=2611 > 12/12/05 12:09:00 DEBUG 
mapreduce.ExportInputFormat: maxSplitSize=2611 > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process > : 1 > 12/12/05 12:09:00 INFO util.NativeCodeLoader: Loaded the native-hadoop > library > 12/12/05 12:09:00 WARN snappy.LoadSnappy: Snappy native library not loaded > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Generated splits: > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: > Paths:/user/hadoop/test-ivan/test:0+2611 Locations:hadoop05:; > 12/12/05 12:09:00 INFO mapred.JobClient: Running job: job_201212041541_0107 > 12/12/05 12:09:01 INFO mapred.JobClient: map 0% reduce 0% > 12/12/05 12:09:18 INFO mapred.JobClient: Task Id : > attempt_201212041541_0107_m_000000_0, Status : FAILED > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: > Incorrect syntax near ','. > at > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195) > at > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651) > at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766) > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370) > at org.apache.hadoop.mapred.Child$4.run(Child.java:255) > at java.security.AccessController.doPrivileged(Native Method) > at javax.security.auth.Subject.doAs(Subject.java:416) > at > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121) > at org.apache.hadoop.mapred.Child.main(Child.java:249) > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect > syntax near ','. > at > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197) > at > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493) > at > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390) > at > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340) > at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575) > at > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400) > at > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179) > at > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154) > at > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322) > at > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234) > > 12/12/05 12:09:24 INFO mapred.JobClient: Task Id : > attempt_201212041541_0107_m_000000_1, Status : FAILED > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: > Incorrect syntax near ','. > at > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195) > at > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651) > at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766) > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370) > at org.apache.hadoop.mapred.Child$4.run(Child.java:255) > at java.security.AccessController.doPrivileged(Native Method) > at javax.security.auth.Subject.doAs(Subject.java:416) > at > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121) > at org.apache.hadoop.mapred.Child.main(Child.java:249) > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect > syntax near ','. 
> at > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197) > at > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493) > at > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390) > at > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340) > at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575) > at > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400) > at > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179) > at > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154) > at > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322) > at > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234) > > 12/12/05 12:09:30 INFO mapred.JobClient: Task Id : > attempt_201212041541_0107_m_000000_2, Status : FAILED > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: > Incorrect syntax near ','. > at > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195) > at > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651) > at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766) > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370) > at org.apache.hadoop.mapred.Child$4.run(Child.java:255) > at java.security.AccessController.doPrivileged(Native Method) > at javax.security.auth.Subject.doAs(Subject.java:416) > at > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121) > at org.apache.hadoop.mapred.Child.main(Child.java:249) > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect > syntax near ','. 
> at > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197) > at > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493) > at > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390) > at > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340) > at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575) > at > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400) > at > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179) > at > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154) > at > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322) > at > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234) > > 12/12/05 12:09:41 INFO mapred.JobClient: Job complete: job_201212041541_0107 > 12/12/05 12:09:41 INFO mapred.JobClient: Counters: 8 > 12/12/05 12:09:41 INFO mapred.JobClient: Job Counters > 12/12/05 12:09:41 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=24379 > 12/12/05 12:09:41 INFO mapred.JobClient: Total time spent by all > reduces waiting after reserving slots (ms)=0 > 12/12/05 12:09:41 INFO mapred.JobClient: Total time spent by all maps > waiting after reserving slots (ms)=0 > 12/12/05 12:09:41 INFO mapred.JobClient: Rack-local map tasks=3 > 12/12/05 12:09:41 INFO mapred.JobClient: Launched map tasks=4 > 12/12/05 12:09:41 INFO mapred.JobClient: Data-local map tasks=1 > 12/12/05 12:09:41 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0 > 12/12/05 12:09:41 INFO mapred.JobClient: Failed map tasks=1 > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Transferred 0 bytes in > 43.0875 seconds (0 bytes/sec) > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Exported 0 records. > 12/12/05 12:09:41 ERROR tool.ExportTool: Error during export: Export job > failed! > ================ > > Kind regards, > Chun-fan > > On Wed, Dec 5, 2012 at 12:01 AM, Jarek Jarcec Cecho <[email protected]>wrote: > > > HiChun-fan, > > would you mind sharing with us entire Sqoop log generated with parameter > > --verbose? Are you using build-in Microsoft SQL Connector or connector > > provided by Microsoft? > > > > Jarcec > > > > On Tue, Dec 04, 2012 at 05:51:31PM +0800, Chun-fan Ivan Liao wrote: > > > Hi, > > > > > > > > > > > > We are using Sqoop 1.3.0-cdh3u4 with Hadoop version 1.0.3. > > > > > > > > > > > > We encountered the following error when we try to export HDFS file into > > > MSSQL 2005 (partially): > > > > > > > > > > > > 12/12/04 16:44:13 INFO mapred.JobClient: Task Id : > > > attempt_201212041541_0014_m_000000_2, Status : FAILED > > > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: > > > Incorrect syntax near ','. 
> > > > > > at > > > > > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195) > > > > > > at > > > > > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651) > > > > > > at > > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766) > > > > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370) > > > > > > at org.apache.hadoop.mapred.Child$4.run(Child.java:255) > > > > > > at java.security.AccessController.doPrivileged(Native Method) > > > > > > at javax.security.auth.Subject.doAs(Subject.java:416) > > > > > > at > > > > > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121) > > > > > > at org.apache.hadoop.mapred.Child.main(Child.java:249) > > > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect > > > syntax near ','. > > > > > > at > > > > > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197) > > > > > > at > > > > > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493) > > > > > > at > > > > > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390) > > > > > > at > > > > > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340) > > > > > > at > > > com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575) > > > > > > at > > > > > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400) > > > > > > at > > > > > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179) > > > > > > at > > > > > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154) > > > > > > at > > > > > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322) > > > > > > at > > > > > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234) > > > > > > > > > > > > The HDFS file that we want to export was imported using sqoop from SQL > > 2005 > > > before and uses ‘|’ as field delimiter, and there are commas (‘,’) in a > > > field of a line in the file. > > > > > > > > > > > > The commands I submitted is (generalized with capital letters): > > > > > > $ sqoop export -D sqoop.export.records.per.statement=75 -D > > > sqoop.export.statements.per.transaction=75 --connect > > > > > "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME" > > > --table TABLE_NAME -m 1 --input-fields-terminated-by '|' --export-dir > > > /EXPORT/FROM/DIRECTORY > > > > > > > > > > > > I’ve adjusted values of sqoop.export.records.per.statement & > > > sqoop.export.statements.per.transaction, but that didn’t help. > > > > > > > > > > > > It will be greatly appreciated if you can offer some help. Thanks. > > > > > > > > > > > > Ivan > >
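
P.S. Here is a minimal sketch of the test I have in mind, assuming the file locations shown in your log. The parking location under /tmp and the generalized connection placeholders (SERVER-NAME, USER_NAME, and so on) from your earlier mail are only examples, so substitute your real values:

# Temporarily move the Microsoft connector definition aside so that Sqoop
# falls back to its built-in SQL Server support (the /tmp path is just an example).
mv /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver /tmp/mssqoop-sqlserver.disabled

# Re-run the same export with --verbose so we get a comparable log;
# the connection values are the generalized placeholders from your earlier mail.
sqoop export -D sqoop.export.records.per.statement=75 \
  -D sqoop.export.statements.per.transaction=75 \
  --verbose \
  --connect "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME" \
  --table TABLE_NAME -m 1 \
  --input-fields-terminated-by '|' \
  --export-dir /EXPORT/FROM/DIRECTORY

# Put the connector definition back once the test is done.
mv /tmp/mssqoop-sqlserver.disabled /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver

If the export still fails the same way with the built-in manager, the problem is likely on the Sqoop side rather than in the Microsoft connector.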
