Thanks,
but I tried using MySQL, and even there I am unable to transfer the data.
See this:
the job reports success, but the records are not transferred, and I don't know why.

[hduser@master bin]$ sqoop import-all-tables --connect
'jdbc:mysql://<IP>/Event' --username=username --password=Password
Warning: $HADOOP_HOME is deprecated.

12/07/25 18:02:45 WARN tool.BaseSqoopTool: Setting your password on
the command-line is insecure. Consider using -P instead.
12/07/25 18:02:45 INFO manager.MySQLManager: Preparing to use a MySQL
streaming resultset.
12/07/25 18:02:52 INFO tool.CodeGenTool: Beginning code generation
12/07/25 18:02:52 INFO manager.SqlManager: Executing SQL statement:
SELECT t.* FROM `ka_eventlog` AS t LIMIT 1
12/07/25 18:02:53 INFO orm.CompilationManager: HADOOP_HOME is
/usr/local/hadoop/libexec/..
Note: 
/tmp/sqoop-hduser/compile/9f2b8cc1b14c1ba89f4fe5482735b504/ka_eventlog.java
uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/25 18:02:54 INFO orm.CompilationManager: Writing jar file:
/tmp/sqoop-hduser/compile/9f2b8cc1b14c1ba89f4fe5482735b504/ka_eventlog.jar
12/07/25 18:02:54 WARN manager.MySQLManager: It looks like you are
importing from mysql.
12/07/25 18:02:54 WARN manager.MySQLManager: This transfer can be
faster! Use the --direct
12/07/25 18:02:54 WARN manager.MySQLManager: option to exercise a
MySQL-specific fast path.
12/07/25 18:02:54 INFO manager.MySQLManager: Setting zero DATETIME
behavior to convertToNull (mysql)
12/07/25 18:02:58 WARN manager.CatalogQueryManager: The table
ka_eventlog contains a multi-column primary key. Sqoop will default to
the column EventLogID only for this job.
12/07/25 18:02:59 WARN manager.CatalogQueryManager: The table
ka_eventlog contains a multi-column primary key. Sqoop will default to
the column EventLogID only for this job.
12/07/25 18:02:59 INFO mapreduce.ImportJobBase: Beginning import of ka_eventlog
12/07/25 18:03:08 INFO db.DataDrivenDBInputFormat: BoundingValsQuery:
SELECT MIN(`EventLogID`), MAX(`EventLogID`) FROM `ka_eventlog`
12/07/25 18:03:10 INFO mapred.JobClient: Running job: job_201207251610_0001
12/07/25 18:03:11 INFO mapred.JobClient:  map 0% reduce 0%
12/07/25 18:03:40 INFO mapred.JobClient:  map 25% reduce 0%
12/07/25 18:03:43 INFO mapred.JobClient:  map 50% reduce 0%
12/07/25 18:03:47 INFO mapred.JobClient:  map 75% reduce 0%
12/07/25 18:03:53 INFO mapred.JobClient:  map 100% reduce 0%
12/07/25 18:07:27 INFO mapred.JobClient: Job complete: job_201207251610_0001
12/07/25 18:07:27 INFO mapred.JobClient: Counters: 18
12/07/25 18:07:27 INFO mapred.JobClient:   Job Counters
12/07/25 18:07:27 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=898548
12/07/25 18:07:27 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
12/07/25 18:07:27 INFO mapred.JobClient:     Total time spent by all
maps waiting after reserving slots (ms)=0
12/07/25 18:07:27 INFO mapred.JobClient:     Launched map tasks=4
12/07/25 18:07:27 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
12/07/25 18:07:27 INFO mapred.JobClient:   File Output Format Counters
12/07/25 18:07:27 INFO mapred.JobClient:     Bytes Written=23463783
12/07/25 18:07:27 INFO mapred.JobClient:   FileSystemCounters
12/07/25 18:07:27 INFO mapred.JobClient:     HDFS_BYTES_READ=497
12/07/25 18:07:27 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=128626
12/07/25 18:07:27 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=23463783
12/07/25 18:07:27 INFO mapred.JobClient:   File Input Format Counters
12/07/25 18:07:27 INFO mapred.JobClient:     Bytes Read=0
12/07/25 18:07:27 INFO mapred.JobClient:   Map-Reduce Framework
12/07/25 18:07:27 INFO mapred.JobClient:     Map input records=104306
12/07/25 18:07:27 INFO mapred.JobClient:     Physical memory (bytes)
snapshot=174702592
12/07/25 18:07:27 INFO mapred.JobClient:     Spilled Records=0
12/07/25 18:07:27 INFO mapred.JobClient:     CPU time spent (ms)=12900
12/07/25 18:07:27 INFO mapred.JobClient:     Total committed heap
usage (bytes)=63700992
12/07/25 18:07:27 INFO mapred.JobClient:     Virtual memory (bytes)
snapshot=1507213312
12/07/25 18:07:27 INFO mapred.JobClient:     Map output records=104306
12/07/25 18:07:28 INFO mapred.JobClient:     SPLIT_RAW_BYTES=497
12/07/25 18:07:28 INFO mapreduce.ImportJobBase: Transferred 22.3768 MB
in 267.203 seconds (85.7545 KB/sec)
12/07/25 18:07:28 INFO mapreduce.ImportJobBase: Retrieved 104306 records.
12/07/25 18:07:28 INFO tool.CodeGenTool: Beginning code generation
12/07/25 18:07:33 INFO manager.SqlManager: Executing SQL statement:
SELECT t.* FROM `ka_eventtype` AS t LIMIT 1
12/07/25 18:07:34 INFO orm.CompilationManager: HADOOP_HOME is
/usr/local/hadoop/libexec/..
Note: 
/tmp/sqoop-hduser/compile/9f2b8cc1b14c1ba89f4fe5482735b504/ka_eventtype.java
uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/25 18:07:34 INFO orm.CompilationManager: Writing jar file:
/tmp/sqoop-hduser/compile/9f2b8cc1b14c1ba89f4fe5482735b504/ka_eventtype.jar
12/07/25 18:07:35 INFO mapreduce.ImportJobBase: Beginning import of ka_eventtype
12/07/25 18:07:44 INFO db.DataDrivenDBInputFormat: BoundingValsQuery:
SELECT MIN(`EventTypeID`), MAX(`EventTypeID`) FROM `ka_eventtype`
12/07/25 18:07:45 INFO mapred.JobClient: Running job: job_201207251610_0002
12/07/25 18:07:46 INFO mapred.JobClient:  map 0% reduce 0%
12/07/25 18:08:11 INFO mapred.JobClient:  map 100% reduce 0%
12/07/25 18:08:16 INFO mapred.JobClient: Job complete: job_201207251610_0002
12/07/25 18:08:16 INFO mapred.JobClient: Counters: 17
12/07/25 18:08:16 INFO mapred.JobClient:   Job Counters
12/07/25 18:08:16 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=23530
12/07/25 18:08:16 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
12/07/25 18:08:16 INFO mapred.JobClient:     Total time spent by all
maps waiting after reserving slots (ms)=0
12/07/25 18:08:16 INFO mapred.JobClient:     Launched map tasks=1
12/07/25 18:08:16 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
12/07/25 18:08:16 INFO mapred.JobClient:   File Output Format Counters
12/07/25 18:08:16 INFO mapred.JobClient:     Bytes Written=0
12/07/25 18:08:16 INFO mapred.JobClient:   FileSystemCounters
12/07/25 18:08:16 INFO mapred.JobClient:     HDFS_BYTES_READ=123
12/07/25 18:08:16 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=32155
12/07/25 18:08:16 INFO mapred.JobClient:   File Input Format Counters
12/07/25 18:08:16 INFO mapred.JobClient:     Bytes Read=0
12/07/25 18:08:16 INFO mapred.JobClient:   Map-Reduce Framework
12/07/25 18:08:16 INFO mapred.JobClient:     Map input records=0
12/07/25 18:08:16 INFO mapred.JobClient:     Physical memory (bytes)
snapshot=39432192
12/07/25 18:08:16 INFO mapred.JobClient:     Spilled Records=0
12/07/25 18:08:16 INFO mapred.JobClient:     CPU time spent (ms)=360
12/07/25 18:08:16 INFO mapred.JobClient:     Total committed heap
usage (bytes)=15925248
12/07/25 18:08:16 INFO mapred.JobClient:     Virtual memory (bytes)
snapshot=373886976
12/07/25 18:08:16 INFO mapred.JobClient:     Map output records=0
12/07/25 18:08:16 INFO mapred.JobClient:     SPLIT_RAW_BYTES=123
12/07/25 18:08:16 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
39.2903 seconds (0 bytes/sec)
12/07/25 18:08:16 INFO mapreduce.ImportJobBase: Retrieved 0 records.
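
A quick way to verify where the rows went: by default, import-all-tables
writes each table to its own directory under the importing user's HDFS
home, so (assuming those defaults, nothing cluster-specific) the
ka_eventlog files can be checked with

    hadoop fs -ls ka_eventlog
    hadoop fs -cat ka_eventlog/part-m-00000 | head

To retry just the empty table with a password prompt instead of
--password (as the warning above suggests), a sketch along these lines
should work, with --verbose adding debug output:

    sqoop import --connect 'jdbc:mysql://<IP>/Event' --username username -P \
        --table ka_eventtype --verbose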

Regards
Prabhjot


On 7/25/12, Cheolsoo Park <[email protected]> wrote:
> Hi Prabhjot,
>
> This is a known incompatibility issue between MS connector and Sqoop 1.4.1.
> In short, the problem is that the MS connector is compiled against Sqoop 1.3.x
> while you're using it against 1.4.1 at runtime. This issue is resolved by
> SQOOP-480 <https://issues.apache.org/jira/browse/SQOOP-480>, but the fix is
> not in Sqoop 1.4.1.
>
> There are two options that you can choose from:
> 1) Wait for the 1.4.2 release - This is coming soon.
> 2) Download the source tarball, apply the patch, and rebuild the Sqoop jar
> by yourself.
>
> #2 may not sound friendly, but it's not too hard. Regarding how to build
> Sqoop, you can refer to this wiki page:
> https://cwiki.apache.org/confluence/display/SQOOP/Setting+up+Development+Environment
>
> Thanks,
> Cheolsoo
>
> On Tue, Jul 24, 2012 at 11:12 PM, iwannaplay games <
> [email protected]> wrote:
>
>> Hi,
>>
>> I am using Sqoop 1.4.1, and after successfully installing and adding the
>> SQL Server connector, I am able to get the list of databases from a
>> server, but I am not able to import tables.
>>
>> Please see
>> [hduser@master bin]$ sqoop import-all-tables --connect
>> 'jdbc:sqlserver://<IP>;username=dev;password=d3v;database=Content'
>>
>> Warning: $HADOOP_HOME is deprecated.
>>
>> 12/07/25 16:52:40 INFO SqlServer.MSSQLServerManagerFactory: Using
>> Microsoft's SQL Server - Hadoop Connector
>> 12/07/25 16:52:40 INFO manager.SqlManager: Using default fetchSize of
>> 1000
>> 12/07/25 16:52:46 INFO tool.CodeGenTool: Beginning code generation
>> 12/07/25 16:52:46 INFO manager.SqlManager: Executing SQL statement:
>> SELECT TOP 1 * FROM [Batting]
>> 12/07/25 16:52:47 INFO manager.SqlManager: Executing SQL statement:
>> SELECT TOP 1 * FROM [Batting]
>> 12/07/25 16:52:48 INFO orm.CompilationManager: HADOOP_HOME is
>> /usr/local/hadoop/libexec/..
>> Note:
>> /tmp/sqoop-hduser/compile/e80807623f377f6d06f789e49d370a6c/Batting.java
>> uses or overrides a deprecated API.
>> Note: Recompile with -Xlint:deprecation for details.
>> 12/07/25 16:52:48 INFO orm.CompilationManager: Writing jar file:
>> /tmp/sqoop-hduser/compile/e80807623f377f6d06f789e49d370a6c/Batting.jar
>> Exception in thread "main" java.lang.NoSuchMethodError:
>>
>> com.cloudera.sqoop.manager.ImportJobContext.setConnManager(Lcom/cloudera/sqoop/manager/ConnManager;)V
>>         at
>> com.microsoft.sqoop.SqlServer.MSSQLServerManager.importTable(MSSQLServerManager.java:142)
>>         at
>> org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:380)
>>         at
>> org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:64)
>>         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>>         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
>>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
>>
>> Does anybody know the solution?
>>
>> Regards
>> Prabhjot
>>
>
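
For option 2 above, a rough sketch of the rebuild, assuming the ant-based
build described on the wiki page Cheolsoo linked and that the SQOOP-480
patch applies at the source root (the exact patch level and ant target
may differ; the wiki page is authoritative):

    tar xzf sqoop-1.4.1.tar.gz
    cd sqoop-1.4.1
    patch -p0 < SQOOP-480.patch    # patch file downloaded from the JIRA issue
    ant jar                        # rebuilds the Sqoop jar under build/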
