Thanks Jarcec. I downloaded the binary artifact and it's working now.
I had actually built my previous Sqoop binary from source using the
option -Dhadoopversion=100 (since my Hadoop version is 1.0.3) after
reading some blogs, so I'm not sure why it was still throwing that
exception.
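For reference, the build I did was roughly along these lines (quoting from
memory, so the exact ant invocation may have differed slightly):

    # build Sqoop 1.4.2 against Hadoop 1.x from the source tree
    ant package -Dhadoopversion=100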
Sarath.
On Thursday 25 October 2012 09:35 PM, Jarek Jarcec Cecho wrote:
Hi Sarath,
this exception typically shows up when incompatible Hadoop binaries and
applications are mixed together (for example, Sqoop compiled for Hadoop 2
running on Hadoop 1). Would you mind checking that you've downloaded the
appropriate binary distribution for your cluster? For Hadoop 1.0.3 you need
the binary artifact sqoop-1.4.2.bin__hadoop-1.0.0.tar.gz.
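Something along these lines should get you the right build (the mirror URL
below is just an example, any Apache mirror will do):

    # grab and unpack the Hadoop 1.x build of Sqoop 1.4.2
    wget http://archive.apache.org/dist/sqoop/1.4.2/sqoop-1.4.2.bin__hadoop-1.0.0.tar.gz
    tar -xzf sqoop-1.4.2.bin__hadoop-1.0.0.tar.gz
    export SQOOP_HOME=$PWD/sqoop-1.4.2.bin__hadoop-1.0.0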
Jarcec
On Thu, Oct 25, 2012 at 07:23:49PM +0530, Sarath wrote:
Hi,
I'm new to Sqoop. I'm running Sqoop 1.4.2 with Hadoop 1.0.3, and I have
both the Hadoop and Sqoop home environment variables set.
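My environment is set up roughly like this (the install paths below are
placeholders, not my actual locations):

    export HADOOP_HOME=/usr/local/hadoop-1.0.3
    export SQOOP_HOME=/usr/local/sqoop-1.4.2
    export PATH=$PATH:$HADOOP_HOME/bin:$SQOOP_HOME/bin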
I'm trying to export a file on HDFS to a table in an Oracle database. I
put all the required parameters inside an options file and then ran:

    sqoop --options-file export_params
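The options file has one option or value per line, roughly like the sketch
below (the connection details, table name and HDFS path here are
illustrative, not my real values):

    export
    --connect
    jdbc:oracle:thin:@//dbhost:1521/ORCL
    --username
    scott
    --password
    secret
    --table
    MY_TABLE
    --export-dir
    /user/sarath/data/my_table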
I got the below exception:

    Exception in thread "main" java.lang.IncompatibleClassChangeError:
    Found class org.apache.hadoop.mapreduce.JobContext, but interface was expected
        at org.apache.sqoop.mapreduce.ExportOutputFormat.checkOutputSpecs(ExportOutputFormat.java:57)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:887)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
        at java.security.AccessController.doPrivileged(Native Method)
        ...
Is there anything more to be configured?
Regards,
Sarath.