Thanks Jarcec, it seems it was a permission issue for one of the trackers; I
granted access to '%' on the MySQL server and it worked like a charm.
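For the archives, the kind of grant I mean looks roughly like this. It is only a sketch: sqoop_user, secret, and mydb are placeholder names, not my real user or database, and the '%' host wildcard opens the account to any host, so on a real server you may want to restrict it to the cluster's subnet.

```shell
# Placeholder sketch: sqoop_user / secret / mydb are illustrative names.
# '%' allows connections from any host; restrict it if you can.
grant_sql() {
  cat <<'SQL'
GRANT ALL PRIVILEGES ON mydb.* TO 'sqoop_user'@'%' IDENTIFIED BY 'secret';
FLUSH PRIVILEGES;
SQL
}

# Print the statements; pipe them into the client to apply them:
#   grant_sql | mysql -u root -p
grant_sql
```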

Although I am working on a single-node EMR cluster, it seems to use more than
one IP: connections from some IPs succeed while others time out.
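Since the task attempts can come from addresses other than the master's, one rough way to test reachability from every node is a loop like the one below. It is a dry-run sketch: the hostname, port, and node IPs are placeholders, and it only prints the checks; pipe the output to sh (or run each line by hand) to actually test from every TaskTracker.

```shell
# Placeholders throughout: replace MYSQL_HOST/MYSQL_PORT and the node IPs
# with your own values.
MYSQL_HOST=mysql.example.com
MYSQL_PORT=3306

# Emit one connectivity check per cluster node (nc -z just probes the port,
# -w 5 gives up after five seconds).
gen_checks() {
  for node in 10.0.0.1 10.0.0.2; do
    echo "ssh hadoop@${node} 'nc -z -w 5 ${MYSQL_HOST} ${MYSQL_PORT}'"
  done
}

gen_checks
```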

Thanks.


--
Ibrahim


On Tue, Jan 15, 2013 at 1:41 PM, Jarek Jarcec Cecho <[email protected]> wrote:

> Hi Ibrahim,
> based on your current exception, it seems that you have correctly resolved
> the previous exception about the Hadoop incompatibility.
>
> > java.lang.RuntimeException: java.lang.RuntimeException:
> > com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications
> > link failure
>
> This exception is raised when the MySQL JDBC driver can't create a
> connection to your MySQL box. Sqoop requires direct access to your MySQL
> box not only from the node where you're executing Sqoop, but also from all
> TaskTracker nodes in your cluster. Would you mind checking that you can
> connect to your MySQL box from all nodes? I would start by checking that
> the user on the MySQL side is correctly defined. Additional information
> can be found in the Sqoop troubleshooting guide [1].
>
> Jarcec
>
> Links:
> 1:
> http://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_mysql_connection_failure
>
> On Tue, Jan 15, 2013 at 01:27:14PM +0300, Ibrahim Yakti wrote:
> > Hello Jarek,
> >
> > I have tried that and I got the following error:
> >
> > 13/01/15 09:49:17 INFO mapred.JobClient: Task Id :
> > attempt_201212160928_0044_m_000000_0, Status : FAILED
> > java.lang.RuntimeException: java.lang.RuntimeException:
> > com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications
> > link failure
> >
> > The last packet sent successfully to the server was 0 milliseconds ago.
> The
> > driver has not received any packets from the server.
> >         at
> >
> org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
> >         at
> > org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
> >         at
> >
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
> >         at
> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:730)
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:375)
> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1132)
> >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > Caused by: java.lang.RuntimeException:
> > com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications
> > link failure
> >
> >
> >
> > When I checked the processlist on the MySQL server, I got the following:
> >
> > >      Id: 10006
> > >    User: DB_USER
> > >    Host: DB_HOST
> > >      db: DB
> > > Command: Sleep
> > >    Time: 10
> > >   State:
> > >    Info: NULL
> >
> >
> >
> >
> > I have removed all the old Sqoop files and downloaded a fresh copy, then
> > built it with "ant clean package -Dhadoopversion=100".
> >
> > Moreover, I have tried to compile it with -Dhadoopversion=20 and
> > -Dhadoopversion=23, still getting errors with no success.
> >
> >
> >
> > --
> > Ibrahim
> >
> >
> > On Tue, Jan 15, 2013 at 9:07 AM, Jarek Jarcec Cecho <[email protected]> wrote:
> >
> > > Hi Ibrahim,
> > > how did you compile Sqoop? Would you mind trying the following
> > > command?
> > >
> > > ant clean package -Dhadoopversion=100
> > >
> > > * clean - remove any previous compilation outputs; this ensures that
> > > no files compiled for a different Hadoop version are left behind
> > > * package - compile and create a package in the build/ directory
> > > * -Dhadoopversion=100 - compile all files for Hadoop 1.0.x
> > >
> > > Jarcec
> > >
> > > On Mon, Jan 14, 2013 at 06:38:36PM +0300, Ibrahim Yakti wrote:
> > > > Hello,
> > > >
> > > > I am trying to install Sqoop on an EMR instance (Hadoop 1.0.3). I
> > > > tried to compile it from source (1.4.2) but I am getting this error:
> > > >
> > > > *Exception in thread "main" java.lang.IncompatibleClassChangeError:
> Found
> > > > class org.apache.hadoop.mapreduce.JobContext, but interface was
> expected*
> > > >
> > > > I followed the steps on the FAQ page (
> > > > https://cwiki.apache.org/confluence/display/SQOOP/FAQ)
> > > > but I still get the same error.
> > > >
> > > > I tried other versions as well, all with the same error.
> > > >
> > > > the only tutorial I was able to find was this:
> > > > http://blog.kylemulka.com/2012/04/how-to-install-sqoop-on-amazon-elastic-map-reduce-emr/
> > > > but as I said, I am not using S3 and I need to run it on the same server.
> > > >
> > > >
> > > > Any suggestion how to install it on Amazon EMR?
> > > >
> > > > Thanks.
> > > >
> > > > --
> > > > Ibrahim
> > >
>
