Hi Madhanmohan,
would you mind sharing the entire exception that you are getting, including
the stack trace? Please note that you might need to enable verbose mode on the
client side in order to see the entire exception stack trace. You can do that
with the following command:

  set option --name verbose --value true
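
For example, a typical client session might look roughly like this (the prompt
shown is illustrative and the exact output may differ in your build):

  ./bin/sqoop.sh client
  sqoop:000> set option --name verbose --value true
  sqoop:000> show version --all

Re-running the failing command with verbose enabled should print the full
stack trace instead of the one-line summary.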

Jarcec

On Tue, Jul 09, 2013 at 07:26:28PM +0530, Madhanmohan Savadamuthu wrote:
> After updating the configuration for the Job Tracker, I am getting a
> "connection refused" error when I submit a job.
> 
> 1) I created a connection.
> 2) I created an import job.
> 3) I tried to submit the job and got a "connection refused" error message.
> 
> Is it due to an invalid Job Tracker port number in the configuration?
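> 
> For reference, the client commands behind those three steps were roughly the
> following (the connector, connection and job ids shown are illustrative and
> may differ on my setup):
> 
>   create connection --cid 1
>   create job --xid 1 --type import
>   submission start --jid 1
> 
> It is the last step that fails with "connection refused".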
> 
> Regards,
> Madhan
> On Sat, Jul 6, 2013 at 12:34 AM, Mengwei Ding <[email protected]> wrote:
> 
> > Hi Madhan,
> >
> > It's really great to hear that the problem is resolved. Enjoy Sqoop2. If you
> > have further questions, I will be more than happy to answer them.
> >
> > Have a nice day!
> >
> > Best,
> > Mengwei
> >
> >
> > On Fri, Jul 5, 2013 at 11:42 AM, Madhanmohan Savadamuthu <
> > [email protected]> wrote:
> >
> >> Mengwei,
> >>
> >> The issue is solved. The MapReduce configuration file had an invalid value
> >> and was causing the problem.
> >>
> >> *File Name:* /etc/hadoop/conf/mapred-site.xml
> >> *Parameter Name:* mapred.job.tracker
> >> *Original Value: *neededForHive:999999
> >> *Modified Value: *<machinename>:9001
> >>
> >> After this change, things are working fine now. Note that I have followed the
> >> suggestions provided by Mengwei in this thread.
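> >>
> >> For clarity, the corrected entry in mapred-site.xml looks roughly like this
> >> (the value below is a placeholder for the actual JobTracker host and port):
> >>
> >>   <property>
> >>     <name>mapred.job.tracker</name>
> >>     <value>machinename:9001</value>
> >>   </property>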
> >>
> >> Regards,
> >> Madhan
> >>
> >>  On Thu, Jul 4, 2013 at 11:08 PM, Mengwei Ding <[email protected]
> >> > wrote:
> >>
> >>> Ok, Madhan, why not.
> >>>
> >>> Could you kindly share your availability and preferred communication
> >>> tools? I will be more than happy to help you out with this.
> >>>
> >>> Best,
> >>> Mengwei
> >>>
> >>>
> >>> On Thu, Jul 4, 2013 at 1:15 AM, Madhanmohan Savadamuthu <
> >>> [email protected]> wrote:
> >>>
> >>>> Even after making the changes in catalina.properties, the same issue occurs.
> >>>>
> >>>> Is there any possibility of an interactive discussion on this issue?
> >>>>
> >>>> Regards,
> >>>> Madhan
> >>>>
> >>>>  On Wed, Jul 3, 2013 at 11:05 PM, Mengwei Ding <
> >>>> [email protected]> wrote:
> >>>>
> >>>>> Thank you for your prompt response, sir. Please don't worry, I can
> >>>>> help you out with this until your problem is resolved.
> >>>>>
> >>>>> Well, let's try out our new method of adding dependency jar files, and
> >>>>> forget about the addtowar.sh script.
> >>>>>
> >>>>> Please follow these instructions:
> >>>>>
> >>>>>  "
> >>>>> Installing Dependencies
> >>>>>
> >>>>> Hadoop libraries must be available on the node where you are planning to
> >>>>> run the Sqoop server, with proper configuration for the major services -
> >>>>> the NameNode and either the JobTracker or the ResourceManager, depending on
> >>>>> whether you are running Hadoop 1 or 2. There is no need to run any Hadoop
> >>>>> service on the same node as the Sqoop server; only the libraries and
> >>>>> configuration must be available.
> >>>>>
> >>>>> The path to the Hadoop libraries is stored in the file catalina.properties
> >>>>> inside the directory server/conf. You need to change the property called
> >>>>> common.loader to contain all directories with your Hadoop libraries. The
> >>>>> default expected locations are /usr/lib/hadoop and /usr/lib/hadoop/lib/.
> >>>>> Please check out the comments in the file for a further description of how
> >>>>> to configure different locations.
> >>>>>
> >>>>> Lastly, you might need to install JDBC drivers that are not bundled with
> >>>>> Sqoop because of incompatible licenses. You can add any arbitrary Java jar
> >>>>> file to the Sqoop server by copying it into the lib/ directory. You can
> >>>>> create this directory if it does not exist already.
> >>>>> "
> >>>>>
> >>>>> I can give you my configuration as an example. So in my
> >>>>> catalina.properties file, I have the following line:
> >>>>>
> >>>>> *
> >>>>> common.loader=${catalina.base}/lib,${catalina.base}/lib/*.jar,${catalina.home}/lib,${catalina.home}/lib/*.jar,${catalina.home}/../lib/*.jar,/usr/lib/hadoop/client-0.20/*.jar,/home/mengweid/Downloads/mysql-connector-java-5.1.25-bin.jar
> >>>>> *
> >>>>>
> >>>>> The */usr/lib/hadoop/client-0.20/*.jar *entry is used to include all
> >>>>> Hadoop-related jars, and *mysql-connector-java-5.1.25-bin.jar *is
> >>>>> used for the JDBC driver.
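> >>>>>
> >>>>> After editing catalina.properties, the Sqoop2 server has to be restarted to
> >>>>> pick up the new common.loader value; a minimal sketch, assuming the stock
> >>>>> sqoop.sh wrapper and that you are in the Sqoop2 installation directory:
> >>>>>
> >>>>>   ./bin/sqoop.sh server stop
> >>>>>   ./bin/sqoop.sh server start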
> >>>>>
> >>>>> Please try this, and let me know whether it works. Thank you.
> >>>>>
> >>>>> Best,
> >>>>> Mengwei
> >>>>>
> >>>>>
> >>>>> On Wed, Jul 3, 2013 at 9:18 AM, Madhanmohan Savadamuthu <
> >>>>> [email protected]> wrote:
> >>>>>
> >>>>>> I did the deployment as suggested in the thread below, but I am still not
> >>>>>> able to use Sqoop2 successfully. I am attaching the service log for your
> >>>>>> reference.
> >>>>>>
> >>>>>> I made sure that the exact same set of JAR files is in the appropriate
> >>>>>> location and also deleted the sqoop folder before starting the Sqoop server.
> >>>>>>
> >>>>>> *Error Message:*
> >>>>>>  Exception has occurred during processing command
> >>>>>> Exception: com.sun.jersey.api.client.UniformInterfaceException
> >>>>>> Message: GET http://<ipaddress>:12013/sqoop/version returned a
> >>>>>> response status of 404 Not Found
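> >>>>>>
> >>>>>> For what it is worth, the same endpoint can also be checked outside the
> >>>>>> client, for example with curl (host and port exactly as configured here):
> >>>>>>
> >>>>>>   curl http://<ipaddress>:12013/sqoop/version
> >>>>>>
> >>>>>> A 404 there as well would point at the server-side deployment rather than
> >>>>>> the client.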
> >>>>>>
> >>>>>> Regards,
> >>>>>> Madhan
> >>>>>>
> >>>>>>  On Wed, Jul 3, 2013 at 7:30 PM, Mengwei Ding <
> >>>>>> [email protected]> wrote:
> >>>>>>
> >>>>>>> Hi Madhanmohan,
> >>>>>>>
> >>>>>>> Thank you for providing all this detailed information. It helps a lot
> >>>>>>> in diagnosing the problem.
> >>>>>>>
> >>>>>>> First, the addtowar.sh script is not good enough for every situation;
> >>>>>>> we apologize for that. We have already figured out a new way to add
> >>>>>>> dependency libraries, which will come out with the next version of Sqoop2.
> >>>>>>>
> >>>>>>> Currently, it seems like the hadoop-core.jar has not been added. Let me
> >>>>>>> show you all the libraries in my webapps/sqoop/WEB-INF/lib folder; please
> >>>>>>> check below:
> >>>>>>>  avro-1.7.4.jar
> >>>>>>>  commons-cli-1.2.jar
> >>>>>>> commons-configuration-1.6.jar
> >>>>>>> commons-dbcp-1.4.jar
> >>>>>>> commons-lang-2.5.jar
> >>>>>>> commons-logging-1.1.1.jar
> >>>>>>> commons-pool-1.5.4.jar
> >>>>>>> derby-10.8.2.2.jar
> >>>>>>> guava-11.0.2.jar
> >>>>>>> hadoop-auth-2.0.0-cdh4.3.0.jar
> >>>>>>> hadoop-common-2.0.0-cdh4.3.0.jar
> >>>>>>> *hadoop-core-2.0.0-mr1-cdh4.3.0.jar*
> >>>>>>> hadoop-hdfs-2.0.0-cdh4.3.0.jar
> >>>>>>> hadoop-mapreduce-client-app-2.0.0-cdh4.3.0.jar
> >>>>>>> hadoop-mapreduce-client-common-2.0.0-cdh4.3.0.jar
> >>>>>>> hadoop-mapreduce-client-core-2.0.0-cdh4.3.0.jar
> >>>>>>> hadoop-mapreduce-client-jobclient-2.0.0-cdh4.3.0.jar
> >>>>>>> hadoop-yarn-api-2.0.0-cdh4.3.0.jar
> >>>>>>> hadoop-yarn-common-2.0.0-cdh4.3.0.jar
> >>>>>>>  jackson-core-asl-1.8.8.jar
> >>>>>>> jackson-mapper-asl-1.8.8.jar
> >>>>>>> json-simple-1.1.jar
> >>>>>>> log4j-1.2.16.jar
> >>>>>>> mysql-connector-java-5.1.25-bin.jar
> >>>>>>> protobuf-java-2.4.0a.jar
> >>>>>>> slf4j-api-1.6.1.jar
> >>>>>>> slf4j-log4j12-1.6.1.jar
> >>>>>>> sqoop-common-1.99.2.jar
> >>>>>>> sqoop-connector-generic-jdbc-1.99.2.jar
> >>>>>>> sqoop-core-1.99.2.jar
> >>>>>>> sqoop-execution-mapreduce-1.99.2-hadoop200.jar
> >>>>>>> sqoop-repository-derby-1.99.2.jar
> >>>>>>> sqoop-spi-1.99.2.jar
> >>>>>>> sqoop-submission-mapreduce-1.99.2-hadoop200.jar
> >>>>>>>
> >>>>>>> I have the same Hadoop and Sqoop2 installation directories as you, and I
> >>>>>>> am running a pseudo-distributed cluster in a single Ubuntu virtual machine.
> >>>>>>>
> >>>>>>> So now you could try to add the hadoop-core.jar manually and then see
> >>>>>>> whether the Sqoop2 server runs. Please run the following command:
> >>>>>>>
> >>>>>>> *./bin/addtowar.sh
> >>>>>>> -jars 
> >>>>>>> /usr/lib/hadoop-0.20-mapreduce/hadoop-core-2.0.0-mr1-cdh4.3.0.jar
> >>>>>>> *
> >>>>>>>
> >>>>>>> Please find the hadoop-core.jar on your own machine; it should be in a
> >>>>>>> similar place. If you still have problems, please let me know.
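> >>>>>>>
> >>>>>>> If you are not sure where the jar lives on your system, a quick way to
> >>>>>>> locate it is something like:
> >>>>>>>
> >>>>>>>   find /usr/lib -name 'hadoop-core*.jar' 2>/dev/null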
> >>>>>>>
> >>>>>>>
> >>>>>>> The reason why it's better to remove the "sqoop" folder is to clear the
> >>>>>>> cached old servlet: Tomcat does not always re-extract the sqoop.war file
> >>>>>>> immediately after you add a dependency library to it. By removing the
> >>>>>>> sqoop folder, Tomcat is forced to extract sqoop.war again, keeping the
> >>>>>>> sqoop folder up to date. In this way, you can tell whether you have
> >>>>>>> correctly set up the dependency libraries. Does this explanation help?
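> >>>>>>>
> >>>>>>> Concretely, that cleanup plus restart is roughly the following (a sketch;
> >>>>>>> adjust the webapps path to your layout, which elsewhere in this thread
> >>>>>>> appears as server/server/webapps):
> >>>>>>>
> >>>>>>>   ./bin/sqoop.sh server stop
> >>>>>>>   rm -rf server/webapps/sqoop
> >>>>>>>   ./bin/sqoop.sh server start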
> >>>>>>>
> >>>>>>> Best,
> >>>>>>> Mengwei
> >>>>>>>
> >>>>>>>
> >>>>>>> On Tue, Jul 2, 2013 at 9:19 PM, Madhanmohan Savadamuthu <
> >>>>>>> [email protected]> wrote:
> >>>>>>>
> >>>>>>>> Hi  Mengwei,
> >>>>>>>>
> >>>>>>>> The following are the details:
> >>>>>>>>
> >>>>>>>> Hadoop Version: Hadoop 2.0.0-cdh4.2.1
> >>>>>>>> Linux Version: Linux version 2.6.32-358.2.1.el6.x86_64 (
> >>>>>>>> [email protected]) (gcc version 4.4.7
> >>>>>>>> 20120313 (Red Hat 4.4.7-3) (GCC) ) #1 SMP Wed Feb 20 12:17:37 EST 
> >>>>>>>> 2013
> >>>>>>>> Hadoop Installation Location: /usr/lib/hadoop
> >>>>>>>> Sqoop2 Installation Location: /usr/lib/sqoop2
> >>>>>>>> Sqoop2 Dependency Configuration  Command Used: ./bin/addtowar.sh
> >>>>>>>> -hadoop-auto
> >>>>>>>> Files in :
> >>>>>>>>
> >>>>>>>> avro-1.7.3.jar
> >>>>>>>> commons-cli-1.2.jar
> >>>>>>>> commons-configuration-1.6.jar
> >>>>>>>> commons-dbcp-1.4.jar
> >>>>>>>> commons-lang-2.5.jar
> >>>>>>>> commons-logging-1.1.1.jar
> >>>>>>>> commons-pool-1.5.4.jar
> >>>>>>>> derby-10.8.2.2.jar
> >>>>>>>> guava-11.0.2.jar
> >>>>>>>> hadoop-auth-2.0.0-cdh4.2.1.jar
> >>>>>>>> hadoop-common-2.0.0-cdh4.2.1-tests.jar
> >>>>>>>> hadoop-hdfs-2.0.0-cdh4.2.1.jar
> >>>>>>>> hadoop-mapreduce-client-app-2.0.0-cdh4.2.1.jar
> >>>>>>>> hadoop-mapreduce-client-common-2.0.0-cdh4.2.1.jar
> >>>>>>>> hadoop-mapreduce-client-core-2.0.0-cdh4.2.1.jar
> >>>>>>>> hadoop-mapreduce-client-jobclient-2.0.0-cdh4.2.1.jar
> >>>>>>>> hadoop-yarn-api-2.0.0-cdh4.2.1.jar
> >>>>>>>> hadoop-yarn-common-2.0.0-cdh4.2.1.jar
> >>>>>>>> jackson-core-asl-1.8.8.jar
> >>>>>>>> jackson-mapper-asl-1.8.8.jar
> >>>>>>>> json-simple-1.1.jar
> >>>>>>>> log4j-1.2.16.jar
> >>>>>>>> mysql-connector-java-5.1.25-bin.jar
> >>>>>>>> protobuf-java-2.4.0a.jar
> >>>>>>>> slf4j-api-1.6.1.jar
> >>>>>>>> slf4j-log4j12-1.6.1.jar
> >>>>>>>> sqoop-common-1.99.2.jar
> >>>>>>>> sqoop-connector-generic-jdbc-1.99.2.jar
> >>>>>>>> sqoop-core-1.99.2.jar
> >>>>>>>> sqoop-execution-mapreduce-1.99.2-hadoop200.jar
> >>>>>>>> sqoop-repository-derby-1.99.2.jar
> >>>>>>>> sqoop-spi-1.99.2.jar
> >>>>>>>> sqoop-submission-mapreduce-1.99.2-hadoop200.jar
> >>>>>>>>
> >>>>>>>> Can you elaborate more on the deletion of the 'sqoop' folder?
> >>>>>>>>
> >>>>>>>> Regards,
> >>>>>>>> Madhanmohan S
> >>>>>>>>
> >>>>>>>>  On Tue, Jul 2, 2013 at 10:50 PM, Mengwei Ding <
> >>>>>>>> [email protected]> wrote:
> >>>>>>>>
> >>>>>>>>> Hi Madhanmohan,
> >>>>>>>>>
> >>>>>>>>> Thank you for your interest in Sqoop2; it's really great to hear this.
> >>>>>>>>> And thank you for providing details for your question. Let me help you
> >>>>>>>>> out with this.
> >>>>>>>>>
> >>>>>>>>> The main reason for your situation is that the Sqoop servlet has not
> >>>>>>>>> started successfully, so the client gets "connection refused". I have
> >>>>>>>>> gone through your attachments. The reason for the servlet failure is
> >>>>>>>>> that your Hadoop dependency libraries have not been configured
> >>>>>>>>> correctly. Could you kindly answer the following questions, so that I
> >>>>>>>>> can help you further?
> >>>>>>>>>
> >>>>>>>>> 1. What are your Hadoop version and installation location? What is
> >>>>>>>>> your operating system?
> >>>>>>>>> 2. How did you configure the dependency libraries for Sqoop?
> >>>>>>>>> 3. Could you kindly go to
> >>>>>>>>> [sqoop_install_dir]/server/server/webapps/sqoop/WEB-INF/lib and 
> >>>>>>>>> list all
> >>>>>>>>> the jar files?
> >>>>>>>>>
> >>>>>>>>> PS: remember to delete the sqoop folder under
> >>>>>>>>> server/server/webapps every time after you configure the dependency 
> >>>>>>>>> library.
> >>>>>>>>>
> >>>>>>>>> Best,
> >>>>>>>>> Mengwei
> >>>>>>>>>
> >>>>>>>>>
> >>>>>>>>>  On Tue, Jul 2, 2013 at 10:05 AM, Madhanmohan Savadamuthu <
> >>>>>>>>> [email protected]> wrote:
> >>>>>>>>>
> >>>>>>>>>> I have set up Sqoop 1.99.2 as mentioned in the sqoop.apache.org
> >>>>>>>>>> instructions. When I try the show version --all command, the following
> >>>>>>>>>> error appears:
> >>>>>>>>>>
> >>>>>>>>>> Sqoop 1.99.2 revision 3e31b7d3eefb3696d4970704364dea05a9ea2a59
> >>>>>>>>>>   Compiled by homeuser on Mon Apr 15 20:50:13 PDT 2013
> >>>>>>>>>> Exception has occurred during processing command
> >>>>>>>>>> Exception: com.sun.jersey.api.client.ClientHandlerException
> >>>>>>>>>> Message: java.net.ConnectException: Connection refused
> >>>>>>>>>>
> >>>>>>>>>> All log files are attached for reference.
> >>>>>>>>>>
> >>>>>>>>>>
> >>>>>>>>>>
> >>>>>>>>>> --
> >>>>>>>>>> Thanks and Regards,
> >>>>>>>>>> Madhanmohan S
> >>>>>>>>>>
> >>>>>>>>>
> >>>>>>>>>
> >>>>>>>>
> >>>>>>>>
> >>>>>>>> --
> >>>>>>>> Thanks and Regards,
> >>>>>>>> Madhanmohan S
> >>>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>
> >>>>>>
> >>>>>> --
> >>>>>> Thanks and Regards,
> >>>>>> Madhanmohan S
> >>>>>>
> >>>>>
> >>>>>
> >>>>
> >>>>
> >>>> --
> >>>> Thanks and Regards,
> >>>> Madhanmohan S
> >>>>
> >>>
> >>>
> >>
> >>
> >> --
> >> Thanks and Regards,
> >> Madhanmohan S
> >>
> >
> >
> 
> 
> -- 
> Thanks and Regards,
> Madhanmohan S
