Hey John, any idea what exactly I should verify in these configurations? Yeah, I'm running locally on my computer.
Thank you.

On Wed, Mar 12, 2014 at 9:38 AM, John Zhao <[email protected]> wrote:

> No, you do not need to manually copy the jar files.
> Usually this happens when you run in MR local mode with YARN. Check your
> Hadoop settings or Sqoop settings to make sure you get the correct job
> tracker.
>
> John.
>
> On Mar 12, 2014, at 9:05 AM, Kleiton Silva <[email protected]>
> wrote:
>
> > Hello my friends,
> >
> > I have a question about Sqoop, and I hope you can help me.
> >
> > I am trying to import a table with two columns from MySQL. When I try
> > to execute the import with the following command:
> >
> > start job --jid 2
> >
> > I get this error:
> >
> > 2014-03-13 12:54:31 PDT: FAILURE_ON_SUBMIT
> > Exception: java.io.FileNotFoundException: File does not exist:
> > hdfs://oak:54310/usr/local/Cellar/hadoop/2.2.0/libexec/share/hadoop/common/lib/guava-11.0.2.jar
> >
> > Commands I had run before this error:
> >
> > hdfs dfs -mkdir /usr/lib/sqoop/lib
> > hdfs dfs -copyFromLocal /usr/lib/sqoop/lib/*.jar /usr/lib/sqoop/lib
> >
> > hdfs dfs -mkdir -p /usr/lib/sqoop/server/webapps/sqoop/WEB-INF/lib
> > hdfs dfs -copyFromLocal
> > /usr/lib/sqoop/server/webapps/sqoop/WEB-INF/lib/*.jar
> > /usr/lib/sqoop/server/webapps/sqoop/WEB-INF/lib
> >
> > Is it really necessary to copy all the jars to HDFS, or is there a
> > smarter solution?
> >
> > Thank you.
> >
> > Kleiton Silva
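
For anyone who lands on this thread later: the job tracker check John mentions usually comes down to the mapreduce.framework.name property, which defaults to "local" in Hadoop 2.2.0 when unset. A minimal sketch of how one might verify it, assuming the Homebrew conf directory implied by the path in the error (adjust to your install):

    # Should show "yarn"; if it shows nothing or "local", jobs run in MR
    # local mode and jar paths can get resolved against fs.defaultFS,
    # producing hdfs:// paths like the one in the FileNotFoundException.
    grep -A1 'mapreduce.framework.name' \
        /usr/local/Cellar/hadoop/2.2.0/libexec/etc/hadoop/mapred-site.xml

    # Confirm the default filesystem matches the hdfs://oak:54310 in the error
    hdfs getconf -confKey fs.defaultFS

If the framework is set to yarn and the Sqoop server picks up that configuration, the manual jar copies above should not be needed, as John says.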
