Hi Sean,

Please do not hijack other people's threads. If you have an announcement to make or an issue to solve, start a new thread instead.
To your point about adding JDBC drivers to a Sqoop installation: there is clear documentation on how to add them, and it seems to be working out for all other Sqoop users.

Jarcec

On Jul 24, 2014, at 9:04 PM, Sean Franks <[email protected]> wrote:

> FYI: anybody listening in on this string of messages ...
>
> I am SICK AND TIRED OF STRUGGLING TO MAKE SQOOP/poop WORK!!!!!
>
> I've started working on a replacement for sqoop that is non-denominational
> and therefore will have no problem with the driver issues that plague
> everybody from every corner of the world. I had my international
> intellectual-property lawyer girlfriend look into my ability to distribute
> the licenses and easily plug them into an interface, and there will be no
> problem.
>
> So, as soon as I get this poop done, we can say goodbye to sqoop and say
> hello to something new. Which of course will be free.
>
> Sean Franks | (212) 284-8787
> “With the addition of a well structured Big Data ecosystem to the Data
> Highway of an enterprise, Business Intelligence analytics will take a quantum
> leap forward.” – Sean Franks, 2014
>
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:[email protected]] On Behalf Of Jarek Jarcec Cecho
> Sent: Thursday, July 24, 2014 10:58 PM
> To: [email protected]
> Subject: Re: Question about submit patches
>
> Hi Guodong,
>
> I’m glad to hear about your interest in contributing to Sqoop! You should
> definitely submit a patch that is built against the trunk branch.
>
> I’ve run the same command and it worked for me. Could you just try “ant clean
> test” without any other parameters?
>
> Jarcec
>
> On Jul 24, 2014, at 7:39 PM, 王国栋 <[email protected]> wrote:
>
>> Hi guys,
>>
>> I have a patch for this issue:
>> https://issues.apache.org/jira/browse/SQOOP-1368
>>
>> But I don't know which branch I should use to test my patch before
>> submitting it to code review.
>>
>> Currently, I have tested the patch on branch-1.4.2. But when I tried to
>> apply the patch to trunk, I ran into some build issues. It looks like the
>> ivy settings are not right, but I am not good at ivy, so please help.
>>
>> I used this command to build:
>>
>>   ant clean jar -Divy.version=2.2.0 -Dhadoopversion=200
>>
>> The error message looks like this:
>>
>> [ivy:resolve] org.apache.zookeeper#zookeeper;3.4.2 by
>> [org.apache.zookeeper#zookeeper;3.4.3] in [hadoop200]
>> ---------------------------------------------------------------------
>> |                  |            modules            ||   artifacts   |
>> |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>> ---------------------------------------------------------------------
>> |    hadoop200     |  151  |   55  |   0   |   26  ||  125  |   0   |
>> ---------------------------------------------------------------------
>> [ivy:resolve]
>> [ivy:resolve] :: problems summary ::
>> [ivy:resolve] :::: WARNINGS
>> [ivy:resolve] ::::::::::::::::::::::::::::::::::::::::::::::
>> [ivy:resolve] ::          UNRESOLVED DEPENDENCIES         ::
>> [ivy:resolve] ::::::::::::::::::::::::::::::::::::::::::::::
>> [ivy:resolve] :: org.apache.hive#hive-serde;0.11.0:
>> configuration not public in org.apache.hive#hive-serde;0.11.0: 'compile'.
>> It was required from org.apache.hive#hive-metastore;0.11.0 compile
>> [ivy:resolve] :: org.apache.hive#hive-shims;0.11.0:
>> configuration not public in org.apache.hive#hive-shims;0.11.0: 'compile'.
>> It was required from org.apache.hive#hive-cli;0.11.0 compile
>> [ivy:resolve] :: org.apache.hive#hive-common;0.11.0:
>> configuration not public in org.apache.hive#hive-common;0.11.0: 'compile'.
>> It was required from org.apache.hcatalog#hcatalog-core;0.11.0 compile
>> [ivy:resolve] ::::::::::::::::::::::::::::::::::::::::::::::
>> [ivy:resolve]
>> [ivy:resolve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>>
>> BUILD FAILED
>>
>> Guodong
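
The driver setup Jarcec refers to usually amounts to dropping the database vendor's JDBC jar into the Sqoop 1 client's lib directory and pointing the connect string (and, for databases without a built-in connector, the --driver option) at it. A minimal sketch, assuming a Sqoop 1.4.x client install under $SQOOP_HOME; the driver filename, host, database, table, and target directory below are illustrative, not taken from this thread:

  # Make the JDBC driver visible to the Sqoop 1 client
  # (the jar name and version are illustrative; use whatever your vendor ships)
  cp mysql-connector-java-5.1.31-bin.jar $SQOOP_HOME/lib/

  # A plain import; --driver is only needed when Sqoop has no built-in connector
  # for your database and cannot infer the driver class from the JDBC URL
  sqoop import \
    --connect jdbc:mysql://dbhost/sales \
    --username etl -P \
    --table orders \
    --target-dir /data/orders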
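As for the "configuration not public" resolution failures Guodong hit: in my experience they can come from a stale or partially written ivy cache rather than from the build files themselves, so before digging into the ivy settings it may be worth retrying from a clean cache along the lines Jarcec suggested. A sketch, assuming the default ivy cache location (~/.ivy2) has not been overridden for your build:

  # Drop the cached Hive/HCatalog module metadata that ivy is complaining about,
  # then rebuild so ivy re-resolves them from scratch
  rm -rf ~/.ivy2/cache/org.apache.hive ~/.ivy2/cache/org.apache.hcatalog
  ant clean test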
