On Mon, Oct 26, 2009 at 3:27 AM, Carl Steinbach <[email protected]> wrote:
> Hi Rahul,
>
> Please follow these steps:
>
> 1) In your Hive source directory, run 'ant clean'.
> 2) Remove the contents of ~/.ant/cache/hadoop/core/sources.
> 3) Download the following files to ~/.ant/cache/hadoop/core/sources:
>    hadoop-0.17.2.1.tar.gz
>    hadoop-0.17.2.1.tar.gz.asc
>    hadoop-0.18.3.tar.gz
>    hadoop-0.18.3.tar.gz.asc
>    hadoop-0.19.0.tar.gz
>    hadoop-0.19.0.tar.gz.asc
>    hadoop-0.20.0.tar.gz
>    hadoop-0.20.0.tar.gz.asc
> 4) For each hadoop-xxx.tar.gz file, compute the SHA-1 checksum using sha1sum
>    and verify that it matches the SHA-1 checksum in the corresponding .asc file.
>    If it does not match, the file is corrupt and you need to download it again.
> 5) Try building Hive again following the instructions on the wiki. You
>    shouldn't have any problems if you verified the checksums.
>
> As an additional note, if you don't care about support for Hadoop 0.17.2.1,
> 0.18, etc., you can disable support for these versions (and skip the
> download) by removing the references to them in shims/build.xml and
> shims/ivy.xml.
>
>> Also I want to use Hive on top of my current working Hadoop cluster.
>> Please provide some inputs.
>
> You need to set HADOOP_HOME and make sure that HADOOP_HOME/conf (or
> HADOOP_CONF_DIR, if you have that variable set) contains valid configuration
> files for your current working Hadoop cluster. See the following page for
> more information: http://wiki.apache.org/hadoop/GettingStartedWithHadoop
>
> Thanks.
>
> Carl
>
> On Sun, Oct 25, 2009 at 11:13 PM, Rahul Pal <[email protected]> wrote:
>> It's still not working, Carl.
>> First it tries to download hadoop-0.17.2.1.tar.gz and then the later
>> versions of Hadoop. Somehow the build script is unable to download the
>> Hadoop packages and shows a size-mismatch error ("Downloaded file size
>> doesn't match expected Content Length for
>> http://archive.apache.org/dist/hadoop/core/hadoop-0.17.2.1/hadoop-0.17.2.1.tar.gz.").
>>
>> Also I want to use Hive on top of my current working Hadoop cluster.
>> Please provide some inputs.
>>
>> Thanks & Regards
>> Rahul
>>
>> On Fri, Oct 23, 2009 at 1:29 PM, Carl Steinbach <[email protected]> wrote:
>>> Hi Rahul,
>>>
>>> One solution is to manually download the files to
>>> ~/.ant/cache/hadoop/core/sources/
>>>
>>> This should prevent Ivy from trying (and failing) to download them
>>> itself.
>>>
>>> Carl
>>>
>>> On Thu, Oct 22, 2009 at 10:41 PM, Rahul Pal <[email protected]> wrote:
>>>> Hi guys,
>>>> I am trying to build Hive from trunk, but I'm not sure whether I'll be
>>>> able to: every time I try, the build process starts downloading all
>>>> versions of Hadoop and fails with an error saying that the downloaded
>>>> size didn't match the expected size...
>>>> Please provide some input.
>>>>
>>>> Thanks & Regards
>>>> Rahul Pal
>>>> Software Engineer - Discover
>>>> One97 Communications (P) Ltd
>>>> B121, Sector 5, Noida, UP 201301
>>>>
>>>> P: +91 120 4770770 Extn: 312
>>>> M: +91 9873005998
>>>> W: www.one97world.com
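For reference, the workaround above boils down to a short shell session. This is a minimal sketch, not a tested script: the cache path, Hadoop versions, and archive URL come from Carl's message, while /path/to/hive, /path/to/hadoop, the use of wget, and the 'ant package' target are stand-ins for your actual checkout, cluster install, download tool, and whatever build target the wiki describes. The exact layout of the .asc files isn't shown in the thread, so the checksum comparison at the end is left as a by-eye check.

#!/usr/bin/env bash
# Minimal sketch of the manual workaround described in Carl's message.
# Adjust HADOOP_VERSIONS and MIRROR if your shims/ivy.xml references
# different releases.
set -e

CACHE_DIR="$HOME/.ant/cache/hadoop/core/sources"
MIRROR="http://archive.apache.org/dist/hadoop/core"
HADOOP_VERSIONS="0.17.2.1 0.18.3 0.19.0 0.20.0"

# 1) Clean the previous build and the partially downloaded tarballs.
cd /path/to/hive                  # hypothetical: your Hive trunk checkout
ant clean
mkdir -p "$CACHE_DIR"
rm -f "$CACHE_DIR"/*

# 2) Download each tarball and its .asc file by hand, bypassing Ivy.
for v in $HADOOP_VERSIONS; do
  for f in "hadoop-$v.tar.gz" "hadoop-$v.tar.gz.asc"; do
    wget -O "$CACHE_DIR/$f" "$MIRROR/hadoop-$v/$f"
  done
done

# 3) Print the local SHA-1 next to the published file so a corrupt
#    download is obvious; compare the two by eye.
for v in $HADOOP_VERSIONS; do
  echo "=== hadoop-$v.tar.gz ==="
  sha1sum "$CACHE_DIR/hadoop-$v.tar.gz"
  cat "$CACHE_DIR/hadoop-$v.tar.gz.asc"
done

# 4) Point Hive at the running cluster and rebuild.
export HADOOP_HOME=/path/to/hadoop        # hypothetical: your cluster install
# export HADOOP_CONF_DIR=/path/to/hadoop/conf   # only if conf lives elsewhere
ant package                               # or whatever target the wiki describes

If you only ever build against a single Hadoop version, removing the other versions from shims/build.xml and shims/ivy.xml (as Carl notes) avoids the extra downloads entirely.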
All,

On a related note, this is the second person with this checksum issue. I am not seeing the problem myself, but I have not updated from trunk recently. Does anyone know what the fix (not the workaround) is?

Edward
