Thanks Zheng, it worked. :)

Thanks
Rahul


On Tue, Oct 27, 2009 at 2:23 PM, Zheng Shao <[email protected]> wrote:

> Hi Rahul,
>
> I think you are treating the svn directory as HIVE_HOME. If you do "ant
> package", HIVE_HOME should be set to build/dist.
>
> Zheng
>
>
> On Tue, Oct 27, 2009 at 1:19 AM, Rahul Pal <[email protected]> wrote:
>
>> I copied the files (hadoop-0.19.0.tar.gz and hadoop-0.20.0.tar.gz) to the
>> ~/.ant/cache/hadoop/core/sources directory. This time the ant script
>> didn't try to download the files; it unzipped them at
>> $HIVE_HOME/build/hadoopcore and the build was successful.
>>
>> But when I tried to run Hive through $HIVE_HOME/bin/hive it gave the
>> error message "-bash: bin/hive: Permission denied". After changing the
>> permission through chmod, it says "Missing Hive Execution Jar:
>> $HIVE_HOME/lib/hive_exec.jar".
>>
>> After copying the jar with "cp $HIVE_HOME/build/dist/lib/hive_exec.jar
>> lib/", it says "Missing Hive MetaStore Jar".
>>
>> Also, do I need to change the configs in those unzipped hadoop
>> directories (hadoop-0.20.0, hadoop-0.19.0) when I already have a
>> pre-configured hadoop running at a different path?
>>
>> Please provide inputs: is there a straightforward way of integrating
>> Hive with an already present Hadoop?
>>
>> Thanks
>> Rahul
>>
>>
>>
>> On Mon, Oct 26, 2009 at 9:11 PM, Matt Pestritto <[email protected]> wrote:
>>
>>> This also came up in a thread last week.
>>>
>>> Same thing happened to me.
>>>
>>> My temp workaround was:
>>> cd ~/.ant/cache/hadoop/core/sources
>>> wget
>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>
>>> Then rebuild hive.  Ivy will not try to download the source again.
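Generalizing Matt's workaround to every version the build wants (version list taken from Carl's note elsewhere in this thread; a sketch, not something run here):

```shell
# Pre-seed the ivy cache so the Hive build skips the flaky downloads.
seed_ivy_cache() {
  mkdir -p "$HOME/.ant/cache/hadoop/core/sources"
  cd "$HOME/.ant/cache/hadoop/core/sources" || return 1
  for v in 0.17.2.1 0.18.3 0.19.0 0.20.0; do
    # -c resumes a truncated download instead of keeping the short file,
    # which is what produced the size-mismatch errors in the first place
    wget -c "http://archive.apache.org/dist/hadoop/core/hadoop-$v/hadoop-$v.tar.gz"
  done
}
```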
>>>
>>> Thanks
>>> -Matt
>>>
>>>
>>> On Mon, Oct 26, 2009 at 11:09 AM, Edward Capriolo <[email protected]> wrote:
>>>
>>>> On Mon, Oct 26, 2009 at 3:27 AM, Carl Steinbach <[email protected]> wrote:
>>>> > Hi Rahul,
>>>> >
>>>> > Please follow these steps:
>>>> >
>>>> > 1) In your hive source directory run 'ant clean'.
>>>> > 2) remove the contents of ~/.ant/cache/hadoop/core/sources
>>>> > 3) Download the following files to ~/.ant/cache/hadoop/core/sources:
>>>> >       hadoop-0.17.2.1.tar.gz
>>>> >       hadoop-0.17.2.1.tar.gz.asc
>>>> >       hadoop-0.18.3.tar.gz
>>>> >       hadoop-0.18.3.tar.gz.asc
>>>> >       hadoop-0.19.0.tar.gz
>>>> >       hadoop-0.19.0.tar.gz.asc
>>>> >       hadoop-0.20.0.tar.gz
>>>> >       hadoop-0.20.0.tar.gz.asc
>>>> >
>>>> > 4) For each hadoop-xxx.tar.gz file, compute the sha1 checksum using
>>>> > sha1sum, and verify that it matches the sha1 checksum in the
>>>> > corresponding .asc file.
>>>> >
>>>> > If it does not match then the file is corrupt and you need to try
>>>> > downloading it again.
>>>> >
>>>> > 5) Try building Hive again following the instructions on the wiki. You
>>>> > shouldn't have any problems if you verified the checksums.
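Step 4 above can be scripted; this sketch assumes the .asc file contains the hex digest somewhere in its text (if the .asc is a PGP signature instead, `gpg --verify` is the right tool):

```shell
# Compare a tarball's sha1sum against the first 40-hex-digit digest
# found in its companion .asc file.
verify_sha1() {
  tarball=$1
  want=$(grep -oE '[0-9a-f]{40}' "$tarball.asc" | head -n1)
  have=$(sha1sum "$tarball" | awk '{print $1}')
  if [ "$have" = "$want" ]; then
    echo "OK  $tarball"
  else
    echo "MISMATCH $tarball (re-download it)"
    return 1
  fi
}
```

Run from inside ~/.ant/cache/hadoop/core/sources, e.g. `for t in hadoop-*.tar.gz; do verify_sha1 "$t"; done`.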
>>>> >
>>>> > As an additional note, if you don't care about support for Hadoop
>>>> > 0.17.2.1, 0.18, etc., you can disable support for these versions (and
>>>> > skip the download) by removing the references to them in
>>>> > shims/build.xml and shims/ivy.xml.
>>>> >
>>>> >>
>>>> >> Also I want to use hive on top of current working hadoop cluster.
>>>> >> Please provide some inputs.
>>>> >
>>>> > You need to set HADOOP_HOME and make sure that HADOOP_HOME/conf (or
>>>> > HADOOP_CONF_DIR if you have this variable set) contains valid
>>>> > configuration files for your current working hadoop cluster. See the
>>>> > following page for more information:
>>>> > http://wiki.apache.org/hadoop/GettingStartedWithHadoop
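Concretely, Carl's pointers might translate to something like the following sketch (the install paths are illustrative, not from the thread):

```shell
# Sketch: point Hive at an existing Hadoop install rather than the
# tarballs unpacked under build/hadoopcore. Paths are illustrative.
use_existing_hadoop() {
  export HADOOP_HOME=/opt/hadoop-0.20.0        # your real install path
  export HADOOP_CONF_DIR="$HADOOP_HOME/conf"   # only if not the default
  "$HIVE_HOME/bin/hive"                        # jobs now go to that cluster
}
```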
>>>> >
>>>> > Thanks.
>>>> >
>>>> > Carl
>>>> >
>>>> > On Sun, Oct 25, 2009 at 11:13 PM, Rahul Pal <[email protected]> wrote:
>>>> >>
>>>> >> It's still not working, Carl.
>>>> >> First it tries to download hadoop-0.17.2.1.tar.gz and then the
>>>> >> higher versions of hadoop. Somehow the build script is unable to
>>>> >> download the hadoop packages and shows a size-mismatch error
>>>> >> ("Downloaded file size doesn't match expected Content Length for
>>>> >> http://archive.apache.org/dist/hadoop/core/hadoop-0.17.2.1/hadoop-0.17.2.1.tar.gz").
>>>> >
>>>> >
>>>> >>
>>>> >> Also I want to use hive on top of current working hadoop cluster.
>>>> >> Please provide some inputs.
>>>> >
>>>> >
>>>> >>
>>>> >> Thanks & Regards
>>>> >> Rahul
>>>> >>
>>>> >>
>>>> >> On Fri, Oct 23, 2009 at 1:29 PM, Carl Steinbach <[email protected]> wrote:
>>>> >>>
>>>> >>> Hi Rahul,
>>>> >>>
>>>> >>> One solution is to manually download the files to
>>>> >>> ~/.ant/cache/hadoop/core/sources/
>>>> >>>
>>>> >>> This should prevent ivy from trying (and failing) to download them
>>>> >>> itself.
>>>> >>>
>>>> >>> Carl
>>>> >>>
>>>> >>> On Thu, Oct 22, 2009 at 10:41 PM, Rahul Pal <[email protected]> wrote:
>>>> >>>>
>>>> >>>> Hi guys,
>>>> >>>> I am trying to build Hive from the trunk - not sure whether I'll
>>>> >>>> be able to do it or not - because every time I tried, the build
>>>> >>>> process started downloading all versions of hadoop and failed
>>>> >>>> with an error saying that the downloaded size didn't match the
>>>> >>>> expected size...
>>>> >>>> Please provide some input.
>>>> >>>>
>>>> >>>> Thanks & Regards
>>>> >>>> Rahul Pal
>>>> >>>> Software Engineer - Discover
>>>> >>>> One97 Communications (P) Ltd
>>>> >>>> B121, Sector 5, Noida, UP 201301
>>>> >>>>
>>>> >>>> P:  + 91 120 4770770      Extn:312
>>>> >>>> M: + 91 9873005998
>>>> >>>> W: www.one97world.com
>>>> >>>>
>>>> >>>>
>>>> >>>
>>>> >>
>>>> >
>>>> >
>>>>
>>>> All,
>>>>
>>>> On a related note, this is the second person with this checksum issue.
>>>> I am not seeing the problem myself, but I have not updated from trunk
>>>> recently. Does anyone know what the fix, not the workaround, is?
>>>>
>>>> Edward
>>>>
>>>
>>>
>>
>
>
> --
> Yours,
> Zheng
>
