Re: impala building process

2018-04-12 Thread Fredy Wijaya
bootstrap_development.sh is meant for doing Impala development. If you just
want to build Impala, you can simply run bootstrap_build.sh (
https://github.com/apache/impala/blob/master/bin/bootstrap_build.sh).
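
A minimal sketch of what that looks like from a fresh checkout (the script
installs the build dependencies itself, so the exact steps it performs vary
by OS; treat this as an illustration rather than the authoritative
procedure):

    # Clone the Impala source and run the build-only bootstrap script.
    # Per the above: bootstrap_development.sh sets up a development
    # environment, while bootstrap_build.sh just builds Impala.
    git clone https://github.com/apache/impala.git
    cd impala
    ./bin/bootstrap_build.sh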

*Fredy Wijaya* | Software Engineer
e. fwij...@cloudera.com
cloudera.com <https://www.cloudera.com>


On Thu, Apr 12, 2018 at 3:53 PM, Philip Zeyliger 
wrote:

> I think https://cwiki.apache.org/confluence/display/IMPALA/Building+Impala,
> which you've already seen, is the main guide.
>
> -- Philip
>
> On Tue, Apr 10, 2018 at 3:01 AM, Nikitin Artem 
> wrote:
>
>> Hello, thank you for the answer. Is there any guide about building
>> Impala (not the whole cluster)?
>>
>>
>>
>> *From:* Philip Zeyliger [mailto:phi...@cloudera.com]
>> *Sent:* Monday, April 9, 2018 8:44 PM
>> *To:* user@impala.apache.org
>> *Subject:* Re: impala building process
>>
>>
>>
>> Hi Nikitin,
>>
>>
>>
>> When Impala builds, it depends on Hadoop. More specifically, it links (in
>> a C++ sense) against 'libhdfs.so' and 'libhadoop.so' to enable reading from
>> HDFS during query execution. (It also does this for Kudu.) On the frontend,
>> it uses Maven to grab dependencies (especially HDFS, Hive, Sentry, HBase,
>> and Kudu) to compile the relevant Java code.
>>
>>
>>
>> If you're building Impala from scratch, you'll want to build Impala
>> against the same versions of the components as you're running in your
>> cluster.
>>
>>
>>
>> -- Philip
>>
>>
>>
>>
>>
>> On Mon, Apr 9, 2018 at 9:53 AM, Nikitin Artem 
>> wrote:
>>
>> Hi all. I’m trying to add Impala to an existing Hadoop cluster. I’m using
>> the instructions described at
>> https://cwiki.apache.org/confluence/display/IMPALA/Building+Impala
>> but I don’t understand why these scripts download Hadoop cluster
>> components (I already have a cluster). Please help.
>>
>>
>>
>
>


Re: impala building process

2018-04-12 Thread Philip Zeyliger
I think https://cwiki.apache.org/confluence/display/IMPALA/Building+Impala,
which you've already seen, is the main guide.

-- Philip

On Tue, Apr 10, 2018 at 3:01 AM, Nikitin Artem 
wrote:

> Hello, thank you for the answer. Is there any guide about building
> Impala (not the whole cluster)?
>
>
>
> *From:* Philip Zeyliger [mailto:phi...@cloudera.com]
> *Sent:* Monday, April 9, 2018 8:44 PM
> *To:* user@impala.apache.org
> *Subject:* Re: impala building process
>
>
>
> Hi Nikitin,
>
>
>
> When Impala builds, it depends on Hadoop. More specifically, it links (in
> a C++ sense) against 'libhdfs.so' and 'libhadoop.so' to enable reading from
> HDFS during query execution. (It also does this for Kudu.) On the frontend,
> it uses Maven to grab dependencies (especially HDFS, Hive, Sentry, HBase,
> and Kudu) to compile the relevant Java code.
>
>
>
> If you're building Impala from scratch, you'll want to build Impala
> against the same versions of the components as you're running in your
> cluster.
>
>
>
> -- Philip
>
>
>
>
>
> On Mon, Apr 9, 2018 at 9:53 AM, Nikitin Artem 
> wrote:
>
> Hi all. I’m trying to add Impala to an existing Hadoop cluster. I’m using
> the instructions described at
> https://cwiki.apache.org/confluence/display/IMPALA/Building+Impala
> but I don’t understand why these scripts download Hadoop cluster
> components (I already have a cluster). Please help.
>
>
>


RE: impala building process

2018-04-10 Thread Nikitin Artem
Hello, thank you for the answer. Is there any guide about building Impala
(not the whole cluster)?

From: Philip Zeyliger [mailto:phi...@cloudera.com]
Sent: Monday, April 9, 2018 8:44 PM
To: user@impala.apache.org
Subject: Re: impala building process

Hi Nikitin,

When Impala builds, it depends on Hadoop. More specifically, it links (in a C++ 
sense) against 'libhdfs.so' and 'libhadoop.so' to enable reading from HDFS 
during query execution. (It also does this for Kudu.) On the frontend, it uses 
Maven to grab dependencies (especially HDFS, Hive, Sentry, HBase, and Kudu) to 
compile the relevant Java code.

If you're building Impala from scratch, you'll want to build Impala against the 
same versions of the components as you're running in your cluster.

-- Philip


On Mon, Apr 9, 2018 at 9:53 AM, Nikitin Artem <artem.niki...@advlab.io>
wrote:
Hi all. I’m trying to add Impala to an existing Hadoop cluster. I’m using the
instructions described at
https://cwiki.apache.org/confluence/display/IMPALA/Building+Impala
but I don’t understand why these scripts download Hadoop cluster components
(I already have a cluster). Please help.



Re: impala building process

2018-04-09 Thread Philip Zeyliger
Hi Nikitin,

When Impala builds, it depends on Hadoop. More specifically, it links (in a
C++ sense) against 'libhdfs.so' and 'libhadoop.so' to enable reading from
HDFS during query execution. (It also does this for Kudu.) On the frontend,
it uses Maven to grab dependencies (especially HDFS, Hive, Sentry, HBase,
and Kudu) to compile the relevant Java code.

If you're building Impala from scratch, you'll want to build Impala against
the same versions of the components as you're running in your cluster.
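
As a rough sketch of how to check that, you can compare what the cluster is
running against the versions the Impala build is pinned to in
bin/impala-config.sh (the grep below is illustrative; the exact variable
names may differ between Impala branches):

    # On a cluster node: report the deployed component versions.
    hadoop version
    hive --version
    # In the Impala checkout: list the versions the build will use.
    grep -E 'IMPALA_(HADOOP|HIVE|HBASE|SENTRY|KUDU)_VERSION' bin/impala-config.sh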

-- Philip


On Mon, Apr 9, 2018 at 9:53 AM, Nikitin Artem 
wrote:

> Hi all. I’m trying to add Impala to an existing Hadoop cluster. I’m using
> the instructions described at
> https://cwiki.apache.org/confluence/display/IMPALA/Building+Impala
> but I don’t understand why these scripts download Hadoop cluster
> components (I already have a cluster). Please help.
>