Hello, thank you for the answer. Is there any guide about building Impala 
itself (not the whole cluster)?

From: Philip Zeyliger [mailto:[email protected]]
Sent: Monday, April 9, 2018 8:44 PM
To: [email protected]
Subject: Re: impala building process

Hi Nikitin,

When Impala builds, it depends on Hadoop. More specifically, it links (in a C++ 
sense) against 'libhdfs.so' and 'libhadoop.so' to enable reading from HDFS 
during query execution. (It also does this for Kudu.) On the frontend, it uses 
Maven to grab dependencies (especially HDFS, Hive, Sentry, HBase, and Kudu) to 
compile the relevant Java code.
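As a rough illustration of the native linkage described above, one can inspect a built impalad binary's shared-library dependencies (the path below assumes a default Impala build tree and is illustrative; adjust for your checkout):

```shell
# List the Hadoop native libraries the impalad binary links against.
# Path is an assumption based on a typical Impala build layout.
ldd be/build/latest/service/impalad | grep -E 'libhdfs|libhadoop'
```

If the grep prints nothing, the binary was likely built without HDFS support or the libraries are being loaded dynamically at runtime instead.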

If you're building Impala from scratch, you'll want to build Impala against the 
same versions of the components as you're running in your cluster.
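A minimal sketch of how version matching is commonly done: Impala's build reads component versions from `bin/impala-config.sh`, and those defaults can usually be overridden with environment variables before sourcing it. The exact variable names and the version numbers below are assumptions — they vary between Impala releases, so check the `impala-config.sh` in your checkout:

```shell
# Sketch, assuming your Impala release honors these overrides
# (variable names differ across releases; verify in bin/impala-config.sh).
export IMPALA_HADOOP_VERSION=2.7.3   # illustrative: match your cluster's Hadoop
export IMPALA_HIVE_VERSION=1.2.1     # illustrative: match your cluster's Hive
source ./bin/impala-config.sh        # picks up the overrides
./buildall.sh -noclean -notests      # then build as usual
```

This keeps the frontend's Maven dependencies and the backend's native linkage in step with the component versions actually deployed on the cluster.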

-- Philip


On Mon, Apr 9, 2018 at 9:53 AM, Nikitin Artem 
<[email protected]> wrote:
Hi all. I’m trying to add Impala to an existing Hadoop cluster. I’m using the 
instructions described at 
https://cwiki.apache.org/confluence/display/IMPALA/Building+Impala but I don’t 
understand why these scripts download Hadoop cluster components (I already 
have a cluster). Please help.
