Is it necessary to run Spark and MapReduce jobs from Oozie?
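For what it's worth, Oozie is not strictly required just to run jobs. Assuming a typical HDP 2.3 client layout under /usr/hdp/current (the exact directory names and example jar paths below are assumptions; verify them on your own cluster), jobs can be submitted directly from the command line:

```shell
# Typical HDP client locations (assumptions; check with `ls /usr/hdp/current`)
SPARK_HOME=/usr/hdp/current/spark-client
HADOOP_MR_EXAMPLES=/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar

# On a cluster node, a Spark job is submitted with spark-submit, e.g. the
# bundled SparkPi example on YARN:
#   $SPARK_HOME/bin/spark-submit --class org.apache.spark.examples.SparkPi \
#       --master yarn-client $SPARK_HOME/lib/spark-examples-*.jar 10
#
# A MapReduce job is submitted with `hadoop jar`, e.g. the wordcount example:
#   hadoop jar $HADOOP_MR_EXAMPLES wordcount /tmp/input /tmp/output

echo "spark-submit lives under: $SPARK_HOME/bin"
```

Oozie becomes useful when you need to schedule these jobs or chain them into workflows; for one-off runs, the commands above are enough.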

On Fri, Aug 28, 2015 at 11:23 AM, Jeetendra G <[email protected]>
wrote:

> When I go to Admin -> Versions, I see HDP-2.3.0.0-2557.
>
> On Thu, Aug 27, 2015 at 11:36 PM, Alejandro Fernandez <
> [email protected]> wrote:
>
>> Hi Jeetendra,
>>
>> What version of Ambari and HDP are you running?
>> You just need to install Oozie server on any host, and pick the hosts for
>> the clients.
>> In HDP 2.3, it's possible to have multiple Oozie servers for High
>> Availability.
>>
>> HDP binaries are in /usr/hdp/current/spark-server/bin
>> Note that /usr/hdp/current/spark-server is a symlink to
>> /usr/hdp/2.#.#.#-####/spark
>>
>> Thanks,
>> Alejandro
>>
>> From: Jeetendra G <[email protected]>
>> Reply-To: "[email protected]" <[email protected]>
>> Date: Thursday, August 27, 2015 at 4:06 AM
>> To: "[email protected]" <[email protected]>
>> Subject: Running spark and map reduce jobs
>>
>> Hi all, I have installed Ambari, and with Ambari I have installed
>> Hadoop, Spark, Hive, and Oozie.
>> When I was installing Oozie, it asked me where in my cluster I need
>> Oozie, i.e., on how many nodes.
>> I don't really understand why it asks which nodes to install Oozie on;
>> shouldn't it just be installed on a single node?
>>
>>
>> Also, how can I run my MapReduce and Spark jobs?
>>
>> Where does Ambari install the binaries of the installed packages? In /bin?
>>
>>
>> Regards
>> Jeetendra
>>
>
>
