> Did you check the docs?
> https://spark.apache.org/docs/latest/spark-standalone.html
>
> On Mon, Nov 8, 2021 at 6:40 AM Dinakar Chennubotla <
> chennu.bigd...@gmail.com> wrote:
>
>> Hi All, I am Dinakar and I am an admin. I have a question:
>> is it possible to run distributed Spark jobs in an Apache Spark
>> standalone cluster? If yes, could someone help with the docs or web pages
>> so that I can create and test it? Thanks in advance,
>> Dinakar
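The standalone docs linked above boil down to a few commands. A minimal sketch, assuming Spark 3.x is unpacked at `$SPARK_HOME` on every node and the master host is reachable as `master-host` (a hypothetical hostname — substitute your own), with the Spark version in the examples jar name being an assumption as well:

```shell
# On the master node: start the standalone master
# (web UI on port 8080, master RPC on port 7077 by default).
$SPARK_HOME/sbin/start-master.sh

# On each worker node: start a worker pointing at the master.
$SPARK_HOME/sbin/start-worker.sh spark://master-host:7077

# From any machine with Spark installed: submit a distributed job
# to the standalone master. SparkPi ships with Spark as an example.
$SPARK_HOME/bin/spark-submit \
  --master spark://master-host:7077 \
  --class org.apache.spark.examples.SparkPi \
  "$SPARK_HOME"/examples/jars/spark-examples_2.12-3.2.0.jar 100
```

Once the workers register, the master's web UI at `http://master-host:8080` should list them, and submitted jobs will be distributed across them.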
>> Having Hadoop implies that you already have a YARN resource manager plus
>> HDFS. YARN is the most widely used resource manager on-premise for Spark.
>>
>> Provide some additional info and we will go from there.
>>
>> HTH
>>
>>
>>
>>
>
> Have you checked out the docs?
> https://spark.apache.org/docs/latest/spark-standalone.html
>
> Thanks,
> Khalid
>
> On Sat, Jul 24, 2021 at 1:45 PM Dinakar Chennubotla <
> chennu.bigd...@gmail.com> wrote:
>
>> Hi All,
>>
>> I am Dinakar, a Hadoop admin.
>> Could someone help me here?
>> 1. I have a DEV-POC task to do.
>> 2. I need to install a distributed Apache Spark cluster, in cluster mode,
>> on Docker containers,
>> 3. with scalable Spark worker containers.
>> 4. We have a 9-node cluster with some other services and tools.
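One way to sketch steps 2 and 3 with plain `docker run`, assuming the `apache/spark` image (the `3.2.0` tag and the `/opt/spark` install path inside it are assumptions — check the image you actually pull or build) and a user-defined Docker network so containers can resolve each other by name:

```shell
# Shared network so master and workers can find each other by container name.
docker network create spark-net

# Standalone master container; expose the web UI (8080) and master port (7077).
docker run -d --name spark-master --network spark-net \
  -p 8080:8080 -p 7077:7077 \
  apache/spark:3.2.0 \
  /opt/spark/bin/spark-class org.apache.spark.deploy.master.Master

# Worker container pointing at the master. To scale out, repeat with
# spark-worker-2, spark-worker-3, ... -- each registers with the master.
docker run -d --name spark-worker-1 --network spark-net \
  apache/spark:3.2.0 \
  /opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker \
  spark://spark-master:7077
```

For a 9-node cluster this is usually wrapped in Docker Compose or an orchestrator (Swarm/Kubernetes) rather than run by hand, which also makes the worker count a single scalable parameter.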