Great, thanks for the suggestion!

On Mon, 3 Jul, 2023, 11:23 pm Thomas Kramer, <[email protected]> wrote:

> Why not do the same for the Ignite nodes, then? You create a
> start-ignite-nodes.sh script that holds all the host addresses and runs a
> command like "ssh <host-address> start-ignite.sh" for each of them. This
> is, of course, most efficient if you have exchanged SSH keys between the
> servers so you don't need to log in manually on each one.
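>
> A minimal sketch of such a start-ignite-nodes.sh, assuming the host
> addresses live one per line in a file named ignite-hosts.txt and that
> start-ignite.sh on each host launches Ignite in the background (both file
> names are placeholders, not something defined in this thread):
>
>   #!/usr/bin/env bash
>   # Start an Ignite node on every host listed in ignite-hosts.txt
>   # (one address per line). Relies on SSH keys already being exchanged,
>   # so no interactive login is required.
>   while read -r host; do
>     echo "Starting Ignite on ${host}..."
>     ssh "${host}" "./start-ignite.sh"
>   done < ignite-hosts.txt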
>
>
> On 03.07.23 18:43, Arunima Barik wrote:
>
> We are not using Docker currently.
>
> The Java implementation starts only one node, right? I require many nodes,
> and on different hosts.
>
> Regarding how I start the Spark workers: I run the start-workers.sh
> script, and I have already defined all the host addresses in the workers
> config file.
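>
> (For context, that workers config file is just a list of host addresses,
> one per line; the hostnames below are made-up examples.)
>
>   spark-worker-01.example.com
>   spark-worker-02.example.com
>   spark-worker-03.example.com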
>
> On Mon, 3 Jul, 2023, 8:51 pm Gianluca Bonetti, <[email protected]>
> wrote:
>
>> Hello Arunima
>>
>> I suppose you run Spark in containers, so you could create a custom
>> Docker image that starts Apache Ignite.
>>
>> You can also start the server nodes programmatically from Java code.
>>
>> Cheers
>> Gianluca
>>
>> On Mon, 3 Jul 2023 at 16:03, Arunima Barik <[email protected]>
>> wrote:
>>
>>> I have around 20 Spark worker nodes running on different hosts.
>>>
>>> I need to start an Ignite node on every such Spark worker.
>>>
>>> I know that we can SSH into each host and run the ignite.sh script to
>>> start the node.
>>>
>>> Is there a simpler way to do this, without having to SSH into 20 hosts
>>> manually?
>>>
>>> Regards
>>> Arunima
>>>
>>
