Hi Maheshakya,
I've fixed this issue. Please take an update of the carbon-analytics repo.
Regards
On Mon, Jul 27, 2015 at 12:12 PM, Maheshakya Wijewardena
mahesha...@wso2.com wrote:
Hi Niranda,
I started them one by one.
On Mon, Jul 27, 2015 at 12:07 PM, Niranda Perera nira...@wso2.com wrote:
Thanks Niranda.
Best regards.
On Mon, Jul 27, 2015 at 5:40 PM, Niranda Perera nira...@wso2.com wrote:
Hi Maheshakya,
I will look into this.
According to your setting, the ideal scenario is:
node1 - spark master (active) + worker
node2 - spark master (standby) + worker
node3 - worker
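The intended assignment above can be sketched as follows. This is a simplified illustration only, not carbon-analytics' actual leader-election code; `assign_roles` is a hypothetical helper that models the rule "the first `master_count` members to join become masters (one active, the rest standby), and every master node also runs a worker":

```python
def assign_roles(nodes, master_count=2):
    """Return a mapping of node name -> list of Spark roles,
    following the topology described above."""
    roles = {}
    for i, node in enumerate(nodes):
        node_roles = []
        if i < master_count:
            # First node to join is the active master; later masters are standby.
            node_roles.append("master (active)" if i == 0 else "master (standby)")
        # Every master node also hosts a worker; remaining nodes are workers only.
        node_roles.append("worker")
        roles[node] = node_roles
    return roles

print(assign_roles(["node1", "node2", "node3"]))
```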
Did you start the servers all at once or one by one?
Regards
On Mon, Jul 27, 2015 at 11:07 AM, Anjana Fernando wrote:
Hi,
Actually, when the 3rd server has started up, all 3 servers should have
worker instances. This seems to be a bug. @Niranda, please check it out
ASAP.
Cheers,
Anjana.
On Mon, Jul 27, 2015 at 11:01 AM, Maheshakya Wijewardena
mahesha...@wso2.com wrote:
Hi,
I have tried to create a Spark cluster with DAS using Carbon clustering. 3
DAS nodes are configured and the number of Spark masters is set to 2. In this
setting, one of the 3 nodes should run only a Spark worker, but all 3
nodes are starting as Spark masters. What can be the reason for this?
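For reference, a sketch of the relevant setting in the setup described above, assuming DAS 3.x defaults (the file path and property name may differ in your version, so verify against your distribution):

```properties
# <DAS_HOME>/repository/conf/analytics/spark/spark-defaults.conf
# Number of Carbon cluster members that should act as Spark masters.
carbon.spark.master.count  2
```

With Carbon clustering enabled (in axis2.xml) and this count set to 2, the expectation is two master nodes (one active, one standby) and a worker on every node.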