I have the same question. What if the data is around 12 GB?
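
For what it's worth, each service's heap is configured independently, so the sizing question mostly comes down to picking a per-process JVM heap plus broker disk. Here is a minimal sketch of where those knobs live; the heap values below are placeholders for illustration, not recommendations:

    # ZooKeeper: conf/java.env (sourced by zkEnv.sh)
    export JVMFLAGS="-Xms1g -Xmx1g"

    # Kafka: environment read by kafka-server-start.sh
    export KAFKA_HEAP_OPTS="-Xms4g -Xmx4g"

    # Storm: per-daemon/worker childopts in storm.yaml
    nimbus.childopts: "-Xmx1g"
    supervisor.childopts: "-Xmx1g"
    worker.childopts: "-Xmx2g"

Note that Kafka is typically run with a modest heap, leaving the rest of the machine's RAM to the OS page cache for log segments, so raw data volume tends to drive disk sizing more than heap sizing.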

On Monday, April 4, 2016, Sai Dilip Reddy Kiralam <
[email protected]> wrote:

> That depends on the amount of data you're working with.
>
>
>
> Best regards,
>
> K. Sai Dilip Reddy,
> Software Engineer - Hadoop Trainee,
> 2-39, Old SBI Road, Sri Nagar Colony, Gannavaram - 521101.
> www.aadhya-analytics.com
>
> On Sat, Mar 26, 2016 at 1:10 PM, sujitha chinnu <[email protected]> wrote:
>
>> Hi All,
>>
>>   I'm collecting Twitter data on my local machine in a single-node
>> (Linux/Ubuntu) cluster using a Storm topology. Now I want to move this to
>> production by buying AWS servers, so I need suggestions on capacity
>> planning for setting up a Kafka-Storm cluster.
>>
>> Can anyone suggest memory allocations for the following:
>>
>> 1. How much memory should I allocate to the ZooKeeper cluster?
>>
>> 2. How much memory should I allocate to the Supervisor & Nimbus nodes?
>>
>> 3. How much memory should I allocate to the Kafka cluster?
>>
>
>
