Hi,

If you have just one physical machine, I would try Docker containers rather
than full VMs (full VMs would waste memory and CPU).
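
To make that concrete, here is a minimal sketch of how a job could be sized
for the single box with a standalone Spark setup (assumptions: a standalone
master/worker already running on the same host at spark://localhost:7077; the
core and memory figures are illustrative, not tuned recommendations):

import org.apache.spark.{SparkConf, SparkContext}

object SingleBoxJob {
  def main(args: Array[String]): Unit = {
    // Roughly the "4 x (4 cores / 64 GB)" split from the question below,
    // but as executors inside one OS rather than separate VMs.
    val conf = new SparkConf()
      .setAppName("single-box-job")
      .setMaster("spark://localhost:7077")    // assumed local standalone master
      .set("spark.executor.cores", "4")       // 4 executors x 4 cores = 16 cores total
      .set("spark.executor.memory", "48g")    // leave headroom for the OS, driver and daemons
    val sc = new SparkContext(conf)

    // ... job logic ...

    sc.stop()
  }
}

Note that driver memory cannot be raised from inside the application once the
JVM is running; pass it at launch time (e.g. spark-submit --driver-memory) if
the driver needs more than the default.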

Best regards
On 20 Apr 2015 at 00:11, "hnahak" <harihar1...@gmail.com> wrote:

> Hi All,
>
> I have a big physical machine with 16 CPUs, 256 GB RAM, and a 20 TB hard
> disk. I just need to know: what is the best way to set up a Spark cluster
> on it?
>
> If I need to process TBs of data, which setup is best?
> 1. Only one machine, containing the driver, executors, job tracker, and
> task tracker all together.
> 2. Create 4 VMs, each with 4 CPUs and 64 GB RAM.
> 3. Create 8 VMs, each with 2 CPUs and 32 GB RAM.
>
> Please give me your views/suggestions.
>
>
>
