Frankly, if you can give the VM enough CPU it should be fine...
but for development, setting up locally is better:
1. debuggable in an IDE
2. faster
3. includes samples like run-example, etc.
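For reference, a local setup along those lines is just a download and unpack. This is a minimal sketch assuming the Spark 0.9.1 prebuilt-for-Hadoop-2 tarball (adjust the version and URL to match your Hadoop distribution):

```shell
# Download and unpack a prebuilt Spark 0.9.x (version/URL assumed;
# pick the build matching your Hadoop version from the downloads page)
wget https://archive.apache.org/dist/spark/spark-0.9.1/spark-0.9.1-bin-hadoop2.tgz
tar -xzf spark-0.9.1-bin-hadoop2.tgz
cd spark-0.9.1-bin-hadoop2

# Interactive shell with a local master; sc (the SparkContext) is predefined
./bin/spark-shell

# Run a bundled example locally (0.9.x takes the master URL as an argument)
./bin/run-example org.apache.spark.examples.SparkPi local
```

No HDFS or YARN is needed for this; the shell and the examples run against the local master out of the box.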

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Wed, May 14, 2014 at 5:30 PM, Marco Shaw <marco.s...@gmail.com> wrote:

> Hi,
>
> I've wanted to play with Spark.  I wanted to fast track things and just
> use one of the vendor's "express VMs".  I've tried Cloudera CDH 5.0 and
> Hortonworks HDP 2.1.
>
> I've not written down all of my issues, but for certain, when I try to run
> spark-shell it doesn't work.  Cloudera seems to crash, and both complain
> when I try to use "SparkContext" in a simple Scala command.
>
> So, just a basic question on whether anyone has had success getting these
> express VMs to work properly with Spark *out of the box* (HDP does require
> you to install Spark manually).
>
> I know Cloudera recommends 8GB of RAM, but I've been running it with 4GB.
>
> Could it be that 4GB is just not enough and is causing these issues, or have
> others had success using these Hadoop 2.x pre-built VMs with Spark 0.9.x?
>
> Marco
>
