Re: Spark 1.3 build with hive support fails

2015-03-30 Thread nightwolf
I am having the same problems. Did you find a fix?

Re: java.lang.IllegalStateException: unread block data while running the sample WordCount program from Eclipse

2014-08-05 Thread nightwolf
Did you ever find a solution to this problem? I'm having similar issues.

Re: Spark Deployment Patterns - Automated Deployment Performance Testing

2014-08-05 Thread nightwolf
Thanks AL! That's what I thought. I've set up Nexus to maintain the Spark libs and download them when needed, for development purposes. Suppose we have a dev cluster. Is it possible to run the driver program locally (on a developer's machine)? I.e. just run the driver from the IDE and have it
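Roughly what I have in mind, as a minimal sketch (the master URL reuses the hadoop-001 host mentioned elsewhere in this thread; the app name and jar path are placeholders):

import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: run the driver on a developer machine against a remote
// standalone master, pulling Spark itself from Nexus as a normal dependency.
object DevDriver extends App {
  val conf = new SparkConf()
    .setAppName("dev-driver")                       // placeholder name
    .setMaster("spark://hadoop-001:7077")           // dev cluster master
    // Ship the locally built application jar to the workers; without this,
    // tasks launched from an IDE typically fail with ClassNotFoundException.
    .setJars(Seq("target/scala-2.10/my-app-assembly.jar"))  // placeholder path

  val sc = new SparkContext(conf)
  // Trivial job to confirm the driver can talk to the cluster.
  println(sc.parallelize(1 to 1000).map(_ % 10).countByValue())
  sc.stop()
}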

Running driver/SparkContext locally

2014-08-05 Thread nightwolf
I'm trying to run a local driver (on a development machine) and have this driver communicate with the Spark master and workers; however, I'm having a few problems getting the driver to connect and run a simple job from within an IDE. It all looks like it works, but when I try to do something simple

Re: Running driver/SparkContext locally

2014-08-05 Thread nightwolf
The code for this example is very simple:

object SparkMain extends App with Serializable {
  val conf = new SparkConf(false)
    //.setAppName("cc-test")
    //.setMaster("spark://hadoop-001:7077")
    //.setSparkHome("/tmp")
    .set("spark.driver.host", "192.168.23.108")
    .set("spark.cores.max", "10")
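The snippet is cut off above; a hedged sketch of how such a driver object typically continues (building the context from the conf and running a trivial job to exercise the connection), assuming org.apache.spark.{SparkConf, SparkContext} are imported:

  val sc = new SparkContext(conf)
  // Trivial job just to confirm the driver can reach the master and workers.
  val sum = sc.parallelize(1 to 100).reduce(_ + _)
  println("sum = " + sum)
  sc.stop()
}

When the driver runs outside the cluster like this, spark.driver.host needs to be an address the workers can reach back to, and the application classes usually have to be shipped with conf.setJars(...); a classpath or version mismatch between driver and executors can surface as errors like the "unread block data" exception mentioned earlier.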

Spark Ooyala Job Server

2014-07-30 Thread nightwolf
Hi all, I'm trying to get the jobserver working with Spark 1.0.1. I've got it building, tests passing, and it connects to my Spark master (e.g. spark://hadoop-001:7077). I can also pre-create contexts; these show up in the Spark master console, i.e. on hadoop-001:8080. The problem is that after I
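As background, a minimal job for the Ooyala job server looks roughly like this (a sketch based on the jobserver's documented SparkJob trait; the spark.jobserver package and member names are from memory and may differ between versions):

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

// Minimal word-count-style job for the job server (sketch only).
object SimpleCountJob extends SparkJob {
  // Accept every input config; a real job would check required keys here.
  override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid

  // The job server invokes runJob with the pre-created (or ad-hoc) context.
  override def runJob(sc: SparkContext, config: Config): Any = {
    val words = config.getString("input.string").split("\\s+").toSeq
    sc.parallelize(words).countByValue()
  }
}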