No luck.
But two updates:
1. I have downloaded spark-1.4.1 and everything works fine; I don't see any
error.
2. I have added the following XML file to Spark 1.5.2's conf directory and
now I get the following error:
Caused by: java.lang.RuntimeException: The root scratch dir:
c:/Users/marco/tmp on
hopping on a plane, but check the hive-site.xml that's in your spark/conf
directory (or should be, anyway). I believe you can change the root path through
this mechanism.
If not, this should give you more info to google on.
Let me know, as this comes up a fair amount.
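A minimal hive-site.xml sketch of the kind of change being suggested: pointing Hive's scratch directory at a path the current user can write to. The property name hive.exec.scratchdir is Hive's standard setting for this; the path value below is only an example, not something from the thread.

```xml
<configuration>
  <!-- Point Hive's scratch space at a directory the current user can
       actually write to. The path below is an example; any writable
       local directory should work. -->
  <property>
    <name>hive.exec.scratchdir</name>
    <value>C:/Users/marco/sparktmp</value>
  </property>
</configuration>
```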
> On Dec 19, 2015, at 4:58 PM,
Thanks Chris will give it a go and report back.
Bizarrely if I start the pyspark shell I don't see any issues
Kr
Marco
On 20 Dec 2015 5:02 pm, "Chris Fregly" wrote:
> hopping on a plane, but check the hive-site.xml that's in your spark/conf
> directory (or should be, anyway).
Hi all,
Posting this again, as I was experiencing this error under 1.5.1 as well.
I am running Spark 1.5.2 on a Windows 10 laptop (upgraded from Windows 8).
When I launch spark-shell I am getting this exception, presumably because I
have no admin rights to the /tmp directory on my laptop (Windows 8-10 seems
Looks like it is this PR:
https://github.com/mesos/spark-ec2/pull/133
On Tue, Aug 25, 2015 at 9:52 AM, Shivaram Venkataraman
shiva...@eecs.berkeley.edu wrote:
Yeah, that's a known issue and we have a PR out to fix it.
Shivaram
On Tue, Aug 25, 2015 at 7:39 AM, Garry Chen g...@cornell.edu
Corrected a typo in the subject of your email.
What you cited seems to be from worker node startup.
Was there any other error you saw?
Please list the command you used.
Cheers
On Tue, Aug 25, 2015 at 7:39 AM, Garry Chen g...@cornell.edu wrote:
Hi All,
I am trying to launch a Spark cluster on EC2 with Spark 1.4.1.
The script finished but I am getting the following error at the end. What
should I do to correct this issue? Thank you very much for your input.
Starting httpd: httpd: Syntax error on line 199 of
This thread seems related:
http://search-hadoop.com/m/JW1q51W02V
Cheers
On Wed, Apr 22, 2015 at 6:09 AM, James King jakwebin...@gmail.com wrote:
What's the best way to start up a Spark job as part of starting up the
Spark cluster?
I have a single uber jar for my job and want to make
Cool. I was thinking of waiting a second and doing ps aux | grep java | grep
jarname.jar, and I guess checking 4040 would work as well. Thanks for the
input.
Regards, Ashic.
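The ps-based check mentioned above could be sketched as a small shell function. This is only an illustration of the idea; "assembly.jar" is a placeholder for whatever the driver jar is actually called.

```shell
#!/bin/bash
# Sketch of the process check described above: look for a running java
# process whose command line mentions the given jar name.
jar_running() {
    # The "[j]ava" trick keeps the grep process itself out of the results.
    ps aux | grep "[j]ava" | grep -q "$1"
}

if jar_running "assembly.jar"; then
    echo "driver appears to be running"
else
    echo "driver not found"
fi
```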
Date: Sat, 24 Jan 2015 13:00:13 +0530
Subject: Re: Starting a spark streaming app in init.d
From: ak
Hello,
I'm trying to kick off a Spark streaming job to a standalone master using
spark-submit inside of init.d. This is what I have:
DAEMON="spark-submit --class Streamer --executor-memory 500M
--total-executor-cores 4 /path/to/assembly.jar"
start() {
$DAEMON -p
I'd do the same, but put in an extra condition to check whether the job has
successfully started by checking the application UI (port 4040's
availability would do; if you want something more complex, write a
parser for it) after putting the main script to sleep for some time
(say 2 minutes).
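A hedged sketch of what that start() could look like, combining the thread's spark-submit line with a poll on port 4040 (Spark's default application UI port) instead of a single blind sleep. The class name Streamer and the jar path come from the email above; the polling parameters are arbitrary choices.

```shell
#!/bin/bash
# Sketch of an init.d start() that launches the streaming driver and then
# waits for the Spark application UI (port 4040) to come up.

DAEMON="spark-submit --class Streamer --executor-memory 500M \
    --total-executor-cores 4 /path/to/assembly.jar"

port_open() {
    # Returns 0 if something is listening on localhost:$1 (bash /dev/tcp).
    (exec 3<>"/dev/tcp/localhost/$1") 2>/dev/null
}

start() {
    $DAEMON &
    # Poll the UI for up to ~2 minutes (24 attempts, 5 seconds apart).
    for _ in $(seq 1 24); do
        if port_open 4040; then
            echo "Spark streaming app is up."
            return 0
        fi
        sleep 5
    done
    echo "Spark streaming app never opened port 4040." >&2
    return 1
}
```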
Hello,
I have enabled Spark in the Quickstart VM
and Running SparkPi in Standalone Mode
reference:
http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH5/latest/CDH5-Installation-Guide/cdh5ig_running_spark_apps.html
If the question is likely about the Quickstart VM, it's better to ask
in the VM forum:
https://community.cloudera.com/t5/Apache-Hadoop-Concepts-and/bd-p/ApacheHadoopConcepts
Please give more detail though; it's not clear what you mean is not working.
On Sun, Aug 3, 2014 at 10:09 AM, Mahebub
Hello All,
I am a new user of Spark; I am using *cloudera-quickstart-vm-5.0.0-0-vmware*
to execute the sample examples of Spark.
I am very sorry for the silly and basic question.
I am not able to deploy and execute the sample examples of Spark.
Please suggest *how to start with Spark*.
Please help me.
Here's the complete overview http://spark.apache.org/docs/latest/
And here are the quick start guidelines:
http://spark.apache.org/docs/latest/quick-start.html
I would suggest you download the Spark pre-compiled binaries.
First thing... Go into the Cloudera Manager and make sure that the Spark
service (master?) is started.
Marco
On Thu, Jul 24, 2014 at 7:53 AM, Sameer Sayyed sam.sayyed...@gmail.com
wrote:
Hello All,
I am new user of spark, I am using *cloudera-quickstart-vm-5.0.0-0-vmware*
for execute
Hi Sameer,
I think it is much easier to start using Spark in standalone mode on a single
machine. Last time I tried Cloudera Manager to deploy Spark, it wasn't very
straightforward and I hit a couple of obstacles along the way. However,
standalone mode is a very easy way to start exploring Spark.
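The standalone route described above could look something like this. It assumes a pre-built Spark tarball has been downloaded (the 1.4.1/hadoop2.6 name below is just an example version from the thread); run-example, start-master.sh and start-slave.sh all ship with the Spark distribution.

```shell
# Unpack a pre-built Spark release (example version).
tar -xzf spark-1.4.1-bin-hadoop2.6.tgz
cd spark-1.4.1-bin-hadoop2.6

# Run an example immediately in local mode (no cluster needed):
./bin/run-example SparkPi 10

# Or bring up a single-machine standalone cluster:
./sbin/start-master.sh                        # web UI at http://localhost:8080
./sbin/start-slave.sh spark://localhost:7077  # attach a worker to the master
./bin/spark-shell --master spark://localhost:7077
```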
Hi Sam,
I tried Spark on Cloudera a couple of months ago, and there were a lot of issues…
Fortunately, I was able to switch to Hortonworks and everything works perfectly. In
general, you can try two modes: standalone and via YARN. Personally, I found
using Spark via YARN more comfortable, especially for
at
com.evocalize.rickshaw.spark.applications.GenerateSEOContent.main(GenerateSEOContent.java:82)
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Issues-starting-up-Spark-on-mesos-akka-version-tp8505.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.