[...] with me trying to change the IP address in
SPARK_MASTER_IP to the IP address of the master node? If so, how would I go
about doing that?
Thanks,
Aida
Sent from my iPhone
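For what it's worth, a minimal sketch of what that change could look like in a standalone setup; the 192.168.1.10 address below is only a placeholder, not something from this thread:

  # conf/spark-env.sh (created by copying spark-env.sh.template)
  # Bind the standalone master to the master node's address
  export SPARK_MASTER_IP=192.168.1.10   # placeholder; substitute your master's IP

  # restart the standalone master so it picks up the change
  ./sbin/stop-master.sh
  ./sbin/start-master.sh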
> On 11 Mar 2016, at 08:37, Jakob Odersky wrote:
>
> regarding my previous message, I forgot to mention to [...]
Hi Gaini, thanks for your response.
Please see below the contents of the files in the conf directory:
1. docker.properties.template
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership. [...]
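All of the files in conf/ ship as *.template placeholders like the one above; Spark only picks up the settings once a file is copied to its non-template name. A sketch of that first step, assuming you are inside the unpacked Spark directory:

  cd conf
  cp spark-env.sh.template spark-env.sh
  cp spark-defaults.conf.template spark-defaults.conf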
> [...] downloaded a fresh 1.6.0 tarball,
> unzipped it to local dir (~/Downloads), and it ran just fine - the driver
> port is some randomly generated large number.
> So SPARK_HOME is definitely not needed to run this.
>
> Aida, you are not running this as the super-user, are you? What [...]
Hi Jakob,
Tried running the command env | grep SPARK; nothing comes back.
Tried env | grep Spark (Spark being the directory I created once I downloaded
the tgz file); it comes back with PWD=/Users/aidatefera/Spark.
Tried running ./bin/spark-shell; it comes back with the same error as below, i.e.
could [...]
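As a side note, env | grep Spark matching PWD=/Users/aidatefera/Spark only means the word "Spark" appears in your current directory path, not that any Spark variable is set. A slightly more targeted check could look like this (a sketch; standard macOS shell commands):

  env | grep '^SPARK'      # only variables whose names start with SPARK
  pwd                      # should be the unpacked spark-1.6.0 directory
  ls bin/spark-shell       # confirms the script exists here before running it
  ./bin/spark-shell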
> [...] the current directory for the /conf dir.
>
> The defaults should be relatively safe; I’ve been using them with local mode
> on my Mac for a long while without any need to change them.
>
>> On Mar 9, 2016, at 2:20 PM, Aida Tefera wrote:
>>
>> I don't think [...]
>> [...] the two that you should look at are:
>> spark-defaults.conf
>> spark-env.sh
>>
>>
>>> On Mar 9, 2016, at 1:45 PM, Aida Tefera wrote:
>>>
>>> Hi Tristan, thanks for your message
>>>
>>> When I look at the spark-defaults.conf.template [...]
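For what it's worth, spark-defaults.conf.template is only the commented-out starting point; the file Spark actually reads is conf/spark-defaults.conf. A sketch of a minimal one for local experiments (every line is optional and the values are illustrative, not taken from this thread):

  # conf/spark-defaults.conf
  spark.master          local[2]
  spark.driver.memory   1g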
>> [...] would be to run on a high-numbered port like 8080 or such.
>>
>> What do you have in your spark-env.sh?
>>
>>> On Mar 9, 2016, at 12:35 PM, Aida wrote:
>>>
>>> Hi everyone, thanks for all your support
>>>
>>> I went [...]
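On the spark-env.sh and port question quoted above: in a standalone setup the port settings live in conf/spark-env.sh. A sketch with the stock defaults, shown only to illustrate where they would go; none of this is needed for plain local[2] runs:

  # conf/spark-env.sh
  export SPARK_MASTER_PORT=7077         # port workers and drivers connect to
  export SPARK_MASTER_WEBUI_PORT=8080   # the master's web UI port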
[...] messages etc.; do I need to do anything else?
Thanks,
Aida
Sent from my iPhone
> On 8 Mar 2016, at 22:42, Jakob Odersky wrote:
>
> I've had some issues myself with the user-provided Hadoop version.
> If you just want to get started, I would recommend downloading
> Spark [...]
Hi everyone, thanks for all your support
I went with your suggestion, Cody/Jakob, and downloaded a pre-built version
with Hadoop this time, and I think I am finally making some progress :)
ukdrfs01:spark-1.6.0-bin-hadoop2.6 aidatefera$ ./bin/spark-shell --master
local[2]
log4j:WARN No appenders could be found for logger [...]
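That log4j warning by itself is usually harmless; it just means there is no conf/log4j.properties yet. If you want to control the shell's verbosity, the usual step is a sketch like this, run from the Spark directory:

  cp conf/log4j.properties.template conf/log4j.properties
  # then edit conf/log4j.properties and change
  #   log4j.rootCategory=INFO, console
  # to
  #   log4j.rootCategory=WARN, console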
> [...] What happens when you do
>
> ./bin/spark-shell --master local[2]
>
> or
>
> ./bin/start-all.sh
>
>
>
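Once the shell does come up with either of those, a quick smoke test (a sketch; the numbers are arbitrary) confirms the local executors actually work:

  ./bin/spark-shell --master local[2]
  scala> sc.parallelize(1 to 100).sum()
  res0: Double = 5050.0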
>> On Tue, Mar 8, 2016 at 3:45 PM, Aida Tefera wrote:
>> Hi Cody, thanks for your reply
>>
>> I tried "sbt/sbt clean assembly" in the [...]
Hi Cody, thanks for your reply
I tried "sbt/sbt clean assembly" in the Terminal; somehow I still end up with
errors.
I have looked at the below links, doesn't give much detail on how to install it
before executing "./sbin/start-master.sh"
Thanks,
Aida
Sent from my iPhone
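For the from-source route being discussed here, a rough sketch of the order of operations with Spark 1.6: the assembly build has to finish before ./sbin/start-master.sh has anything to run.

  build/sbt clean assembly        # or: build/mvn -DskipTests clean package
  ./sbin/start-master.sh          # master web UI on port 8080 by default
  ./sbin/start-slave.sh spark://$(hostname):7077   # optional: attach one local worker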
[...] tried sbt/sbt package; it seemed to run fine until it didn't. I was wondering
whether the error below has to do with my JVM version. Any thoughts? Thanks
ukdrfs01:~ aidatefera$ cd Spark
ukdrfs01:Spark aidatefera$ cd spark-1.6.0
ukdrfs01:spark-1.6.0 aidatefera$ sbt/sbt package
NOTE: The sbt/sbt script h[...]
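On the JVM question: Spark 1.6 needs Java 7 or newer, and on a Mac it is easy to end up with several JDKs installed. A sketch of how to see which one the build will pick up (the last command is macOS-specific):

  java -version
  echo $JAVA_HOME
  /usr/libexec/java_home -V    # lists every installed JDK and its path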
Hi all,
Thanks everyone for your responses; really appreciate it.
Eduardo - I tried your suggestions but ran into some issues; please see
below:
ukdrfs01:Spark aidatefera$ cd spark-1.6.0
ukdrfs01:spark-1.6.0 aidatefera$ build/mvn -DskipTests clean package
Using `mvn` from path: /usr/bin/mvn
Java [...]
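If the Maven route keeps failing, one thing the Spark build documentation calls out for 1.x is giving Maven more memory before invoking build/mvn; a sketch with roughly the values the docs suggest:

  export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
  build/mvn -DskipTests clean package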
[...] planning to at this stage.
I also downloaded Scala from the Scala website; do I need to download
anything else?
I am very eager to learn more about Spark but am unsure about the best way
to do it.
I would welcome any suggestions or ideas.
Many thanks,
Aida