Hi Tristan, 

I'm afraid I don't know whether I'm running it as the super-user.
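In case it helps, this is easy to check from the terminal (a minimal sketch using the standard POSIX `id` and `whoami` utilities):

```shell
# Print the numeric user ID of the current shell; 0 means root (super-user)
id -u
# Alternatively, whoami prints the user name ("root" for the super-user)
whoami
```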

I have Java version 1.8.0_73 and Scala version 2.11.7.
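These versions can be confirmed from the terminal (a sketch; the `command -v` guards are an assumption so each check only runs if the binary is on the PATH):

```shell
# Print the installed Java version, if java is on the PATH
command -v java >/dev/null && java -version 2>&1 | head -n 1
# Print the installed Scala version, if scala is on the PATH
# (scala -version writes to stderr and may exit non-zero, hence the || true)
command -v scala >/dev/null && scala -version 2>&1 || true
```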

Sent from my iPhone

> On 9 Mar 2016, at 21:58, Tristan Nixon <st...@memeticlabs.org> wrote:
> 
> That’s very strange. I just un-set my SPARK_HOME env param, downloaded a 
> fresh 1.6.0 tarball, 
> unzipped it to local dir (~/Downloads), and it ran just fine - the driver 
> port is some randomly generated large number.
> So SPARK_HOME is definitely not needed to run this.
> 
> Aida, you are not running this as the super-user, are you?  What versions of 
> Java & Scala do you have installed?
> 
>> On Mar 9, 2016, at 3:53 PM, Aida Tefera <aida1.tef...@gmail.com> wrote:
>> 
>> Hi Jakob,
>> 
>> Tried running the command env|grep SPARK; nothing comes back 
>> 
>> Tried env|grep Spark; Spark is the directory I created once I 
>> downloaded the tgz file; it comes back with PWD=/Users/aidatefera/Spark
>> 
>> Tried running ./bin/spark-shell; it comes back with the same error as below, 
>> i.e. could not bind to port 0, etc.
>> 
>> Sent from my iPhone
>> 
>>> On 9 Mar 2016, at 21:42, Jakob Odersky <ja...@odersky.com> wrote:
>>> 
>>> As Tristan mentioned, it looks as though Spark is trying to bind on
>>> port 0 and then 1 (which is not allowed). Could it be that some
>>> environment variables from your previous installation attempts are
>>> polluting your configuration?
>>> What does running "env | grep SPARK" show you?
>>> 
>>> Also, try running just "./bin/spark-shell" (without the --master
>>> argument), maybe your shell is doing some funky stuff with the
>>> brackets.
>> 
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
> 
