Hi Jakob, sorry for my late reply.
I tried to run the below; it came back with "netstat: lunt: unknown or
uninstrumented protocol".
I also tried uninstalling version 1.6.0 and installing version 1.5.2 with Java 7
and Scala version 2.10.6; I got the same error messages.
Do you think it would be worth me
Regarding my previous message: I forgot to mention to run netstat as
root (sudo netstat -plunt).
Sorry for the noise.
On Fri, Mar 11, 2016 at 12:29 AM, Jakob Odersky wrote:
Some more diagnostics/suggestions:
1) Are other services listening on ports in the 4000 range (run
"netstat -plunt")? Maybe there is an issue with the error message
itself.
2) Are you sure the correct Java version is used? Run "java -version".
3) Can you revert all installation attempts you have done?
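As a sanity check for point 1, here is a small stdlib-Python sketch that probes a few ports without needing netstat at all (the port range and localhost address are assumptions; 4040 is Spark's default UI port, adjust as needed):

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.2)
        # connect_ex returns 0 when the connection succeeds,
        # i.e. when a service is already listening there.
        return s.connect_ex((host, port)) == 0

# Probe a few ports in the range Spark tends to use.
busy = [p for p in range(4040, 4046) if port_in_use(p)]
print(busy)  # ports already taken; empty list means the range is free
```

On a machine with no Spark UI running this prints an empty list; any ports it reports are candidates for the conflict.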
If you type 'whoami' in the terminal and it responds with 'root', then you're
the superuser.
However, as mentioned below, I don't think it's a relevant factor.
> On Mar 10, 2016, at 12:02 PM, Aida Tefera wrote:
# - SPARK_PUBLIC_DNS, to set the public dns name of the master or workers
# Generic options for the daemons used in the standalone deploy mode
# - SPARK_CONF_DIR   Alternate conf dir. (Default: ${SPARK_HOME}/conf)
# - SPARK_LOG_DIR    Where log files are stored. (Default: ${SPARK_HOME}/logs)
# - SPARK_PID_DIR    Where the pid file is stored. (Default: /tmp)
Hi Tristan,
I'm afraid I wouldn't know whether I'm running it as superuser.
I have Java version 1.8.0_73 and Scala version 2.11.7.
Sent from my iPhone
> On 9 Mar 2016, at 21:58, Tristan Nixon wrote:
It really shouldn't; if anything, running as superuser should ALLOW you to bind
to ports 0, 1, etc.
It seems very strange that it should even be trying to bind to these ports -
maybe a JVM issue?
I wonder if the old Apple JVM implementations could have used some different
native libraries for
It should just work with these steps; you don't need to configure much. As
mentioned, some settings on your machine are overriding the default Spark
settings.
Even running as super-user should not be a problem; it works just fine as
super-user as well.
Can you tell us what version of Java you are
That’s very strange. I just un-set my SPARK_HOME env param, downloaded a fresh
1.6.0 tarball,
unzipped it to local dir (~/Downloads), and it ran just fine - the driver port
is some randomly generated large number.
So SPARK_HOME is definitely not needed to run this.
Aida, you are not running
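The "randomly generated large number" is standard ephemeral-port behaviour: asking the OS for port 0 makes it hand back a free high port, which is what a healthy spark-shell does for the driver. A minimal stdlib-Python illustration (not Spark code):

```python
import socket

# Binding to port 0 tells the OS to pick a free ephemeral port;
# the kernel hands back something well above 1024.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(("127.0.0.1", 0))
port = s.getsockname()[1]
print(port)  # e.g. a large number in the ephemeral range
s.close()
```

Spark actually failing on "port 0" therefore suggests something is overriding the port configuration rather than normal behaviour.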
Hi Jakob,
Tried running the command env|grep SPARK; nothing comes back.
Tried env|grep Spark; comes back with PWD=/Users/aidatefera/Spark, which is the
directory I created for Spark once I downloaded the tgz file.
Tried running ./bin/spark-shell; comes back with the same error as below, i.e.
could
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>
> at
>
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.
As Tristan mentioned, it looks as though Spark is trying to bind on
port 0 and then 1 (which is not allowed). Could it be that some
environment variables from your previous installation attempts are
polluting your configuration?
What does running "env | grep SPARK" show you?
Also, try running just
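An equivalent of "env | grep SPARK" in stdlib Python, in case the shell output is in doubt (this just scans for any variable whose name mentions SPARK; a clean shell prints an empty dict):

```python
import os

# Collect environment variables left over from earlier install attempts.
leftovers = {k: v for k, v in os.environ.items() if "SPARK" in k.upper()}
print(leftovers)  # {} on a clean shell
```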
> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
> at $iwC$$iwC.<init>(<console>:15)
> at .<init>(<console>:7)
> at .<clinit>(<console>)
> at $print(<console>)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMeth...
> at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
> at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
> at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopI...
> at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
> at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
> at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
> at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
> at org.apache.spark.repl.Main$.main(Main.scala:31)
> at org.apache.spark.repl.Main.main(Main.scala)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> at org.apache.spark.deploy.SparkSub...
> [INFO] Spark Project Streaming ........................... SKIPPED
> [INFO] Spark Project Catalyst ............................ SKIPPED
> [INFO] Spark Project SQL ................................. SKIPPED
<console>:16: error: not found: value sqlContext
         import sqlContext.implicits._
                ^
<console>:16: error: not found: value sqlContext
         import sqlContext.sql
                ^

scala>
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Installing-Spark-on-Mac-tp26397p26446.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
> On 8 Mar 2016, at 18:06, Aida wrote:
>
> Detected Maven Version: 3.0.3 is not in the allowed range 3.3.3.
I'd look at that error message and fix it: the build requires Maven 3.3.3 or
newer, and 3.0.3 was detected.
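The enforcer failure is a plain version comparison: 3.0.3 sorts below the required minimum 3.3.3. A stdlib-Python sketch of the numeric idea (the enforcer's own rule logic is more elaborate; this is just an illustration):

```python
def version_tuple(v):
    """Turn a dotted version string like '3.0.3' into (3, 0, 3)."""
    return tuple(int(part) for part in v.split("."))

detected, required = "3.0.3", "3.3.3"
ok = version_tuple(detected) >= version_tuple(required)
print(ok)  # False: Maven must be upgraded to 3.3.3 or newer
```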
> [INFO] Spark Project Docker Integration Tests ............ SKIPPED
> [INFO] Spark Project REPL ................................ SKIPPED
> [INFO] Spark Project Assembly ............................ SKIPPED
> [INFO] Spark Project Tools ............................... SKIPPED
> [INFO] Spark Project Hive ................................ SKIPPED
> [INFO] Spark Project External Twitter .................... SKIPPED
> [INFO] Spark Project External Flume Sink ................. SKIPPED
> [INFO] Spark Project External Flume ...................... SKIPPED
> [INFO] Spark Project External ZeroMQ ..................... SKIPPED
> [INFO] Spark Project External Kafka ...................... SKIPPED
> [INFO] Spark Project Examples ............................ SKIPPED
> [INFO] Total time: 1.745s
> [INFO] Finished at: Tue Mar 08 18:01:48 GMT 2016
> [INFO] Final Memory: 19M/183M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-enforcer-plugin:1.4:enforce
> (enforce-versions) on project spark-parent_2.10: Some Enforcer rules have
> failed. Look above for specific messages explaining why the rule failed.
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e
> switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
-reader.git
/Users/aidatefera/.sbt/0.13/staging/ad8e8574a5bcb2d22d23/sbt-pom-reader
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
ukdrfs01:spark-1.6.0 aidatefera$
Installing Spark on a Mac is similar to how you install it on Linux.
I use a Mac and have written a blog post on how to install Spark; here is the
link: http://vishnuviswanath.com/spark_start.html
Hope this helps.
On Fri, Mar 4, 2016 at 2:29 PM, Simon Hafner <reactorm...@gmail.com> wrote:
> Do I need to download anything else?
>
> I am very eager to learn more about Spark but am unsure about the best way
> to do it.
>
> I would be happy for any suggestions or ideas
>
> Many thanks,
>
> Aida
>
>
>
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org