Hello all,

This is probably me doing something obviously wrong; I would really
appreciate some pointers on how to fix it.

I downloaded spark-1.3.1-bin-hadoop2.6.tgz from the Spark download page [
https://spark.apache.org/downloads.html] and just untarred it on a local
drive. I am on Mac OS X 10.9.5 and the JDK is 1.8.0_40.

I ran the following commands (the first three run successfully; I mention
them here to rule out the possibility of an obviously bad install).

1) laptop$ bin/spark-shell

scala> sc.parallelize(1 to 100).count()

res0: Long = 100

scala> exit

2) laptop$ bin/pyspark

>>> sc.parallelize(range(100)).count()

100

>>> quit()

3) laptop$ bin/spark-submit examples/src/main/python/pi.py

Pi is roughly 3.142800

4) laptop$ bin/run-example SparkPi

This hangs at the following line (the full output, including the stack
trace, is at the end of this mail):

15/05/23 07:52:10 INFO Executor: Fetching http://10.0.0.5:51575/jars/spark-examples-1.3.1-hadoop2.6.0.jar with timestamp 1432392670140

15/05/23 07:52:10 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)

java.net.SocketTimeoutException: connect timed out

...

and finally dies with this message:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.net.SocketTimeoutException: connect timed out
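
If it helps narrow things down, I can try fetching that URL by hand the
next time it hangs (the port is ephemeral, so it would have to come from
that run's log); something like:

laptop$ curl -m 10 -o /dev/null http://10.0.0.5:51575/jars/spark-examples-1.3.1-hadoop2.6.0.jar

If that also times out, I would conclude the problem is plain HTTP
connectivity to my own IP rather than anything Spark-specific.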


I checked with ifconfig -a on my box; 10.0.0.5 is my IP address on my local
network:

en0: flags=8863<UP,BROADCAST,SMART,RUNNING,SIMPLEX,MULTICAST> mtu 1500
	ether 34:36:3b:d2:b0:f4
	inet 10.0.0.5 netmask 0xffffff00 broadcast 10.0.0.255
	media: autoselect
	status: active
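
The log at the end shows the HTTP file server binding to 0.0.0.0:52009, so
in principle it should be reachable on 10.0.0.5. While the job hangs I can
also confirm that something is actually listening on that port (again, the
port number has to come from that run's log):

laptop$ lsof -nP -i TCP:52009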


I suspect there is some configuration I am missing. Being able to run jobs
locally (without HDFS or a cluster) is essential for development, and the
examples come straight from the Spark 1.3.1 Quick Start page [
https://spark.apache.org/docs/latest/quick-start.html], so this is probably
something in my environment.
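
One untested guess: since the executor is fetching the jar from my LAN
address, maybe I need to pin Spark to localhost, along the lines of

laptop$ SPARK_LOCAL_IP=127.0.0.1 bin/run-example SparkPi

(SPARK_LOCAL_IP is listed in conf/spark-env.sh.template, so I assume it
applies here.) Does that sound like the right direction, or should this
just work out of the box?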


Thanks in advance for any help you can provide.

-sujit

=====

Full output of the SparkPi run (including the stack trace) follows:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/05/23 08:08:55 INFO SparkContext: Running Spark version 1.3.1
15/05/23 08:08:57 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/05/23 08:08:57 INFO SecurityManager: Changing view acls to: palsujit
15/05/23 08:08:57 INFO SecurityManager: Changing modify acls to: palsujit
15/05/23 08:08:57 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(palsujit); users with modify permissions: Set(palsujit)
15/05/23 08:08:57 INFO Slf4jLogger: Slf4jLogger started
15/05/23 08:08:57 INFO Remoting: Starting remoting
15/05/23 08:08:58 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.0.5:52008]
15/05/23 08:08:58 INFO Utils: Successfully started service 'sparkDriver' on port 52008.
15/05/23 08:08:58 INFO SparkEnv: Registering MapOutputTracker
15/05/23 08:08:58 INFO SparkEnv: Registering BlockManagerMaster
15/05/23 08:08:58 INFO DiskBlockManager: Created local directory at /var/folders/z8/s_crq_2j2rqb9mv_4j8djsjnx359l2/T/spark-d97baddf-1b6f-41db-92bb-f82ab5184cb7/blockmgr-4ef3a194-1929-4dd3-a0e5-215175d8e41a
15/05/23 08:08:58 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
15/05/23 08:08:58 INFO HttpFileServer: HTTP File server directory is /var/folders/z8/s_crq_2j2rqb9mv_4j8djsjnx359l2/T/spark-fdf36480-def0-44b7-9942-098d9ef3e2b4/httpd-e494852a-7d61-4441-8b80-566d9f820afb
15/05/23 08:08:58 INFO HttpServer: Starting HTTP Server
15/05/23 08:08:58 INFO Server: jetty-8.y.z-SNAPSHOT
15/05/23 08:08:58 INFO AbstractConnector: Started SocketConnector@0.0.0.0:52009
15/05/23 08:08:58 INFO Utils: Successfully started service 'HTTP file server' on port 52009.
15/05/23 08:08:58 INFO SparkEnv: Registering OutputCommitCoordinator
15/05/23 08:08:58 INFO Server: jetty-8.y.z-SNAPSHOT
15/05/23 08:08:58 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/05/23 08:08:58 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/05/23 08:08:58 INFO SparkUI: Started SparkUI at http://10.0.0.5:4040
15/05/23 08:08:58 INFO SparkContext: Added JAR file:/Users/palsujit/Software/spark-1.3.1-bin-hadoop2.6/lib/spark-examples-1.3.1-hadoop2.6.0.jar at http://10.0.0.5:52009/jars/spark-examples-1.3.1-hadoop2.6.0.jar with timestamp 1432393738514
15/05/23 08:08:58 INFO Executor: Starting executor ID <driver> on host localhost
15/05/23 08:08:58 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@10.0.0.5:52008/user/HeartbeatReceiver
15/05/23 08:08:58 INFO NettyBlockTransferService: Server created on 52010
15/05/23 08:08:58 INFO BlockManagerMaster: Trying to register BlockManager
15/05/23 08:08:58 INFO BlockManagerMasterActor: Registering block manager localhost:52010 with 265.1 MB RAM, BlockManagerId(<driver>, localhost, 52010)
15/05/23 08:08:58 INFO BlockManagerMaster: Registered BlockManager
15/05/23 08:08:58 INFO SparkContext: Starting job: reduce at SparkPi.scala:35
15/05/23 08:08:58 INFO DAGScheduler: Got job 0 (reduce at SparkPi.scala:35) with 2 output partitions (allowLocal=false)
15/05/23 08:08:58 INFO DAGScheduler: Final stage: Stage 0(reduce at SparkPi.scala:35)
15/05/23 08:08:58 INFO DAGScheduler: Parents of final stage: List()
15/05/23 08:08:58 INFO DAGScheduler: Missing parents: List()
15/05/23 08:08:58 INFO DAGScheduler: Submitting Stage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:31), which has no missing parents
15/05/23 08:08:58 INFO MemoryStore: ensureFreeSpace(1848) called with curMem=0, maxMem=278019440
15/05/23 08:08:58 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1848.0 B, free 265.1 MB)
15/05/23 08:08:58 INFO MemoryStore: ensureFreeSpace(1296) called with curMem=1848, maxMem=278019440
15/05/23 08:08:58 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1296.0 B, free 265.1 MB)
15/05/23 08:08:58 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:52010 (size: 1296.0 B, free: 265.1 MB)
15/05/23 08:08:58 INFO BlockManagerMaster: Updated info of block broadcast_0_piece0
15/05/23 08:08:58 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:839
15/05/23 08:08:58 INFO DAGScheduler: Submitting 2 missing tasks from Stage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:31)
15/05/23 08:08:58 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
15/05/23 08:08:58 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1333 bytes)
15/05/23 08:08:58 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 1333 bytes)
15/05/23 08:08:58 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
15/05/23 08:08:58 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
15/05/23 08:08:58 INFO Executor: Fetching http://10.0.0.5:52009/jars/spark-examples-1.3.1-hadoop2.6.0.jar with timestamp 1432393738514
15/05/23 08:09:58 INFO Executor: Fetching http://10.0.0.5:52009/jars/spark-examples-1.3.1-hadoop2.6.0.jar with timestamp 1432393738514

15/05/23 08:09:58 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.net.SocketTimeoutException: connect timed out
	at java.net.PlainSocketImpl.socketConnect(Native Method)
	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:345)
	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.net.Socket.connect(Socket.java:589)
	at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
	at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
	at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
	at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
	at sun.net.www.http.HttpClient.New(HttpClient.java:308)
	at sun.net.www.http.HttpClient.New(HttpClient.java:326)
	at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1168)
	at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1104)
	at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:998)
	at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:932)
	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:610)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:431)
	at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:374)
	at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:366)
	at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
	at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
	at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
	at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:366)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:184)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

15/05/23 08:09:58 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, localhost): java.net.SocketTimeoutException: connect timed out
	... (same stack trace as above)


15/05/23 08:09:58 ERROR TaskSetManager: Task 1 in stage 0.0 failed 1 times; aborting job
15/05/23 08:09:58 INFO TaskSchedulerImpl: Cancelling stage 0
15/05/23 08:09:58 INFO Executor: Executor is trying to kill task 0.0 in stage 0.0 (TID 0)
15/05/23 08:09:58 INFO TaskSchedulerImpl: Stage 0 was cancelled
15/05/23 08:09:58 INFO DAGScheduler: Stage 0 (reduce at SparkPi.scala:35) failed in 60.065 s
15/05/23 08:09:58 INFO DAGScheduler: Job 0 failed: reduce at SparkPi.scala:35, took 60.191508 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 1 times, most recent failure: Lost task 1.0 in stage 0.0 (TID 1, localhost): java.net.SocketTimeoutException: connect timed out

	... (same stack trace as above)


Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1204)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1193)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1192)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1192)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:693)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1393)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1354)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
