Hi everyone,

I have a Spark application that works fine on a standalone Spark cluster
running on my laptop (master and one worker), but fails when I try to run it
on a standalone Spark cluster deployed on EC2 (master and worker on
different machines).

The application is structured as follows: a Java process (the 'message
processor') runs on the same machine as the Spark master. When it starts, it
submits itself to the Spark master; it then listens on SQS and, for each
received message, runs a Spark job to process a file from S3 whose address
is given in the message. Everything appears to fail at the point where the
Spark driver tries to send the job to the Spark executor.

Below is the code from the 'message processor' that configures the
SparkContext (a simplified sketch of the surrounding polling loop follows
it), then the Spark driver log, and then the Spark executor log. The outputs
of my code and some important points are marked with asterisks, and I've
simplified the code and logs in some places for the sake of readability.

I would appreciate your help very much, because I've run out of ideas with
this problem.

'message processor' code:
======================================================================================================================================||
logger.info("Started Integration Hub SubmitDriver in test mode.");

SparkConf sparkConf = new SparkConf()
        .setMaster(SPARK_MASTER_URI)
        .setAppName(APPLICATION_NAME)
        .setSparkHome(SPARK_LOCATION_ON_EC2_MACHINE);

// Ship the jar that contains this class to the executors
sparkConf.setJars(JavaSparkContext.jarOfClass(this.getClass()));

// Configure the Spark executors to use the log4j properties file
// located in the local Spark conf dir
sparkConf.set("spark.executor.extraJavaOptions",
        "-XX:+UseConcMarkSweepGC -Dlog4j.configuration=log4j_integrationhub_sparkexecutor.properties");

sparkConf.set("spark.executor.memory", "1g");
sparkConf.set("spark.cores.max", "3");
// Spill shuffle data to disk to avoid OutOfMemoryError, at the cost of
// reduced performance
sparkConf.set("spark.shuffle.spill", "true");

logger.info("Connecting Spark");
JavaSparkContext sc = new JavaSparkContext(sparkConf);

sc.hadoopConfiguration().set("fs.s3n.awsAccessKeyId", AWS_KEY);
sc.hadoopConfiguration().set("fs.s3n.awsSecretAccessKey", AWS_SECRET);

logger.info("Spark connected");
||
======================================================================================================================================
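
For context, since the listening loop itself is elided above, here is a
minimal sketch of how the 'message processor' drives a Spark job from each
SQS message. QUEUE_URL, the message layout, and the extractS3Path helper are
assumptions for illustration, not the actual ProcessorDriver code; only the
textFile/first calls mirror what the driver log shows failing below.

import java.util.List;
import com.amazonaws.services.sqs.AmazonSQSClient;
import com.amazonaws.services.sqs.model.DeleteMessageRequest;
import com.amazonaws.services.sqs.model.Message;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;
import org.apache.spark.api.java.JavaRDD;

// Simplified sketch, not the real code: 'sc' and 'logger' are the objects
// created in the configuration snippet above.
AmazonSQSClient sqs = new AmazonSQSClient();
while (true) {
    logger.info("Polling Message queue...");
    List<Message> messages =
            sqs.receiveMessage(new ReceiveMessageRequest(QUEUE_URL)).getMessages();
    for (Message message : messages) {
        logger.info("Received Message : " + message.getBody());
        // extractS3Path is a hypothetical helper that pulls the
        // "s3n://bucket/key" address out of the message JSON
        String s3Path = extractS3Path(message.getBody());
        JavaRDD<String> lines = sc.textFile(s3Path); // cf. textFile at ProcessorDriver.java:208
        String first = lines.first();                // cf. first at ConnectorUtil.java:605, where the job aborts
        // ... rest of the batch processing ...
        sqs.deleteMessage(new DeleteMessageRequest(QUEUE_URL, message.getReceiptHandle()));
    }
}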

Driver log:
======================================================================================================================================||
2015-05-01 07:47:14 INFO  ClassPathBeanDefinitionScanner:239 - JSR-330
'javax.inject.Named' annotation found and supported for component scanning
2015-05-01 07:47:14 INFO  AnnotationConfigApplicationContext:510 -
Refreshing
org.springframework.context.annotation.AnnotationConfigApplicationContext@5540b23b:
startup date [Fri May 01 07:47:14 UTC 2015]; root of context hierarchy
2015-05-01 07:47:14 INFO  AutowiredAnnotationBeanPostProcessor:140 - JSR-330
'javax.inject.Inject' annotation found and supported for autowiring
2015-05-01 07:47:14 INFO  DefaultListableBeanFactory:596 - Pre-instantiating
singletons in
org.springframework.beans.factory.support.DefaultListableBeanFactory@13f948e:
defining beans
[org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,integrationHubConfig,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor,processorInlineDriver,s3Accessor,cdFetchUtil,httpUtil,cdPushUtil,submitDriver,databaseLogger,connectorUtil,totangoDataValidations,environmentConfig,sesUtil,processorExecutor,processorDriver];
root of factory hierarchy
*2015-05-01 07:47:15 INFO  SubmitDriver:69 - Started Integration Hub SubmitDriver in test mode.
2015-05-01 07:47:15 INFO  SubmitDriver:101 - Connecting Spark*
2015-05-01 07:47:15 INFO  SparkContext:59 - Running Spark version 1.3.0
2015-05-01 07:47:16 WARN  NativeCodeLoader:62 - Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
2015-05-01 07:47:16 INFO  SecurityManager:59 - Changing view acls to: hadoop
2015-05-01 07:47:16 INFO  SecurityManager:59 - Changing modify acls to:
hadoop
2015-05-01 07:47:16 INFO  SecurityManager:59 - SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(hadoop); users with modify permissions: Set(hadoop)
2015-05-01 07:47:18 INFO  Slf4jLogger:80 - Slf4jLogger started
2015-05-01 07:47:18 INFO  Remoting:74 - Starting remoting
2015-05-01 07:47:18 INFO  Remoting:74 - Remoting started; listening on
addresses :[akka.tcp://sparkDriver@sparkMasterIp:39176]
2015-05-01 07:47:18 INFO  Utils:59 - Successfully started service
'sparkDriver' on port 39176.
2015-05-01 07:47:18 INFO  SparkEnv:59 - Registering MapOutputTracker
2015-05-01 07:47:18 INFO  SparkEnv:59 - Registering BlockManagerMaster
2015-05-01 07:47:18 INFO  HttpFileServer:59 - HTTP File server directory is
/tmp/spark-e4726219-5708-48c9-8377-c103ad1e7a75/httpd-fe68500f-01b1-4241-a3a2-3b4cf8394daf
2015-05-01 07:47:18 INFO  HttpServer:59 - Starting HTTP Server
2015-05-01 07:47:19 INFO  Server:272 - jetty-8.y.z-SNAPSHOT
2015-05-01 07:47:19 INFO  AbstractConnector:338 - Started
SocketConnector@0.0.0.0:47166
2015-05-01 07:47:19 INFO  Utils:59 - Successfully started service 'HTTP file
server' on port 47166.
2015-05-01 07:47:19 INFO  SparkEnv:59 - Registering OutputCommitCoordinator
2015-05-01 07:47:24 INFO  Server:272 - jetty-8.y.z-SNAPSHOT
2015-05-01 07:47:24 INFO  AbstractConnector:338 - Started
SelectChannelConnector@0.0.0.0:4040
2015-05-01 07:47:24 INFO  Utils:59 - Successfully started service 'SparkUI'
on port 4040.
2015-05-01 07:47:24 INFO  SparkUI:59 - Started SparkUI at
http://sparkMasterIp:4040
2015-05-01 07:47:24 INFO  SparkContext:59 - Added JAR
/rev/8fcc3a5/integhub_be/genconn/lib/genconn-8fcc3a5.jar at
http://sparkMasterIp:47166/jars/genconn-8fcc3a5.jar with timestamp
1430466444838
2015-05-01 07:47:24 INFO  AppClient$ClientActor:59 - Connecting to master
akka.tcp://sparkMaster@sparkMasterIp:7077/user/Master...
2015-05-01 07:47:25 INFO  AppClient$ClientActor:59 - Executor added:
app-20150501074725-0005/0 on worker-20150430140019-ip-sparkWorkerIp-38610
(sparkWorkerIp:38610) with 1 cores
2015-05-01 07:47:25 INFO  AppClient$ClientActor:59 - Executor updated:
app-20150501074725-0005/0 is now LOADING
2015-05-01 07:47:25 INFO  AppClient$ClientActor:59 - Executor updated:
app-20150501074725-0005/0 is now RUNNING
2015-05-01 07:47:25 INFO  NettyBlockTransferService:59 - Server created on
34024
*2015-05-01 07:47:26 INFO  SubmitDriver:116 - Spark connected
2015-05-01 07:47:26 INFO  SubmitDriver:125 - Connected to SQS... Listening on https://sqsAddress
2015-05-01 07:51:39 INFO  SubmitDriver:130 - Polling Message queue...
2015-05-01 07:51:47 INFO  SubmitDriver:148 - Received Message : {someMessage}
2015-05-01 07:51:47 INFO  SubmitDriver:158 - Process Input JSON*
2015-05-01 07:51:50 INFO  SparkContext:59 - Created broadcast 0 from textFile at ProcessorDriver.java:208
2015-05-01 07:51:52 INFO  FileInputFormat:253 - Total input paths to process
: 1
2015-05-01 07:51:52 INFO  SparkContext:59 - Starting job: first at
ConnectorUtil.java:605
2015-05-01 07:51:52 INFO  SparkContext:59 - Created broadcast 1 from
broadcast at DAGScheduler.scala:839
2015-05-01 07:51:52 WARN  TaskSetManager:71 - Lost task 0.0 in stage 0.0
(TID 0, sparkWorkerIp): java.io.EOFException
        at
java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2744)
        at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1032)
        at
org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
        at 
org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
        at org.apache.hadoop.io.UTF8.readChars(UTF8.java:216)
        at org.apache.hadoop.io.UTF8.readString(UTF8.java:208)
        at org.apache.hadoop.mapred.FileSplit.readFields(FileSplit.java:87)
        at 
org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:237)
        at 
org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:66)
        at
org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:43)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137)
        at
org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68)
        at
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

2015-05-01 07:51:52 ERROR TaskSetManager:75 - Task 0 in stage 0.0 failed 4
times; aborting job
2015-05-01 07:51:52 ERROR ProcessorDriver:261 - Error executing the batch
Operation..
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in
stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0
(TID 3, sparkWorkerIp): java.io.EOFException
        at
java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2744)
        at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1032)
        at
org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
        at 
org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
        at org.apache.hadoop.io.UTF8.readChars(UTF8.java:216)
        at org.apache.hadoop.io.UTF8.readString(UTF8.java:208)
        at org.apache.hadoop.mapred.FileSplit.readFields(FileSplit.java:87)
        at 
org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:237)
        at 
org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:66)
        at
org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:43)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137)
        at
org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68)
        at
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
        at
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1203)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1192)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1191)
        at
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1191)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
        at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
        at scala.Option.foreach(Option.scala:236)
        at
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:693)
        at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1393)
        at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1354)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
||
======================================================================================================================================

Worker log:
======================================================================================================================================||
2015-05-01 07:47:26 INFO  CoarseGrainedExecutorBackend:47 - Registered
signal handlers for [TERM, HUP, INT]
2015-05-01 07:47:26 DEBUG Configuration:227 - java.io.IOException: config()
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227)
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214)
        at
org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:78)
        at 
org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:43)
        at
org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:220)
        at 
org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
        at
org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:128)
        at
org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:224)
        at
org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)

2015-05-01 07:47:26 DEBUG Groups:139 -  Creating new Groups object
2015-05-01 07:47:27 DEBUG Groups:59 - Group mapping
impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping;
cacheTimeout=300000
2015-05-01 07:47:27 DEBUG Configuration:227 - java.io.IOException: config()
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227)
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214)
        at
org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
        at
org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
        at 
org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:79)
        at
org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:209)
        at
org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:226)
        at 
org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:44)
        at
org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:220)
        at 
org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
        at
org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:128)
        at
org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:224)
        at
org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)

2015-05-01 07:47:27 DEBUG SparkHadoopUtil:63 - running as user: hadoop
2015-05-01 07:47:27 DEBUG UserGroupInformation:146 - hadoop login
2015-05-01 07:47:27 DEBUG UserGroupInformation:95 - hadoop login commit
2015-05-01 07:47:27 DEBUG UserGroupInformation:125 - using local
user:UnixPrincipal: root
2015-05-01 07:47:27 DEBUG UserGroupInformation:493 - UGI loginUser:root
2015-05-01 07:47:27 DEBUG UserGroupInformation:1143 - PriviledgedAction
as:hadoop
from:org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
2015-05-01 07:47:27 INFO  SecurityManager:59 - Changing view acls to:
root,hadoop
2015-05-01 07:47:27 INFO  SecurityManager:59 - Changing modify acls to:
root,hadoop
2015-05-01 07:47:27 INFO  SecurityManager:59 - SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(root, hadoop); users with modify permissions: Set(root, hadoop)
2015-05-01 07:47:27 DEBUG SecurityManager:63 - SSLConfiguration for file
server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None,
trustStore=None, trustStorePassword=None, protocol=None,
enabledAlgorithms=Set()}
2015-05-01 07:47:27 DEBUG SecurityManager:63 - SSLConfiguration for Akka:
SSLOptions{enabled=false, keyStore=None, keyStorePassword=None,
trustStore=None, trustStorePassword=None, protocol=None,
enabledAlgorithms=Set()}
2015-05-01 07:47:27 DEBUG AkkaUtils:63 - In createActorSystem, requireCookie
is: off
2015-05-01 07:47:28 INFO  Slf4jLogger:80 - Slf4jLogger started
2015-05-01 07:47:28 INFO  Remoting:74 - Starting remoting
2015-05-01 07:47:29 INFO  Remoting:74 - Remoting started; listening on
addresses :[akka.tcp://driverPropsFetcher@sparkWorkerIp:49741]
2015-05-01 07:47:29 INFO  Utils:59 - Successfully started service
'driverPropsFetcher' on port 49741.
2015-05-01 07:47:29 INFO  RemoteActorRefProvider$RemotingTerminator:74 -
Shutting down remote daemon.
2015-05-01 07:47:29 INFO  RemoteActorRefProvider$RemotingTerminator:74 -
Remote daemon shut down; proceeding with flushing remote transports.
2015-05-01 07:47:29 INFO  SecurityManager:59 - Changing view acls to:
root,hadoop
2015-05-01 07:47:29 INFO  SecurityManager:59 - Changing modify acls to:
root,hadoop
2015-05-01 07:47:29 INFO  SecurityManager:59 - SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(root, hadoop); users with modify permissions: Set(root, hadoop)
2015-05-01 07:47:29 DEBUG SecurityManager:63 - SSLConfiguration for file
server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None,
trustStore=None, trustStorePassword=None, protocol=None,
enabledAlgorithms=Set()}
2015-05-01 07:47:29 DEBUG SecurityManager:63 - SSLConfiguration for Akka:
SSLOptions{enabled=false, keyStore=None, keyStorePassword=None,
trustStore=None, trustStorePassword=None, protocol=None,
enabledAlgorithms=Set()}
2015-05-01 07:47:29 DEBUG AkkaUtils:63 - In createActorSystem, requireCookie
is: off
2015-05-01 07:47:29 INFO  RemoteActorRefProvider$RemotingTerminator:74 -
Remoting shut down.
2015-05-01 07:47:29 INFO  Slf4jLogger:80 - Slf4jLogger started
2015-05-01 07:47:29 INFO  Remoting:74 - Starting remoting
2015-05-01 07:47:29 INFO  Remoting:74 - Remoting started; listening on
addresses :[akka.tcp://sparkExecutor@sparkWorkerIp:45299]
2015-05-01 07:47:29 INFO  Utils:59 - Successfully started service
'sparkExecutor' on port 45299.
2015-05-01 07:47:29 DEBUG SparkEnv:63 - Using serializer: class
org.apache.spark.serializer.JavaSerializer
2015-05-01 07:47:29 INFO  AkkaUtils:59 - Connecting to MapOutputTracker:
akka.tcp://sparkDriver@sparkMasterIp:39176/user/MapOutputTracker
2015-05-01 07:47:30 INFO  AkkaUtils:59 - Connecting to BlockManagerMaster:
akka.tcp://sparkDriver@sparkMasterIp:39176/user/BlockManagerMaster
2015-05-01 07:47:30 INFO  DiskBlockManager:59 - Created local directory at
/mnt/spark/spark-d745cbac-d1cc-47ee-9eba-e99e104732d5/spark-e3963fa3-cab6-4c69-8e78-d23246250a5d/spark-6f1a9653-86fd-401f-bf37-6eca5b6c0adf/blockmgr-ee0e9452-4111-42d0-ab5e-e66317052e4b
2015-05-01 07:47:30 INFO  MemoryStore:59 - MemoryStore started with capacity
548.5 MB
2015-05-01 07:47:30 INFO  AkkaUtils:59 - Connecting to OutputCommitCoordinator:
akka.tcp://sparkDriver@sparkMasterIp:39176/user/OutputCommitCoordinator
2015-05-01 07:47:30 INFO  CoarseGrainedExecutorBackend:59 - Connecting to
driver: akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler
2015-05-01 07:47:30 INFO  WorkerWatcher:59 - Connecting to worker
akka.tcp://sparkWorker@sparkWorkerIp:38610/user/Worker
2015-05-01 07:47:30 DEBUG WorkerWatcher:50 - [actor] received message
Associated [akka.tcp://sparkExecutor@sparkWorkerIp:45299] ->
[akka.tcp://sparkWorker@sparkWorkerIp:38610] from
Actor[akka://sparkExecutor/deadLetters]
2015-05-01 07:47:30 INFO  WorkerWatcher:59 - Successfully connected to
akka.tcp://sparkWorker@sparkWorkerIp:38610/user/Worker
2015-05-01 07:47:30 DEBUG WorkerWatcher:56 - [actor] handled message
(1.18794 ms) Associated [akka.tcp://sparkExecutor@sparkWorkerIp:45299] ->
[akka.tcp://sparkWorker@sparkWorkerIp:38610] from
Actor[akka://sparkExecutor/deadLetters]
2015-05-01 07:47:30 DEBUG CoarseGrainedExecutorBackend:50 - [actor] received
message RegisteredExecutor from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:47:30 INFO  CoarseGrainedExecutorBackend:59 - Successfully
registered with driver
2015-05-01 07:47:30 INFO  Executor:59 - Starting executor ID 0 on host
sparkWorkerIp
2015-05-01 07:47:30 DEBUG InternalLoggerFactory:71 - Using SLF4J as the
default logging framework
2015-05-01 07:47:30 DEBUG PlatformDependent0:76 - java.nio.Buffer.address:
available
2015-05-01 07:47:30 DEBUG PlatformDependent0:76 - sun.misc.Unsafe.theUnsafe:
available
2015-05-01 07:47:30 DEBUG PlatformDependent0:71 -
sun.misc.Unsafe.copyMemory: available
2015-05-01 07:47:30 DEBUG PlatformDependent0:76 - java.nio.Bits.unaligned:
true
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - UID: 0
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - Java version: 7
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - -Dio.netty.noUnsafe: false
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - sun.misc.Unsafe: available
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - -Dio.netty.noJavassist:
false
2015-05-01 07:47:30 DEBUG PlatformDependent:71 - Javassist: unavailable
2015-05-01 07:47:30 DEBUG PlatformDependent:71 - You don't have Javassist in
your class path or you don't have enough permission to load dynamically
generated classes.  Please check the configuration for better performance.
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - -Dio.netty.tmpdir: /tmp
(java.io.tmpdir)
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - -Dio.netty.bitMode: 64
(sun.arch.data.model)
2015-05-01 07:47:30 DEBUG PlatformDependent:76 - -Dio.netty.noPreferDirect:
false
2015-05-01 07:47:30 DEBUG MultithreadEventLoopGroup:76 -
-Dio.netty.eventLoopThreads: 2
2015-05-01 07:47:30 DEBUG NioEventLoop:76 - -Dio.netty.noKeySetOptimization:
false
2015-05-01 07:47:30 DEBUG NioEventLoop:76 -
-Dio.netty.selectorAutoRebuildThreshold: 512
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 -
-Dio.netty.allocator.numHeapArenas: 1
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 -
-Dio.netty.allocator.numDirectArenas: 1
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 -
-Dio.netty.allocator.pageSize: 8192
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 -
-Dio.netty.allocator.maxOrder: 11
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 -
-Dio.netty.allocator.chunkSize: 16777216
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 -
-Dio.netty.allocator.tinyCacheSize: 512
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 -
-Dio.netty.allocator.smallCacheSize: 256
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 -
-Dio.netty.allocator.normalCacheSize: 64
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 -
-Dio.netty.allocator.maxCachedBufferCapacity: 32768
2015-05-01 07:47:30 DEBUG PooledByteBufAllocator:76 -
-Dio.netty.allocator.cacheTrimInterval: 8192
2015-05-01 07:47:30 DEBUG ThreadLocalRandom:71 -
-Dio.netty.initialSeedUniquifier: 0x4ac460da6a283b82 (took 1 ms)
2015-05-01 07:47:31 DEBUG ByteBufUtil:76 - -Dio.netty.allocator.type:
unpooled
2015-05-01 07:47:31 DEBUG ByteBufUtil:76 -
-Dio.netty.threadLocalDirectBufferSize: 65536
2015-05-01 07:47:31 DEBUG NetUtil:86 - Loopback interface: lo (lo,
0:0:0:0:0:0:0:1%1)
2015-05-01 07:47:31 DEBUG NetUtil:81 - /proc/sys/net/core/somaxconn: 128
2015-05-01 07:47:31 DEBUG TransportServer:106 - Shuffle server started on
port :46839
2015-05-01 07:47:31 INFO  NettyBlockTransferService:59 - Server created on
46839
2015-05-01 07:47:31 INFO  BlockManagerMaster:59 - Trying to register
BlockManager
2015-05-01 07:47:31 INFO  BlockManagerMaster:59 - Registered BlockManager
2015-05-01 07:47:31 INFO  AkkaUtils:59 - Connecting to HeartbeatReceiver:
akka.tcp://sparkDriver@sparkMasterIp:39176/user/HeartbeatReceiver
2015-05-01 07:47:31 DEBUG CoarseGrainedExecutorBackend:56 - [actor] handled
message (339.232401 ms) RegisteredExecutor from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 DEBUG CoarseGrainedExecutorBackend:50 - [actor] received
message LaunchTask(org.apache.spark.util.SerializableBuffer@608752bf) from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 INFO  CoarseGrainedExecutorBackend:59 - Got assigned
task 0
2015-05-01 07:51:52 DEBUG CoarseGrainedExecutorBackend:56 - [actor] handled
message (22.96474 ms)
LaunchTask(org.apache.spark.util.SerializableBuffer@608752bf) from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 INFO  Executor:59 - Running task 0.0 in stage 0.0 (TID
0)
2015-05-01 07:51:52 INFO  Executor:59 - Fetching
http://sparkMasterIp:47166/jars/genconn-8fcc3a5.jar with timestamp
1430466444838
2015-05-01 07:51:52 DEBUG Configuration:227 - java.io.IOException: config()
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227)
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214)
        at
org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:78)
        at
org.apache.spark.executor.Executor.hadoopConf$lzycompute$1(Executor.scala:356)
        at
org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$hadoopConf$1(Executor.scala:356)
        at
org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:375)
        at
org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:366)
        at
scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
        at
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at
scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
        at
scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
        at
org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:366)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:184)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

2015-05-01 07:51:52 DEBUG Utils:63 - fetchFile not using security
2015-05-01 07:51:52 INFO  Utils:59 - Fetching
http://sparkMasterIp:47166/jars/genconn-8fcc3a5.jar to
/mnt/spark/spark-d745cbac-d1cc-47ee-9eba-e99e104732d5/spark-e3963fa3-cab6-4c69-8e78-d23246250a5d/spark-0eabace1-ee89-48a3-9a71-0218f0ffc61c/fetchFileTemp2001054150131059247.tmp
2015-05-01 07:51:52 INFO  Utils:59 - Copying
/mnt/spark/spark-d745cbac-d1cc-47ee-9eba-e99e104732d5/spark-e3963fa3-cab6-4c69-8e78-d23246250a5d/spark-0eabace1-ee89-48a3-9a71-0218f0ffc61c/18615094621430466444838_cache
to /mnt/spark-work/app-20150501074725-0005/0/./genconn-8fcc3a5.jar
2015-05-01 07:51:52 INFO  Executor:59 - Adding
file:/mnt/spark-work/app-20150501074725-0005/0/./genconn-8fcc3a5.jar to
class loader
2015-05-01 07:51:52 DEBUG Configuration:227 - java.io.IOException: config()
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227)
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214)
        at
org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:42)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137)
        at
org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68)
        at
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

2015-05-01 07:51:52 ERROR Executor:96 - Exception in task 0.0 in stage 0.0
(TID 0)
java.io.EOFException
        at
java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2744)
        at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1032)
        at
org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
        at 
org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
        at org.apache.hadoop.io.UTF8.readChars(UTF8.java:216)
        at org.apache.hadoop.io.UTF8.readString(UTF8.java:208)
        at org.apache.hadoop.mapred.FileSplit.readFields(FileSplit.java:87)
        at 
org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:237)
        at 
org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:66)
        at
org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:43)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137)
        at
org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68)
        at
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
2015-05-01 07:51:52 DEBUG CoarseGrainedExecutorBackend:50 - [actor] received
message LaunchTask(org.apache.spark.util.SerializableBuffer@6fc1ffd1) from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 INFO  CoarseGrainedExecutorBackend:59 - Got assigned
task 1
2015-05-01 07:51:52 DEBUG CoarseGrainedExecutorBackend:56 - [actor] handled
message (0.978784 ms)
LaunchTask(org.apache.spark.util.SerializableBuffer@6fc1ffd1) from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 INFO  Executor:59 - Running task 0.1 in stage 0.0 (TID
1)
2015-05-01 07:51:52 DEBUG Configuration:227 - java.io.IOException: config()
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227)
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214)
        at
org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:42)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137)
        at
org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68)
        at
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

2015-05-01 07:51:52 ERROR Executor:96 - Exception in task 0.1 in stage 0.0
(TID 1)
java.io.EOFException
        at
java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2744)
        at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1032)
        at
org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
        at 
org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
        at org.apache.hadoop.io.UTF8.readChars(UTF8.java:216)
        at org.apache.hadoop.io.UTF8.readString(UTF8.java:208)
        at org.apache.hadoop.mapred.FileSplit.readFields(FileSplit.java:87)
        at 
org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:237)
        at 
org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:66)
        at
org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:43)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137)
        at
org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68)
        at
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
2015-05-01 07:51:52 DEBUG CoarseGrainedExecutorBackend:50 - [actor] received
message LaunchTask(org.apache.spark.util.SerializableBuffer@404f8fa1) from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 INFO  CoarseGrainedExecutorBackend:59 - Got assigned
task 2
2015-05-01 07:51:52 DEBUG CoarseGrainedExecutorBackend:56 - [actor] handled
message (0.94322 ms)
LaunchTask(org.apache.spark.util.SerializableBuffer@404f8fa1) from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 INFO  Executor:59 - Running task 0.2 in stage 0.0 (TID
2)
2015-05-01 07:51:52 DEBUG Configuration:227 - java.io.IOException: config()
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227)
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214)
        at
org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:42)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137)
        at
org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68)
        at
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

2015-05-01 07:51:52 ERROR Executor:96 - Exception in task 0.2 in stage 0.0
(TID 2)
java.io.EOFException
        at
java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2744)
        at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1032)
        at
org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
        at 
org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
        at org.apache.hadoop.io.UTF8.readChars(UTF8.java:216)
        at org.apache.hadoop.io.UTF8.readString(UTF8.java:208)
        at org.apache.hadoop.mapred.FileSplit.readFields(FileSplit.java:87)
        at 
org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:237)
        at 
org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:66)
        at
org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:43)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137)
        at
org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68)
        at
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
2015-05-01 07:51:52 DEBUG CoarseGrainedExecutorBackend:50 - [actor] received
message LaunchTask(org.apache.spark.util.SerializableBuffer@70fab733) from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 INFO  CoarseGrainedExecutorBackend:59 - Got assigned
task 3
2015-05-01 07:51:52 INFO  Executor:59 - Running task 0.3 in stage 0.0 (TID
3)
2015-05-01 07:51:52 DEBUG CoarseGrainedExecutorBackend:56 - [actor] handled
message (4.609909 ms)
LaunchTask(org.apache.spark.util.SerializableBuffer@70fab733) from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/user/CoarseGrainedScheduler#-970636338]
2015-05-01 07:51:52 DEBUG Configuration:227 - java.io.IOException: config()
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227)
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214)
        at
org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:42)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137)
        at
org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68)
        at
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

2015-05-01 07:51:52 ERROR Executor:96 - Exception in task 0.3 in stage 0.0
(TID 3)
java.io.EOFException
        at
java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2744)
        at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1032)
        at
org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
        at 
org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
        at org.apache.hadoop.io.UTF8.readChars(UTF8.java:216)
        at org.apache.hadoop.io.UTF8.readString(UTF8.java:208)
        at org.apache.hadoop.mapred.FileSplit.readFields(FileSplit.java:87)
        at 
org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:237)
        at 
org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:66)
        at
org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:43)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1137)
        at
org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at 
java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:68)
        at
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:185)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
2015-05-01 07:51:53 DEBUG BlockManagerSlaveActor:50 - [actor] received
message RemoveBroadcast(1,true) from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/temp/$a]
2015-05-01 07:51:53 DEBUG BlockManagerSlaveActor:56 - [actor] handled
message (3.423332 ms) RemoveBroadcast(1,true) from
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/temp/$a]
2015-05-01 07:51:53 DEBUG BlockManagerSlaveActor:63 - removing broadcast 1
2015-05-01 07:51:53 INFO  BlockManager:59 - Removing broadcast 1
2015-05-01 07:51:53 DEBUG BlockManagerSlaveActor:63 - Done removing
broadcast 1, response is 0
2015-05-01 07:51:53 DEBUG BlockManagerSlaveActor:63 - Sent response: 0 to
Actor[akka.tcp://sparkDriver@sparkMasterIp:39176/temp/$a]
||
======================================================================================================================================



