Hi guys, when reading data from S3 on AWS with Spark 1.5.1, one of the
tasks sometimes hangs during the read in a way I cannot reproduce
reliably: sometimes it hangs, sometimes it doesn't.

This is the thread dump from the hung task:

"Executor task launch worker-3" daemon prio=10 tid=0x00007f419c023000
nid=0x6548 runnable [0x00007f425df2b000]
   java.lang.Thread.State: RUNNABLE
        at java.net.SocketInputStream.socketRead0(Native Method)
        at java.net.SocketInputStream.read(SocketInputStream.java:152)
        at java.net.SocketInputStream.read(SocketInputStream.java:122)
        at sun.security.ssl.InputRecord.readFully(InputRecord.java:442)
        at sun.security.ssl.InputRecord.readV3Record(InputRecord.java:554)
        at sun.security.ssl.InputRecord.read(InputRecord.java:509)
        at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:934)
        - locked <0x00007f42c373b4d8> (a java.lang.Object)
        at
sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1332)
        - locked <0x00007f42c373b610> (a java.lang.Object)
        at
sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1359)
        at
sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1343)
        at
org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:533)
        at
org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:401)
        at
org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:177)
        at
org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:304)
        at
org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:610)
        at
org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:445)
        at
org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
        at
org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
        at
org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
        at
com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:384)
        at
com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:232)
        at
com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3528)
        at
com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:976)
        at
com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:956)
        at
org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:892)
        at
org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:77)
        at org.apache.avro.mapred.FsInput.<init>(FsInput.java:37)
        at
org.apache.avro.mapreduce.AvroRecordReaderBase.createSeekableInput(AvroRecordReaderBase.java:171)
        at
org.apache.avro.mapreduce.AvroRecordReaderBase.initialize(AvroRecordReaderBase.java:87)
        at
org.apache.spark.rdd.NewHadoopRDD$$anon$1.<init>(NewHadoopRDD.scala:153)
        at org.apache.spark.rdd.NewHadoopRDD.compute(NewHadoopRDD.scala:124)
        at org.apache.spark.rdd.NewHadoopRDD.compute(NewHadoopRDD.scala:65)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:297)
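
As far as I can tell, the task is blocked in SocketInputStream.socketRead0 during the initial TLS handshake of a getObjectMetadata call, i.e. waiting on a socket that apparently never times out. One thing I plan to try (just a sketch, not verified: I'm assuming these s3a timeout properties are honored by hadoop-aws 2.7.1 and that values are in milliseconds) is setting explicit connect/socket timeouts so a stuck handshake fails fast and the task can be retried:

# assumption: s3a timeout properties as documented for hadoop-aws 2.7.x, values in ms
--conf spark.hadoop.fs.s3a.connection.establish.timeout=5000 \
--conf spark.hadoop.fs.s3a.connection.timeout=60000 \
--conf spark.hadoop.fs.s3a.attempts.maximum=10 \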

These are the only settings I pass manually (besides the S3 credentials):

--conf spark.driver.maxResultSize=4g \
--conf spark.akka.frameSize=500 \
--conf spark.hadoop.fs.s3a.connection.maximum=500 \

I'm using aws-java-sdk-1.7.4.jar and hadoop-aws-2.7.1.jar to read data
from S3.
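
For completeness, those jars go on the classpath via spark-submit's --jars flag, roughly like this (paths shortened, purely illustrative):

# illustrative paths only
--jars /path/to/aws-java-sdk-1.7.4.jar,/path/to/hadoop-aws-2.7.1.jar \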

I have been struggling with this issue for a long time. The only
workaround I have found is Spark speculation, but that is no longer a
feasible solution for me.
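
For reference, the speculation workaround amounted to roughly the following (values illustrative, not tuned): slow tasks get speculatively re-launched on other executors, so a hung S3 read is eventually superseded by a successful copy.

# illustrative values; these are standard Spark 1.5 speculation properties
--conf spark.speculation=true \
--conf spark.speculation.interval=100ms \
--conf spark.speculation.multiplier=1.5 \
--conf spark.speculation.quantile=0.75 \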

Thanks


