[ https://issues.apache.org/jira/browse/SPARK-22526?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16261292#comment-16261292 ]

Steve Loughran commented on SPARK-22526:
----------------------------------------

S3A uses the AWS S3 client, which uses httpclient internally. S3AInputStream 
absolutely closes that stream on close(); in fact it does a lot of work to 
decide when to abort the HTTP connection instead, whenever that is cheaper 
than reading to the end of the GET. I wouldn't jump to blaming httpclient.

However, the AWS S3 client does pool and recycle HTTP connections; the pool 
size is set by {{fs.s3a.connection.maximum}}, default 15. There may be some 
blocking while waiting for a free connection, so try a larger value.
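As a sketch, one way to raise that pool size is at job submission time via 
Spark's {{spark.hadoop.*}} passthrough (the value 100, the class name and the 
jar name here are illustrative placeholders, not recommendations):

```shell
# Illustrative: raise the S3A connection pool from its default of 15.
# 100 is an arbitrary example value; tune it for your workload.
spark-submit \
  --conf spark.hadoop.fs.s3a.connection.maximum=100 \
  --class com.example.MyJob \
  my-job.jar
```

The same property can equally be set in core-site.xml; this is a config 
fragment, not something specific to spark-submit.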

Now, what's needed to track things down? You get to do it yourself, at least 
for now, as you are the only person reporting this.

I'd go for
* getting a thread dump as things block & seeing what the threads are up to
* using "netstat -p tcp" to see which network connections are live
* turning up logging on {{org.apache.hadoop.fs.s3a}} and seeing what it says.
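A sketch of what those steps can look like on a Linux host (the 
{{<executor-pid>}} placeholder, the netstat flag spelling and the log4j file 
location are assumptions; adapt them to your environment):

```shell
# 1. Thread dump of the hung executor JVM while it is blocked
#    (<executor-pid> is a placeholder for the real process id).
jstack <executor-pid> > threads.txt

# 2. Live TCP connections for that process; look for many sockets
#    parked against the S3 endpoint.
netstat -tnp | grep <executor-pid>

# 3. Turn up S3A logging by adding this line to the job's
#    log4j.properties before rerunning:
#    log4j.logger.org.apache.hadoop.fs.s3a=DEBUG
```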

Before doing any of that: move off 2.7 to the Hadoop 2.8 binaries. Hadoop 
2.8 has a lot of performance and functionality improvements which will never 
be backported to 2.7.x, including updated AWS SDK libraries.

If you do find a problem in the S3A client/AWS SDK, the first response to a 
HADOOP JIRA will be "does it go away if you upgrade?". Save time by doing 
that before anything else.

> Spark hangs while reading binary files from S3
> ----------------------------------------------
>
>                 Key: SPARK-22526
>                 URL: https://issues.apache.org/jira/browse/SPARK-22526
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: mohamed imran
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Hi,
> I am using Spark 2.2.0 (the latest release) to read binary files from S3, 
> via sc.binaryFiles.
> It works fine for roughly the first 100 file reads, but then hangs 
> indefinitely, from 5 up to 40 minutes, much like the Avro file read issue 
> (which was fixed in later releases).
> I tried setting fs.s3a.connection.maximum to some large values, but that 
> didn't help.
> Finally I tried enabling Spark speculation, which again didn't help much.
> One thing I observed is that it is not closing the connection after every 
> read of a binary file from S3.
> Example: sc.binaryFiles("s3a://test/test123.zip")
> Please look into this major issue!



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
