[ https://issues.apache.org/jira/browse/SPARK-22526?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16262489#comment-16262489 ]

mohamed imran commented on SPARK-22526:
---------------------------------------

[~ste...@apache.org] Thanks for your suggestions.

I set the fs.s3a.connection.maximum value up to 10000 and checked the 
connections using netstat -a | grep CLOSE_WAIT while processing the files 
from S3. With every file read, the connection pool count increases, but the 
connections are never closed.
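
For reference, I am setting the pool size through Spark's Hadoop config 
passthrough, roughly like this (a sketch; the app name and exact value are 
illustrative):

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: raising the S3A connection pool limit via the "spark.hadoop."
// config passthrough. App name and value are illustrative.
val conf = new SparkConf()
  .setAppName("s3a-binary-read-test")
  .set("spark.hadoop.fs.s3a.connection.maximum", "10000")

val sc = new SparkContext(conf)
{code}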

Because of this, at some point it hangs indefinitely.

Anyhow, as per your suggestion, I will upgrade from hadoop-2.7.3 to 
hadoop-2.8 and take the stats. I'll keep you posted.
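
For context, the access pattern that triggers the buildup looks roughly like 
this (a sketch, not the exact job; the bucket and key names are placeholders):

{code:scala}
// Sketch of the read pattern: many small binary files read one after
// another from S3. Paths are placeholders, not the real dataset.
val paths: Seq[String] = (1 to 200).map(i => s"s3a://test/part-${i}.zip")

paths.foreach { p =>
  // binaryFiles yields (path, PortableDataStream) pairs; forcing the
  // read with an action is when the S3A connections are opened.
  sc.binaryFiles(p).map(_._2.toArray().length).count()
  // After each iteration, `netstat -a | grep CLOSE_WAIT` shows the
  // half-closed sockets accumulating instead of being released.
}
{code}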

> Spark hangs while reading binary files from S3
> ----------------------------------------------
>
>                 Key: SPARK-22526
>                 URL: https://issues.apache.org/jira/browse/SPARK-22526
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: mohamed imran
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Hi,
> I am using Spark 2.2.0 (a recent version) to read binary files from S3. I use 
> sc.binaryFiles to read the files.
> It works fine for the first ~100 file reads, but then it hangs indefinitely 
> for anywhere from 5 to 40 minutes, like the Avro file read issue (which was 
> fixed in later releases).
> I tried setting fs.s3a.connection.maximum to some large values, but that 
> didn't help.
> Finally, I enabled the Spark speculation setting, which again didn't help much.
> One thing I observed is that it does not close the connection after every 
> read of a binary file from S3.
> Example: sc.binaryFiles("s3a://test/test123.zip")
> Please look into this major issue!



