[ https://issues.apache.org/jira/browse/SPARK-34925?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17320317#comment-17320317 ]

Steve Loughran commented on SPARK-34925:
----------------------------------------

You've got an s3a config option set to "100M", but the version of the s3a 
filesystem reading its configuration expects a plain long and doesn't parse 
the "M" suffix. I'd suspect fs.s3a.block.size.
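
As a sketch, with the option name being my guess above, the failing setup 
would look like this in core-site.xml (or the spark.hadoop.* equivalent). 
Older s3a clients read this value with Configuration.getLong(), which throws 
NumberFormatException on "100M", while current ones use the suffix-aware 
getLongBytes():

    <property>
      <name>fs.s3a.block.size</name>
      <value>100M</value>
    </property>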

The fix: use a plain long value such as 100000000. However, I'd warn that 
support for "100M"-style values in fs.s3a.block.size landed a long time 
ago...if that is the option causing this, it's a warning sign that some older 
version of hadoop-aws is on the classpath.
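
As a workaround sketch (again assuming fs.s3a.block.size is the option at 
fault), override it with a plain long when launching the shell:

    spark-shell --conf spark.hadoop.fs.s3a.block.size=100000000

To see which hadoop-aws jar is actually being loaded, ask the JVM from 
inside the spark shell:

    scala> classOf[org.apache.hadoop.fs.s3a.S3AFileSystem].getProtectionDomain.getCodeSource.getLocation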

> Spark shell failed to access external file to read content
> ----------------------------------------------------------
>
>                 Key: SPARK-34925
>                 URL: https://issues.apache.org/jira/browse/SPARK-34925
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 3.0.0
>            Reporter: czh
>            Priority: Major
>         Attachments: 微信截图_20210401093951.png
>
>
> Spark shell failed to access an external file and read its content.
> Spark version 3.0, Hadoop version 3.2.
> Spark starts normally. In the spark shell, entering val b1 = 
> sc.textFile("s3a://spark-test/a.txt") works fine, but entering 
> b1.collect().foreach(println) to traverse and read the contents of the 
> file raises an error.
> The error message is: java.lang.NumberFormatException: For input string: 
> "100M"


