wesmcouch opened a new pull request, #45441: URL: https://github.com/apache/spark/pull/45441
### Why are the changes needed?

Running PySpark in an environment such as AWS Lambda, where `/dev/fd` is unavailable, throws an error because `spark-class` relies on process substitution:

```
/var/lang/lib/python3.11/site-packages/pyspark/bin/spark-class: line 93: /dev/fd/62: No such file or directory
/var/lang/lib/python3.11/site-packages/pyspark/bin/spark-class: line 97: CMD: bad array subscript
```

### What changes were proposed in this pull request?

Write the launcher command to a temporary file and read it back, instead of using process substitution. This allows the code to be run in environments where `/dev/fd` does not exist.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

* Tested in an AWS Lambda Python 3.11 container-based image using PySpark
* Tested in a Mac Python 3.11 environment using PySpark

### Was this patch authored or co-authored using generative AI tooling?

No

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: [email protected] For queries about this service, please contact Infrastructure at: [email protected]
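To illustrate the technique (not the exact patch): `spark-class` reads a NUL-delimited argument list produced by its command builder. With process substitution (`< <(build_command ...)`), bash needs `/dev/fd`; the sketch below shows the temp-file alternative. `build_command` and its arguments here are hypothetical stand-ins for the real launcher invocation.

```shell
#!/usr/bin/env bash
# Hypothetical stand-in for spark-class's command builder, which emits
# NUL-delimited arguments.
build_command() {
  printf '%s\0' java -cp /opt/spark "$@"
}

# Instead of:  while ... done < <(build_command ...)   # needs /dev/fd
# write the builder output to a temp file and read from that:
CMD=()
CMD_FILE="$(mktemp)"
build_command org.example.Main > "$CMD_FILE"
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")
done < "$CMD_FILE"
rm -f "$CMD_FILE"

# One argument per line; word splitting is preserved exactly.
printf '%s\n' "${CMD[@]}"
```

Reading from a regular file behaves identically to process substitution for NUL-delimited input, but requires only `mktemp` and a writable temp directory, both of which AWS Lambda provides.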
