[
https://issues.apache.org/jira/browse/SPARK-38808?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Yuming Wang updated SPARK-38808:
--------------------------------
Target Version/s: (was: 3.2.1)
> Windows, Spark 3.2.1: spark-shell command throwing this error: SparkContext:
> Error initializing SparkContext
> ------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-38808
> URL: https://issues.apache.org/jira/browse/SPARK-38808
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Affects Versions: 3.2.1
> Reporter: Taras
> Priority: Major
>
> Can't start spark-shell on Windows with Spark 3.2.1. Downgrading Spark to
> 3.1.3 fixes the problem, but I need the pandas API on Spark, which is only
> available in Spark 3.2 and later, so downgrading is not a solution for me.
> The bug and a workaround are also described at the link below:
> https://stackoverflow.com/questions/69923603/spark-shell-command-throwing-this-error-sparkcontext-error-initializing-sparkc
--
This message was sent by Atlassian Jira
(v8.20.1#820001)