Github user dragos commented on the issue:
https://github.com/apache/spark/pull/17982
If I'm not mistaken, the current patch will fail on files passed via `-i` to
spark-shell, since Spark is initialized only after `process` is done (the
SparkContext is not yet available during initialization, which is when `-i`
files are loaded).
I'm not sure there's a better way, especially one that works across the whole
2.11 series. Mea culpa for not adding a comment on `loadFiles` in the Scala
REPL when I worked on the Spark shell. Maybe @som-snytt has a suggestion for
inserting `initializeSpark` into the initialization pipeline?
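To make the ordering problem concrete, here is a standalone sketch. The names `initializeSpark` and `loadFiles` mirror the real REPL hooks, but everything else (the flag, the file list, the `main`) is invented for illustration; this is not Spark code:

```scala
// Standalone illustration of the init ordering issue, not actual Spark code.
object ReplOrderingSketch {
  // Stands in for "the SparkContext has been created".
  private var sparkReady = false

  // Mirrors SparkILoop.initializeSpark, which sets up sc/spark in the REPL.
  def initializeSpark(): Unit = { sparkReady = true }

  // Mirrors the REPL's loadFiles, which runs `-i` scripts during init.
  def loadFiles(files: Seq[String]): Unit = {
    files.foreach { f =>
      // A `-i` script that references `sc` fails here if Spark isn't up yet.
      require(sparkReady, s"SparkContext not available while loading $f")
    }
  }

  def main(args: Array[String]): Unit = {
    // The patch as described does the reverse order, which would fail the
    // require above. Running initializeSpark first makes loadFiles safe:
    initializeSpark()
    loadFiles(Seq("init.scala"))
    println("loaded -i files with SparkContext available")
  }
}
```

The sketch only shows why hooking `initializeSpark` in before `loadFiles` matters; where in the real 2.11 initialization pipeline that hook can go is exactly the open question above.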