Thanks, Marcelo
> Instantiating SparkContext directly works. Well, sorta: it has
> limitations. For example, see discussions about Spark not really liking
> multiple contexts in the same JVM. It also does not work in "cluster"
> deploy mode.

That's fine - when one is doing something non-standard, one expects a bit of caveat emptor.
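For reference, a minimal sketch of what "instantiating SparkContext directly" looks like, assuming a local master purely for illustration (the app name and object name below are made up, not from this thread). It runs the driver in-process, which is exactly the case where the cluster-deploy-mode and multiple-contexts limitations mentioned above apply.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object DirectContextExample {
  def main(args: Array[String]): Unit = {
    // Build the configuration by hand instead of going through spark-submit.
    // "local[*]" is an assumption here; in practice the master URL would
    // point at whatever cluster manager is in use.
    val conf = new SparkConf()
      .setAppName("direct-context-example")
      .setMaster("local[*]")

    val sc = new SparkContext(conf)
    try {
      val sum = sc.parallelize(1 to 100).reduce(_ + _)
      println(s"sum = $sum")
    } finally {
      // Spark does not like multiple live SparkContexts in one JVM,
      // so stop this one before any other context is created.
      sc.stop()
    }
  }
}
```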