It turned out to be a matter of where nosetests is invoked from: it has to
be run from within the src/spark folder, since there are additional layers
above /src. But I've run into another problem: the SparkContext I create
runs under the default Python interpreter instead of the one set in
spark/conf/spark-env.sh, so I think I should set the interpreter
programmatically within setUpClass. I doubt it's as easy as setting an
environment variable, though. Should I parse spark/conf/spark-env.sh inside
setUpClass?
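If it does come down to an environment variable, here's a minimal sketch of
what I have in mind. PYSPARK_PYTHON is the variable spark-env.sh uses to
pick the worker interpreter, and it has to be exported before the
SparkContext is constructed; the interpreter path below is just a
placeholder:

    import os
    import unittest

    from pyspark import SparkConf, SparkContext


    class SparkAppTest(unittest.TestCase):
        @classmethod
        def setUpClass(cls):
            # PYSPARK_PYTHON tells Spark which interpreter the workers
            # should use; it must be set before the context is created.
            # Hypothetical path -- substitute whatever spark-env.sh sets.
            os.environ["PYSPARK_PYTHON"] = "/usr/bin/python2.7"

            conf = SparkConf().setMaster("local[2]").setAppName("unit-tests")
            cls.sc = SparkContext(conf=conf)

        @classmethod
        def tearDownClass(cls):
            # Stop the context so later test classes can start their own.
            cls.sc.stop()

Not sure yet whether that's sufficient, or whether the driver-side settings
from spark-env.sh matter too.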


