Hi, 

I have been working with Spark for a few weeks, and I do not yet understand how
I should organize my development and production environments.

Currently I am using the IPython Notebook: I usually write test scripts on
my Mac against some very small data. Then, when I am ready, I launch the script
on servers against data stored on Google Cloud Storage. Each time, I need to
export the script from the notebook and upload it to the server.
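To make it concrete, my loop looks roughly like this (the file names, server, and bucket are just placeholders, and the exact nbconvert invocation depends on your IPython version):

```shell
# Export the notebook to a plain Python script
ipython nbconvert --to script analysis.ipynb

# Copy the exported script to the cluster
scp analysis.py me@spark-server:~/jobs/

# On the server: run it against data in Google Cloud Storage
spark-submit ~/jobs/analysis.py gs://my-bucket/input/
```

Every iteration means repeating all three steps by hand.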

That feels very messy. I now have many Python files on my computer, and I
no longer remember what each one does.

What is your recommended Spark workflow for dev/prod environments?

Best,
poiuytrez



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-dev-environment-best-practices-tp16481.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
