According to Stack Overflow (https://stackoverflow.com/q/40786093) it should
be possible to write a file to a local path, and the result should then be
available on the driver node.

However when I try this:

     df.write.parquet("file:///some/path")

the data seems to be written locally on each executor node, not on the driver.

I checked an answer (https://stackoverflow.com/a/31240494) by Holden Karau,
but it seems ambiguous, and other users
(https://stackoverflow.com/questions/31239161/save-a-spark-rdd-to-the-local-file-system-using-java#comment50482201_31240494)
seem to have a similar problem to mine.



--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
