Hello Serega,

https://spark.apache.org/docs/latest/sql-programming-guide.html

Please try the SaveMode.Append option. Does it work for you?
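A minimal sketch of what that could look like, reusing the DataFrame `ds` and output path from the quoted message below (assumes a running SparkSession; the path and column names are from your example):

```scala
import org.apache.spark.sql.SaveMode

// Append new partitions / new files to existing partitions
// instead of failing because the directory already exists.
ds.write
  .mode(SaveMode.Append)
  .partitionBy("year", "month", "day", "hour", "workflowId")
  .parquet("/here/is/my/dir")
```

With SaveMode.Append, a second job writing the same partition values adds new Parquet files alongside the existing ones rather than overwriting them.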


Sat, 17 Mar 2018, 15:19 Serega Sheypak <serega.shey...@gmail.com>:

> Hi, I'm using spark-sql to process my data and store the result as Parquet,
> partitioned by several columns
>
> ds.write
>   .partitionBy("year", "month", "day", "hour", "workflowId")
>   .parquet("/here/is/my/dir")
>
>
> I want to run more jobs that will produce new partitions or add more files
> to existing partitions.
> What is the right way to do it?
>