Has anyone faced the same kind of issue with Spark 2.0.1?
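
One workaround I am considering (not yet verified on 2.0.1) is to skip partitionBy(...).saveAsTable(...) and instead do a dynamic-partition INSERT OVERWRITE into a pre-created Hive table. Rough sketch below; the helper name writeWeeklyAgg is just illustrative, and it assumes the target table already exists with week_end_date as the last column of the DataFrame:

import org.apache.spark.sql.{DataFrame, SparkSession}

// Hypothetical helper (name is illustrative): write the weekly aggregate via a
// dynamic-partition INSERT OVERWRITE instead of partitionBy(...).saveAsTable(...).
// Assumes the target Hive table already exists and that week_end_date is the
// last column of finalDF (dynamic partition inserts require that ordering).
def writeWeeklyAgg(spark: SparkSession, finalDF: DataFrame, outputTable: String): Unit = {
  spark.sqlContext.setConf("hive.exec.dynamic.partition", "true")
  spark.sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

  // Expose the DataFrame to SQL and let Hive manage the partition directories.
  finalDF.createOrReplaceTempView("final_df_tmp")
  spark.sql(
    s"""INSERT OVERWRITE TABLE $outputTable
       |PARTITION (week_end_date)
       |SELECT * FROM final_df_tmp""".stripMargin)
}

If anyone has tried something along these lines on 2.0.1, please share whether it avoids the FileNotFoundException below.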

On Thu, Jul 20, 2017 at 2:08 PM, Chetan Khatri <chetan.opensou...@gmail.com>
wrote:

> Hello All,
> I am facing an issue with storing a DataFrame to a Hive table with partitioning;
> without partitioning it works fine.
>
> *Spark 2.0.1*
>
> finalDF.write.mode(SaveMode.Overwrite).partitionBy("week_end_date").saveAsTable(OUTPUT_TABLE.get)
>
> and I have also added the configuration below:
> spark.sqlContext.setConf("hive.exec.dynamic.partition", "true")
> spark.sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
>
> It partitions and stores the data as files in the output location, but when I
> try to read the data it throws the error below:
> java.io.IOException: java.io.FileNotFoundException: Requested file
> maprfs:/user/hive/warehouse/nextgen_rjr_accion.db/organization_item_masters_rjr_agg_day/week_end_date=2016-08-28/data
> does not exist.
>
> Also, the columns are shown as “col(array)”.
>
> Please let me know of any workaround.
>
> Thanks
>