Hi,
I am using Spark 1.4.1.
I am getting an error when persisting Spark DataFrame output to Hive:

> scala> df.select("name","age").write().format("com.databricks.spark.csv").mode(SaveMode.Append).saveAsTable("PersonHiveTable");
> <console>:39: error: org.apache.spark.sql.DataFrameWriter does not take parameters

Can somebody point out what is wrong here?
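My guess is that the parentheses after `write` are the problem: in Spark 1.4.x, `DataFrame.write` is a parameterless method returning a `DataFrameWriter`, so `write()` is parsed as `write.apply()`, which matches the "does not take parameters" error. A sketch of what I think should work, assuming the same `df`, format, and table name:

```scala
import org.apache.spark.sql.SaveMode

// `write` has no parentheses in Spark 1.4.x: it is `def write: DataFrameWriter`,
// so `write()` is interpreted as `write.apply()` and fails to compile.
df.select("name", "age")
  .write                                  // no parentheses here
  .format("com.databricks.spark.csv")     // requires the spark-csv package on the classpath
  .mode(SaveMode.Append)
  .saveAsTable("PersonHiveTable")
```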

Would really appreciate your help.

Thanks in advance

Divya
