Not yet.  This is on the roadmap for Spark 1.4.
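In the meantime, a possible workaround (a sketch only; the table name, schema, and HDFS paths below are made up for illustration, and this assumes a HiveContext as in Spark 1.3) is to lay the partition directories out on HDFS yourself, define an external partitioned table over that location, and register each partition with ALTER TABLE, much as you suggest:

```scala
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)  // sc is an existing SparkContext

// Define an external, partitioned table over data already on HDFS.
// Schema and location are hypothetical -- adjust to your data.
hiveContext.sql("""
  CREATE EXTERNAL TABLE IF NOT EXISTS results (value STRING)
  PARTITIONED BY (date STRING)
  STORED AS PARQUET
  LOCATION 'hdfs:///data/results'
""")

// Register an existing directory as one partition of the table.
hiveContext.sql(
  "ALTER TABLE results ADD PARTITION (date = '20141111') " +
  "LOCATION 'hdfs:///data/results/date=20141111'")
```

Until saveAsTable handles partitioning natively, you can write each partition's data out to its own directory (e.g. with df.save) and then add it as a partition as above.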

On Sun, Mar 22, 2015 at 12:19 AM, deenar.toraskar <deenar.toras...@db.com>
wrote:

> Hi
>
> I wanted to store DataFrames as partitioned Hive tables. Is there a way to
> do this via the saveAsTable call? The set of options does not seem to be
> documented.
>
> def
> saveAsTable(tableName: String, source: String, mode: SaveMode, options:
> Map[String, String]): Unit
> (Scala-specific) Creates a table from the contents of this DataFrame
> based on a given data source, the SaveMode specified by mode, and a set of
> options.
>
> Alternatively, is there a way to just create external Hive tables for data
> that is already present on HDFS? Something similar to:
>
> sc.sql("alter table results add partition (date = '20141111');")
>
> Regards
> Deenar
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/DataFrame-saveAsTable-partitioned-tables-tp22173.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
