I guess you're referring to the simple SQL dialect recognized by the SqlParser component.

Spark SQL supports most of Hive's DDL and DML (via HiveContext). The simple SQL dialect, however, is still very limited. It is usually used together with a Spark application written in Java/Scala/Python. Within a Spark application, you can always register case class RDDs as temporary tables, which partly replaces the functionality of DDL/DML in pure SQL scripts.
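
For concreteness, here is a minimal Scala sketch of that pattern, assuming the Spark 1.x SQLContext API; the Person case class, the "people" table name, and the sample rows are made up purely for illustration:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // Schema is inferred from the case class fields via reflection.
    case class Person(name: String, age: Int)

    object TempTableExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("TempTableExample"))
        val sqlContext = new SQLContext(sc)
        // Implicit conversion from RDD[Person] to SchemaRDD (Spark 1.0-1.2).
        import sqlContext.createSchemaRDD

        // Building the RDD and registering it stands in for the
        // CREATE TABLE + INSERT you would write in a pure SQL script.
        val people = sc.parallelize(Seq(Person("Alice", 30), Person("Bob", 17)))
        people.registerTempTable("people")

        // Query the temporary table with the simple SQL dialect.
        val adults = sqlContext.sql("SELECT name FROM people WHERE age >= 18")
        adults.collect().foreach(println)

        sc.stop()
      }
    }

Here registerTempTable plays the role of the DDL you would otherwise need, and the SQL statement covers the query side.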

On the other hand, we do plan to support SQL-92 in the future.

On 10/16/14 10:50 PM, neeraj wrote:
Hi,

Does Spark SQL have DDL and DML commands that can be executed directly? If yes,
please share the link.

If no, please help me understand why they are not there.

Regards,
Neeraj


