[jira] [Updated] (SPARK-17452) Spark 2.0.0 is not supporting the "partition" keyword on a "describe" statement when using Hive Support

2016-09-21 Thread Josh Rosen (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-17452?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen updated SPARK-17452:
---
Component/s: (was: Build)
 SQL

> Spark 2.0.0 is not supporting the "partition" keyword on a "describe" 
> statement when using Hive Support
> ---
>
> Key: SPARK-17452
> URL: https://issues.apache.org/jira/browse/SPARK-17452
> Project: Spark
>  Issue Type: New Feature
>  Components: SQL
>Affects Versions: 2.0.0
> Environment: Amazon EMR 5.0.0
>Reporter: Hernan Vivani
>
> Changes introduced in Spark 2 do not support the "partition" keyword in a "describe" statement.
> EMR 5 (Spark 2.0):
> ==
> scala> import org.apache.spark.sql.SparkSession
> scala> val sess=SparkSession.builder().appName("test").enableHiveSupport().getOrCreate()
> scala> sess.sql("describe formatted page_view partition (dt='2008-06-08', country='AR')").show
> org.apache.spark.sql.catalyst.parser.ParseException:
> Unsupported SQL statement
> == SQL ==
> describe formatted page_view partition (dt='2008-06-08', country='AR')
>   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:58)
>   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:53)
>   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:82)
>   at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:46)
>   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:53)
>   at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
>   ... 48 elided
> The same statement works fine on Spark 1.6.2 and Spark 1.5.2.
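
[Editorial note] The rejected statement follows Hive's `DESCRIBE [FORMATTED] <table> PARTITION (<spec>)` syntax. As a minimal sketch of the statement shape (not a workaround, and not any Spark or Hive API — the helper name and its signature are illustrative, using the reporter's table and partition values):

```python
# Illustrative helper (hypothetical, not a Spark API): assembles the
# Hive-style DESCRIBE ... PARTITION statement that Spark 2.0.0's parser
# rejects in the report above.
def describe_partition_sql(table, partition, formatted=True):
    """Build a 'describe [formatted] <table> partition (...)' statement."""
    # Partition spec: comma-separated col='value' pairs, in insertion order.
    spec = ", ".join("{}='{}'".format(col, val) for col, val in partition.items())
    prefix = "describe formatted " if formatted else "describe "
    return "{}{} partition ({})".format(prefix, table, spec)

sql = describe_partition_sql("page_view", {"dt": "2008-06-08", "country": "AR"})
print(sql)  # describe formatted page_view partition (dt='2008-06-08', country='AR')
```

On Spark 2.0.0 this exact string triggers the ParseException shown above; the helper only demonstrates how the statement is formed, which may be useful when reproducing the failure against other Spark versions.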



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-17452) Spark 2.0.0 is not supporting the "partition" keyword on a "describe" statement when using Hive Support

2016-09-08 Thread Hernan Vivani (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-17452?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hernan Vivani updated SPARK-17452:
--
Description: 
Changes introduced in Spark 2 do not support the "partition" keyword in a "describe" statement.

EMR 5 (Spark 2.0):
==
scala> import org.apache.spark.sql.SparkSession
scala> val sess=SparkSession.builder().appName("test").enableHiveSupport().getOrCreate()

scala> sess.sql("describe formatted page_view partition (dt='2008-06-08', country='AR')").show
org.apache.spark.sql.catalyst.parser.ParseException:
Unsupported SQL statement
== SQL ==
describe formatted page_view partition (dt='2008-06-08', country='AR')
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:58)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:53)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:82)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:46)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:53)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
  ... 48 elided


The same statement works fine on Spark 1.6.2 and Spark 1.5.2.

  was:
Changes introduced in Spark 2 do not support the "partition" keyword in a "describe" statement.

EMR 5 (Spark 2.0):
==
scala> import org.apache.spark.sql.SparkSession
scala> val sess=SparkSession.builder().appName("test").enableHiveSupport().getOrCreate()

sess.sql("describe formatted page_view partition (dt='2008-06-08', country='AR')").show
org.apache.spark.sql.catalyst.parser.ParseException:
Unsupported SQL statement
== SQL ==
describe formatted page_view partition (dt='2008-06-08', country='AR')
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:58)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:53)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:82)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:46)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:53)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
  ... 48 elided


The same statement works fine on Spark 1.6.2 and Spark 1.5.2.


> Spark 2.0.0 is not supporting the "partition" keyword on a "describe" 
> statement when using Hive Support
> ---
>
> Key: SPARK-17452
> URL: https://issues.apache.org/jira/browse/SPARK-17452
> Project: Spark
>  Issue Type: New Feature
>  Components: Build
>Affects Versions: 2.0.0
> Environment: Amazon EMR 5.0.0
>Reporter: Hernan Vivani
>
> Changes introduced in Spark 2 do not support the "partition" keyword in a "describe" statement.
> EMR 5 (Spark 2.0):
> ==
> scala> import org.apache.spark.sql.SparkSession
> scala> val sess=SparkSession.builder().appName("test").enableHiveSupport().getOrCreate()
> scala> sess.sql("describe formatted page_view partition (dt='2008-06-08', country='AR')").show
> org.apache.spark.sql.catalyst.parser.ParseException:
> Unsupported SQL statement
> == SQL ==
> describe formatted page_view partition (dt='2008-06-08', country='AR')
>   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:58)
>   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:53)
>   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:82)
>   at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:46)
>   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:53)
>   at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
>   ... 48 elided
> The same statement works fine on Spark 1.6.2 and Spark 1.5.2.


