[
https://issues.apache.org/jira/browse/SPARK-11148?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15029732#comment-15029732
]
Lunen commented on SPARK-11148:
-------------------------------
Hi Maciej,
Spark implementation at our client site will be in January 2016.
Kind regards
> Unable to create views
> ----------------------
>
> Key: SPARK-11148
> URL: https://issues.apache.org/jira/browse/SPARK-11148
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.5.1
> Environment: Ubuntu 14.04
> Spark-1.5.1-bin-hadoop2.6
> (I don't have Hadoop or Hive installed)
> Start spark-all.sh and thriftserver with mysql jar driver
> Reporter: Lunen
> Priority: Critical
>
> I am unable to create views within Spark SQL.
> Creating tables without specifying the column names works, e.g.
> CREATE TABLE trade2
> USING org.apache.spark.sql.jdbc
> OPTIONS (
> url "jdbc:mysql://192.168.30.191:3318/?user=root",
> dbtable "database.trade",
> driver "com.mysql.jdbc.Driver"
> );
> Creating tables with explicit column datatypes gives an error:
> CREATE TABLE trade2(
> COL1 timestamp,
> COL2 STRING,
> COL3 STRING)
> USING org.apache.spark.sql.jdbc
> OPTIONS (
> url "jdbc:mysql://192.168.30.191:3318/?user=root",
> dbtable "database.trade",
> driver "com.mysql.jdbc.Driver"
> );
> Error: org.apache.spark.sql.AnalysisException:
> org.apache.spark.sql.execution.datasources.jdbc.DefaultSource does not allow
> user-specified schemas.; SQLState: null ErrorCode: 0
> Trying to create a VIEW from the table that was created (the SELECT statement
> below returns data):
> CREATE VIEW viewtrade as Select Col1 from trade2;
> Error: org.apache.spark.sql.execution.QueryExecutionException: FAILED:
> SemanticException [Error 10004]: Line 1:30 Invalid table alias or column
> reference 'Col1': (possible column names are: col)
> SQLState: null
> ErrorCode: 0
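
A possible workaround, sketched below (untested, not a confirmed fix): since the JDBC data source in this Spark version rejects user-specified schemas, create the table without a column list so Spark infers the schema from the remote table, then check the registered column names with DESCRIBE before referencing them in the view. The error message above ("possible column names are: col") suggests the registered names differ from Col1/COL1, so the column name used in the final statement is only a guess.

```sql
-- Let the JDBC source infer the schema (no column list):
CREATE TABLE trade2
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:mysql://192.168.30.191:3318/?user=root",
  dbtable "database.trade",
  driver "com.mysql.jdbc.Driver"
);

-- Inspect the inferred column names before using them in a view:
DESCRIBE trade2;

-- Reference the columns exactly as DESCRIBE reports them
-- ("col1" here is a placeholder, not a confirmed name):
CREATE VIEW viewtrade AS SELECT col1 FROM trade2;
```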
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)