The issue is that you're using SQLContext instead of HiveContext. SQLContext 
implements a smaller subset of SQL, so you're getting a parse error because 
its parser doesn't support the syntax you're using. Look at how you'd write 
this query in HiveQL, and then try running that with HiveContext.
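To be clear, the chained INNER JOIN syntax itself is standard SQL; the limitation is in SQLContext's simple parser, not in the query. As a self-contained illustration (using sqlite3 as a stand-in full SQL engine, with table and column names borrowed from the thread; this is a sketch, not Spark code), the same join shape parses and runs fine:

```python
import sqlite3

# In-memory database; schemas are invented minimal stand-ins for the thread's tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (STO_KEY INTEGER, BARC_KEY INTEGER, Prix REAL)")
cur.execute("CREATE TABLE magasin (STO_KEY INTEGER, FORM_KEY INTEGER)")
cur.execute("CREATE TABLE eans (BARC_KEY INTEGER, FORM_KEY INTEGER, UB_KEY INTEGER)")
cur.execute("INSERT INTO sales VALUES (1, 10, 9.99)")
cur.execute("INSERT INTO magasin VALUES (1, 100)")
cur.execute("INSERT INTO eans VALUES (10, 100, 7)")

# The same chained INNER JOIN shape that SQLContext's parser rejects.
rows = cur.execute("""
    SELECT sales.STO_KEY, magasin.FORM_KEY, eans.UB_KEY
    FROM sales
    INNER JOIN magasin ON sales.STO_KEY = magasin.STO_KEY
    INNER JOIN eans ON (sales.BARC_KEY = eans.BARC_KEY
                        AND magasin.FORM_KEY = eans.FORM_KEY)
""").fetchall()
print(rows)  # [(1, 100, 7)]
```

In PySpark the fix is simply to construct a HiveContext instead of a SQLContext (e.g. `hiveCtx = HiveContext(sc)` from `pyspark.sql`) and issue the identical query through `hiveCtx.sql(...)`, since HiveContext's HiveQL parser handles multiple joins.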

On Oct 7, 2014, at 7:20 AM, Gen <gen.tan...@gmail.com> wrote:

> Hi, in fact, the same problem happens when I try several joins together:
> 
> SELECT * 
> FROM sales INNER JOIN magasin ON sales.STO_KEY = magasin.STO_KEY 
> INNER JOIN eans ON (sales.BARC_KEY = eans.BARC_KEY and magasin.FORM_KEY =
> eans.FORM_KEY)
> 
> py4j.protocol.Py4JJavaError: An error occurred while calling o1229.sql.
> : java.lang.RuntimeException: [1.269] failure: ``UNION'' expected but
> `INNER' found
> 
> SELECT sales.Date AS Date, sales.ID_FOYER AS ID_FOYER,
>        sales.STO_KEY AS STO_KEY, sales.Quantite AS Quantite,
>        sales.Prix AS Prix, sales.Total AS Total,
>        magasin.FORM_KEY AS FORM_KEY, eans.UB_KEY AS UB_KEY
> FROM sales
> INNER JOIN magasin ON sales.STO_KEY = magasin.STO_KEY
> INNER JOIN eans ON (sales.BARC_KEY = eans.BARC_KEY and magasin.FORM_KEY =
> eans.FORM_KEY)
> 
>        at scala.sys.package$.error(package.scala:27)
>        at org.apache.spark.sql.catalyst.SqlParser.apply(SqlParser.scala:60)
>        at org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:73)
>        at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:260)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:606)
>        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
>        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
>        at py4j.Gateway.invoke(Gateway.java:259)
>        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
>        at py4j.commands.CallCommand.execute(CallCommand.java:79)
>        at py4j.GatewayConnection.run(GatewayConnection.java:207)
>        at java.lang.Thread.run(Thread.java:745)
> 
> I'm using Spark 1.1.0, so I have the impression that Spark SQL doesn't
> support more than one join in a single query.
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-more-than-two-tables-for-join-tp13865p15848.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 