Spark SQL supports both standard SQL and HiveQL, through SQLContext and
HiveContext respectively.
As far as I know, the SQLContext in Spark SQL 1.1.0 does not support a
three-table join directly.
However, you can rewrite your query with a subquery, for example:

SELECT * FROM
  (SELECT * FROM youhao_data LEFT JOIN youhao_age
     ON (youhao_data.rowkey = youhao_age.rowkey)) tmp
LEFT JOIN youhao_totalKiloMeter
  ON (tmp.rowkey = youhao_totalKiloMeter.rowkey)
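
To make this concrete, here is a minimal, self-contained sketch of running the
rewritten query through a plain SQLContext on Spark 1.1.x. The table names come
from this thread, but the case classes, their non-rowkey columns and the sample
rows are made up for illustration, and I project explicit columns in the
subquery so that tmp does not end up with two rowkey columns:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

case class Data(rowkey: String, fuel: Double)
case class Age(rowkey: String, age: Int)
case class TotalKiloMeter(rowkey: String, km: Double)

object ThreeTableJoin {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("three-table-join"))
    val sqlContext = new SQLContext(sc)
    // createSchemaRDD implicitly converts an RDD of case classes to a SchemaRDD (1.1.x API).
    import sqlContext.createSchemaRDD

    // Register the three tables with some toy rows.
    sc.parallelize(Seq(Data("k1", 7.5))).registerTempTable("youhao_data")
    sc.parallelize(Seq(Age("k1", 3))).registerTempTable("youhao_age")
    sc.parallelize(Seq(TotalKiloMeter("k1", 12000.0))).registerTempTable("youhao_totalKiloMeter")

    // Join the first two tables inside a subquery, then join the result with the third table.
    val joined = sqlContext.sql(
      """SELECT * FROM
        |  (SELECT youhao_data.rowkey AS rowkey, fuel, age
        |     FROM youhao_data LEFT JOIN youhao_age
        |       ON (youhao_data.rowkey = youhao_age.rowkey)) tmp
        |LEFT JOIN youhao_totalKiloMeter
        |  ON (tmp.rowkey = youhao_totalKiloMeter.rowkey)""".stripMargin)

    joined.collect().foreach(println)
    sc.stop()
  }
}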

The HiveContext in Spark 1.1.0 does support a three-table join:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql(
  """SELECT * FROM youhao_data
    |LEFT JOIN youhao_age ON (youhao_data.rowkey = youhao_age.rowkey)
    |LEFT JOIN youhao_totalKiloMeter ON (youhao_age.rowkey = youhao_totalKiloMeter.rowkey)""".stripMargin)
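
If I remember correctly, HiveContext needs a Spark build with Hive support
(the -Phive profile) and parses queries as HiveQL by default, which is why the
multi-way join goes through there while the basic SQL parser rejects it.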

2014-09-15 10:41 GMT+08:00 boyingk...@163.com <boyingk...@163.com>:

>
> Hi:
> When I use Spark SQL (1.0.1), I found it does not support a join between
> three tables, e.g.:
>  sql("SELECT * FROM youhao_data left join youhao_age on
> (youhao_data.rowkey=youhao_age.rowkey) left join youhao_totalKiloMeter on
> (youhao_age.rowkey=youhao_totalKiloMeter.rowkey)")
>  I get the exception:
>  Exception in thread "main" java.lang.RuntimeException: [1.90] failure:
> ``UNION'' expected but `left' found
>
> Does Spark SQL 1.1.0 support a join between three tables?
>
> ------------------------------
>  boyingk...@163.com
>
