Spark SQL just loads the query result as a new data source (via JDBC), so do not 
confuse it with Spark SQL tables. They are totally independent database 
systems.

From: Yi Zhang [mailto:zhangy...@yahoo.com.INVALID]
Sent: Friday, May 15, 2015 1:59 PM
To: Cheng, Hao; Dev
Subject: Re: Does Spark SQL (JDBC) support nest select with current version

@Hao,
Because the query joins more than one table, if I register a data frame as a 
temp table, Spark can't distinguish which table is the correct one. I don't know 
how to set dbtable and register the temp table.

Any suggestion?


On Friday, May 15, 2015 1:38 PM, "Cheng, Hao" 
<hao.ch...@intel.com<mailto:hao.ch...@intel.com>> wrote:

You need to register the “dataFrame” as a temp table first and then run queries on 
it. Or do you mean that also failed?
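For reference, a minimal sketch of that approach (assuming the Spark 1.3.x API and reusing the table and column names from the query further down; untested here):

```scala
// Load each MySQL table separately via JDBC and register each as a temp table.
val employees = sqlContext.load("jdbc", Map(
  "url" -> url,
  "driver" -> "com.mysql.jdbc.Driver",
  "dbtable" -> "mock_employees"))
val locations = sqlContext.load("jdbc", Map(
  "url" -> url,
  "driver" -> "com.mysql.jdbc.Driver",
  "dbtable" -> "mock_locations"))
employees.registerTempTable("mock_employees")
locations.registerTempTable("mock_locations")

// A plain join like this should parse fine; it is the correlated subquery in
// the SELECT list that Spark SQL 1.3 cannot handle (see SPARK-4226).
val joined = sqlContext.sql(
  """select e._name as employeeName, e._salory as salory,
    |l._name as locationName
    |from mock_employees e
    |inner join mock_locations l on e._location_id = l._id
    |where e._salory > l._max_price""".stripMargin)
```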

From: Yi Zhang [mailto:zhangy...@yahoo.com.INVALID]
Sent: Friday, May 15, 2015 1:10 PM
To: Yi Zhang; Dev
Subject: Re: Does Spark SQL (JDBC) support nest select with current version

If I pass the whole statement as dbtable to the sqlContext.load() method, as below:
val query =
  """(select t1._salory as salory,
    |t1._name as employeeName,
    |(select _name from mock_locations t3 where t3._id = t1._location_id) as locationName
    |from mock_employees t1
    |inner join mock_locations t2
    |on t1._location_id = t2._id
    |where t1._salory > t2._max_price) EMP
  """.stripMargin
val dataFrame = sqlContext.load("jdbc", Map(
  "url" -> url,
  "driver" -> "com.mysql.jdbc.Driver",
  "dbtable" -> query
))

It works. However, I can't invoke the sql() method to solve this problem. Why?
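My guess as to why the dbtable trick works (an assumption about Spark 1.3's JDBC source, not something I've verified in the code): Spark simply embeds whatever string you pass as "dbtable" into the SQL it sends over JDBC, so MySQL, not Spark's parser, evaluates the nested select. Roughly:

```scala
// Sketch (assumed behaviour): the aliased subquery is treated as a table name
// and spliced into the query Spark pushes down to MySQL.
val table   = "(select ... where t1._salory > t2._max_price) EMP" // your query, aliased
val columns = "salory, employeeName, locationName"
val pushedDown = s"SELECT $columns FROM $table"
// MySQL executes the full nested select; Spark only ever sees flat rows.
// sqlContext.sql(), by contrast, runs the statement through Spark's own
// parser, which does not yet support subqueries (SPARK-4226).
```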



On Friday, May 15, 2015 11:33 AM, Yi Zhang 
<zhangy...@yahoo.com.INVALID<mailto:zhangy...@yahoo.com.INVALID>> wrote:

The sql statement is like this:

select t1._salory as salory,
t1._name as employeeName,
(select _name from mock_locations t3 where t3._id = t1._location_id) as locationName
from mock_employees t1
inner join mock_locations t2
on t1._location_id = t2._id
where t1._salory > t2._max_price

I noticed the issue [SPARK-4226] SparkSQL - Add support for subqueries in 
predicates - ASF JIRA<https://issues.apache.org/jira/browse/SPARK-4226> is 
still in progress. Somebody commented on it that Spark 1.3 would support 
it, so I don't know the current status of this feature. Thanks.

Regards,
Yi